Visual Analytics for Data-Driven Decision Making

IBM has estimated that 2.5 quintillion bytes of new data are generated every day. With increasing digitalization, the constant introduction of new technologies and ever easier and cheaper access to hard drives and cloud storage, this trend is more than likely to continue. However, simply storing huge amounts of data brings little to no value. That is why organizations need to embrace the right mindset and use the proper technologies to turn their data into a competitive advantage: from raw data to actionable insights.


1. Understanding data

The Oxford Dictionary defines data as “facts collected together for reference or analysis”. As the amount of available data grows continuously, the term big data has emerged to describe large quantities of data. It is defined by a set of attributes referred to as the four Vs:

  • Volume – the size of data;
  • Velocity – the speed at which new data is being created;
  • Variety – the different sources and formats of data;
  • Veracity – the accuracy and trustworthiness of data.

The increasing complexity along the four Vs is outpacing humans’ ability to process data and use it in an effective and meaningful way. To extract the right information from the data and derive value from it, it is no longer enough to use standard tools that generate static, predefined reports. This is where visual analytics comes into the picture.

2. Visual analytics

Before exploring visual analytics, it is important to understand the purpose of analytics in general: it is the process of using data to make informed decisions. Unlike reporting, analytics does not simply provide a summary of the data. While reporting can state the facts (e.g. there is a material price increase), analytics can answer the question “Why is there a material price increase?” and thus propel informed, data-driven decision making.

As a result of the challenges related to processing and properly utilizing big data, the field of visual analytics has gained in popularity as an efficient way to draw insights from large sets of data. Visual analytics “combines automated analysis techniques with interactive visualizations for an effective understanding, reasoning and decision making on the basis of very large and complex data sets”.1 This definition shows that visual analytics has an interdisciplinary character that integrates visualization, automated data analysis and human factors. Furthermore, the process requires an appropriate infrastructure in terms of data and software. Therefore, software solutions for visual analytics should support the following main layers:

  • Data management – provides ways to access, store, cleanse, transform, govern and essentially make available the data that needs to be analysed;
  • Analytics – uses statistical and mathematical techniques to extract information from the data in order to make it useful for the user;
  • Visualization – ensures that the information extracted from the data is presented to the user in an understandable and interactive way.
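The three layers above can be illustrated with a minimal, self-contained sketch. This is a toy example using only the Python standard library; the spend records, field names and ASCII bar chart are purely hypothetical stand-ins for the real data sources and interactive visualizations a production platform would use.

```python
from collections import defaultdict

# --- Data management layer: ingest and cleanse raw records ---
# (hypothetical spend records; a real platform would pull these from ERP systems)
raw_records = [
    {"supplier": "Acme", "amount": "1200.50"},
    {"supplier": "acme ", "amount": "800"},     # inconsistent casing/whitespace
    {"supplier": "Globex", "amount": None},     # missing value, dropped during cleansing
    {"supplier": "Globex", "amount": "450.25"},
]

def cleanse(records):
    """Normalize supplier names and drop records with missing amounts."""
    for r in records:
        if r["amount"] is None:
            continue
        yield {"supplier": r["supplier"].strip().title(),
               "amount": float(r["amount"])}

# --- Analytics layer: aggregate spend per supplier ---
def spend_by_supplier(records):
    totals = defaultdict(float)
    for r in records:
        totals[r["supplier"]] += r["amount"]
    return dict(totals)

# --- Visualization layer: present results (ASCII bars as a stand-in) ---
def render(totals, width=40):
    peak = max(totals.values())
    for supplier, amount in sorted(totals.items(), key=lambda kv: -kv[1]):
        bar = "#" * round(width * amount / peak)
        print(f"{supplier:<10} {bar} {amount:,.2f}")

render(spend_by_supplier(cleanse(raw_records)))
```

In a real solution each layer is of course far richer: the data layer connects to many internal and external sources, the analytics layer applies statistical and predictive models, and the visualization layer is interactive rather than static output.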

While many software tools on the market provide great data management and analytics functionalities, they lack sufficient visualization capabilities and allow only limited exploration of and interaction with the generated visualizations. Therefore, these tools are not suitable for visual analytics, since advanced visualization techniques are essential for enabling human involvement in the process. Real-time interaction with the system makes it possible for the user to dynamically change the parameters of the analysis and fully utilize human strengths such as perception and cognition in order to discover patterns, trends and other relationships in the data. Moreover, modern tools for visual analytics aim at continuously improving the user experience by implementing advanced algorithms that automatically generate visualizations based on the available data and suggest relevant information for the user to explore further.

Altogether, having the right infrastructure in place enables the user to gain actionable insights, make informed decisions and ultimately turn the vast amount of available data into knowledge that generates value for the company.

3. Challenges

Identifying a suitable software solution that meets the visual analytics requirements is only one of several challenges on the way toward data-informed decision making. Some of the main obstacles, from company culture through data issues to human limitations, are discussed below.

3.1. Company culture

A data-centred, analytical culture is essential for organizations to succeed. It is important that all employees, regardless of their level in the company, are data literate and use data as part of their jobs. Researchers from MIT and Emerson College define data literacy as “the ability to read, work with, analyse and argue with data”.2 While senior managers do not necessarily need to understand the process of cleaning data or building predictive models, they do need basic knowledge of analytics, statistics and the related terminology. When presented with an analysis, managers need to feel confident and be able to understand it thoroughly. As decision makers, they should be capable of interpreting the findings of the analysis, evaluating their impact on the organization, identifying flaws and challenging the analysts with the right questions. Furthermore, top managers should lead by example, as a data-driven culture requires a strong top-down approach. They should actively promote transparency and data sharing across the organization, encourage employees to utilize data, ask more questions and back up their decisions with analytics. Leaders need to publicize the value that analytics brings to the company and support related initiatives, e.g. by investing in appropriate tools and offering specialized training to increase employees’ data literacy.

3.2. Data quality

Correct data is crucial for a data-driven organization. The insights that users gain from analytics are only as good as the quality of the analysed data. Bad data could be misleading, result in wrong decisions and potentially cost millions. Therefore, there are several requirements that data should meet:

  • Accuracy – the data correctly represents reality;
  • Completeness – there is no missing data;
  • Consistency – the data is in agreement across various sources;
  • Timeliness – the relevant data is up-to-date and available in time for analysis;
  • Uniqueness – there are no duplicated data items;
  • Validity – the data matches the defined requirements and syntax.
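The dimensions listed above can be turned into automated checks. The following is a minimal sketch using only the Python standard library; the record structure, field names, ID syntax and freshness threshold are all hypothetical, and a real data governance team would define such rules against its own schemas.

```python
import re
from datetime import date

# Hypothetical purchase-order records; field names are illustrative only.
records = [
    {"po_id": "PO-001", "amount": 500.0, "posted": date(2019, 3, 1)},
    {"po_id": "PO-001", "amount": 500.0, "posted": date(2019, 3, 1)},  # duplicate
    {"po_id": "PO-002", "amount": None,  "posted": date(2019, 3, 2)},  # incomplete
    {"po_id": "po_3",   "amount": 120.0, "posted": date(2018, 1, 5)},  # invalid id, stale
]

PO_PATTERN = re.compile(r"^PO-\d{3}$")  # assumed ID syntax for the validity check

def quality_report(records, today=date(2019, 3, 10), max_age_days=90):
    """Return (record index, failed dimension) pairs for each violated rule."""
    seen, issues = set(), []
    for i, r in enumerate(records):
        if any(v is None for v in r.values()):
            issues.append((i, "completeness"))   # no missing fields allowed
        if not PO_PATTERN.match(r["po_id"]):
            issues.append((i, "validity"))       # id must match the defined syntax
        key = (r["po_id"], r["amount"], r["posted"])
        if key in seen:
            issues.append((i, "uniqueness"))     # no duplicated data items
        seen.add(key)
        if (today - r["posted"]).days > max_age_days:
            issues.append((i, "timeliness"))     # data must be reasonably fresh
    return issues

for row, dimension in quality_report(records):
    print(f"record {row}: failed {dimension} check")
```

Accuracy and consistency are deliberately absent from this sketch: checking them requires comparing the data against reality or against other sources, which cannot be done within a single data set.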

Data can be incorrect in countless ways. In any case, when data issues are discovered, it is critical to identify their source and take an action to solve the problem. Moreover, data quality should be continuously checked, maintained and improved. While it is a good practice for companies to have data governance and data management teams and put data quality assurance procedures in place, in a data-driven organization everyone should treat data as a strategic resource and share the responsibility of ensuring high data standards.

3.3. Human limitations

Having reliable data and implementing state-of-the-art tools still does not guarantee the right decisions. Turning data into insight and solving complex business problems requires human involvement and a perspective that extends beyond technical skills. As explained in Section 2, human interaction is also an integral part of the visual analytics process. Unfortunately, humans are far from perfect. Two people working with the same set of data could come to different conclusions. It is also possible that, influenced by factors such as mood or tiredness, the same person makes different decisions when analysing the same set of data at different times. Furthermore, a person may have a hypothesis about the outcome of the analysis and only use data that confirms that hypothesis. People could also hold a strong belief about a subject matter and, in many cases, cling to it even when presented with information that contradicts their opinion.

To overcome these issues, every analysis should be approached in an objective and unbiased way. Communication and collaboration should be encouraged in order to confirm results and reach mutual understanding. Emotions should be taken out of the process. Furthermore, in order to avoid bias, decision makers need to be self-aware, think critically and challenge their own views. They should constantly seek more information and not only concentrate on internal data but also explore external sources that could potentially influence their decisions.

4. Visual Analytics Platform at Borealis

This section covers the implementation of a visual analytics platform at the procurement department of Borealis, one of the world’s largest producers of polyethylene and polypropylene.

4.1. Problem

As an international company that operates in over 120 countries, Borealis generates a significant amount of spend. Initially, the related data was analysed using a standard spreadsheet tool and presented in the form of static reports. There was no easy way to link data originating from different sources, which limited the possibilities for analysis. Moreover, with the growing amount of data and the addition of new sources, simple operations such as lookups took hours to complete. These issues emphasized the need for a solution capable of meeting the increasing requirements.

4.2. Software

It was important not only to combine the vast amount of information generated by different sources but also to make it available in the form of highly interactive dashboards that would allow the users to fully understand and explore the available data. Therefore, the software had to support the three layers of visual analytics solutions – data management, analytics and visualization. After carefully evaluating different tools, Qlik Sense was selected as a suitable solution capable of meeting all of these needs.

4.3. Project approach

The project was conducted in-house by a small team. While there was a general sequential approach (understanding the problem – designing a solution – implementation – support), the visual analytics platform was built with focus on flexibility, continuous improvement and customer satisfaction. This included user involvement in every step of the process and quick response to new requirements. Working in a small project team while constantly collaborating with the internal stakeholders proved to be a successful approach resulting in a smooth implementation process and great user satisfaction. Involving the users early in the project was also beneficial for the high adoption rate of the tool. Furthermore, a minimum amount of training was required due to the software’s intuitive interface and ease of use.

4.4. Functionality

The visual analytics platform brings all relevant information together and makes it fully transparent. Its cloud nature and responsive mobile-ready design make it easily accessible anywhere, anytime, on any device. Qlik Sense provides all procurement employees with a user-friendly interface and allows them to move from static and predefined reports to interactive dashboards that offer the ability to drill down into the data and gain valuable information in a matter of seconds. The platform enables insight into every aspect of the data in a self-service fashion, making it possible for users to slice, dice and examine it from different viewpoints.

Currently, the platform encompasses multiple apps (visual analytics applications) which cover different procurement areas. Some examples include:

  • Spend analysis – provides full insight into the company’s spend and enables managers and buyers to identify savings opportunities and make data-driven decisions. In addition to over 15 internal sources, the app also incorporates external data, e.g. market indices, so that users can understand how material prices are influenced by market developments and use predictive models to make forecasts.
  • Supplier relationship management – focused on the company’s main suppliers. Besides spend information, it also provides additional supplier profiles including strategic potential, performance, interaction model, personalized strategy, etc.
  • Total cost of ownership – highly relevant to one of Borealis’ business segments, this dashboard analyses component productivity and the cost structure of the final products.
  • N2P (Need-to-Pay) process – tracks the compliance to external regulations and internal procedures from creation of purchase requisition to invoice payment. The app makes it possible to spot and fix problems at the source, thus vastly improving the performance of the entire department.
  • Controlling – tracks the fixed costs development allowing users to quickly spot deviations from the business plan and conduct a further analysis by drilling down to individual cost centres.

In conclusion, the visual analytics platform drastically improved the way people use data in the procurement organization. It brings considerable value by significantly reducing the time previously spent on reporting and providing control over the entire spend and all internal processes, which allows users to analyse trends, discover hidden relationships, get actionable insights, and make informed business decisions.

In a Nutshell

Standard spreadsheet and reporting tools are no longer able to handle the steadily increasing amount of data and do not provide sufficient analytical capabilities. The use of visual analytics can be beneficial for organizations that seek to better understand their data and gain deeper insights. However, selecting a suitable software tool that aligns with the needs of the company is only one of the challenges. Implementing internal procedures to ensure high data quality and avoiding bias in the analysis are crucial factors in the process. Most importantly, a data-centred, analytical culture driven by a strong top-down approach is essential for the success of any initiative toward analytics and data-informed decision making.


This article was published in CFO aktuell (issue 2/2019). More information at: www.cfoaktuell.at


References

1 Keim et al., Visual Analytics: Definition, Process, and Challenges, in Kerren/Stasko/Fekete/North (eds.), Information Visualization (2008) 157.

2 Bhargava/D’Ignazio, Designing Tools and Activities for Data Literacy Learners (2015) 1.
