Big Data: How companies make their data talk

It’s a horror scenario for any company: losing one of your highest-revenue customers. Management wants to know how the loss will affect the operating result, and the consequences for turnover, inventory, material costs and personnel costs also need to be identified as quickly as possible. In theory, this is the moment for the controlling function. In practice, however, it is often hopelessly overwhelmed. In his article, enomyc author Peter Kink describes how companies can quickly gain an overview through targeted mass data analysis, and why, despite everything, corporate management remains a managerial domain.

Once the initial panic triggered by the news of losing a major customer has subsided, events in many medium-sized companies usually unfold like this: with great effort, key figures such as incoming orders or the order backlog of the remaining customers are compared, in absolute and relative terms, with existing costs, historical data and the plan. Lists are generated from the ERP system, which employees then analyse using complex formulas. After what feels like an eternity, the results are presented to management. What nobody knows: are the figures even valid? And if so, what conclusions can be drawn from them?

In such a situation, an analysis of mass data can provide valuable assistance – provided it is used and implemented correctly.

ERP systems rarely provide the information you are looking for

Most companies today work with ERP systems comprising different modules. However, the analysis options of these systems are still limited. Even an add-on does not necessarily deliver the desired result for a specific case, especially if the analysis requires master data in the ERP system that is updated daily.

In addition, the formulas, usually maintained in Microsoft Excel, must be technically flawless. Both nested IF functions and array formulas can quickly lead to technical problems.
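To illustrate why such formulas become fragile, here is a minimal sketch, assuming a hypothetical stock list with a days_in_stock column: a stock-aging classification that would require a deeply nested IF chain in Excel becomes a single, explicit binning step in Python with pandas.

import pandas as pd

# Hypothetical stock list: one row per item (all names are illustrative).
stock = pd.DataFrame({
    "item": ["A", "B", "C", "D"],
    "days_in_stock": [12, 95, 200, 400],
})

# In Excel this is often a nested IF chain such as
#   =IF(B2<=30,"fresh",IF(B2<=180,"aging",IF(B2<=365,"old","obsolete")))
# which breaks silently when a threshold changes. pd.cut keeps all
# thresholds explicit and in one place.
bins = [0, 30, 180, 365, float("inf")]
labels = ["fresh", "aging", "old", "obsolete"]
stock["age_class"] = pd.cut(stock["days_in_stock"], bins=bins, labels=labels)

print(stock)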

Another reason why analysing mass data is becoming increasingly difficult is the growing complexity of company processes: increasing regulatory requirements and historically evolved IT landscapes with numerous systems and interfaces further complicate the analysis of large volumes of data.

This makes it all the more important to be able to analyse mass data as efficiently as possible and in real time in order to make important tactical and strategic business decisions on this basis.

The clearer the goal, the more meaningful the result

In mass data analyses, a cluster of data is aggregated according to certain criteria, or simply consolidated, in order to obtain a better overview of a situation. As a rule, the data is available in the form of simply structured lists that are aggregated to a higher level for this purpose.

In this way, for example, an order backlog recorded at daily level can be extrapolated to months or years, sales can be displayed not only per order but also at customer level, throughput times of production steps can be aggregated at machine level, or current assets can be analysed by age of stock. Put simply, an informative, easy-to-read overview with just a few rows and columns is generated from an unwieldy table of potentially hundreds of thousands of records; it shows both absolute and relative values and can even depict trends.
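As a minimal sketch of such an aggregation, assume a flat order list exported from the ERP system (the file and column names, such as customer and revenue, are purely illustrative). With pandas, hundreds of thousands of order rows condense into a customer-level overview with absolute and relative values:

import pandas as pd

# Hypothetical flat ERP export: one row per order.
orders = pd.read_csv("order_export.csv", parse_dates=["order_date"])

# Aggregate from order level to customer level: total revenue and
# order count per customer, sorted by revenue.
overview = (
    orders.groupby("customer", as_index=False)
          .agg(total_revenue=("revenue", "sum"),
               order_count=("revenue", "size"))
          .sort_values("total_revenue", ascending=False)
)

# Add each customer's relative share of total revenue.
overview["revenue_share_pct"] = (
    overview["total_revenue"] / overview["total_revenue"].sum() * 100
).round(1)

# A few rows and columns instead of hundreds of thousands of records.
print(overview.head(10))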

Valid data provides the necessary basis for assumptions

Thanks to its expertise and professional experience, good management has the most important company key figures at its fingertips even without mass data evaluation. This in-depth understanding of the company can be verified and validated using the analysis options outlined above, and even supplemented with information that was not previously available, such as trends or insights into areas of the company that earlier business analyses did not consider at all. This is precisely the strength of good mass data analysis.

One misconception, though unfortunately common practice, is the assumption that simply analysing mass data will reveal patterns or yield specific measures on its own, true to the motto: “Let’s just analyse it for now. Something’s bound to come out of it.”

Before mass data is analysed, the aim and purpose of the analysis should be known and clearly formulated. Is it an analysis for strategic objectives? And who is the target group for the analyses? What the result needs to look like should also be defined in advance: for example, the result of a comprehensive analysis can be a bar chart showing a time series horizontally and specific values vertically, or a map showing the distribution of customer sales, gross profits or results.
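To make this concrete, a chart of the kind described could be produced along the following lines; this is only a sketch with invented figures, not a prescribed output format:

import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical monthly revenue series, already aggregated from raw data.
monthly = pd.Series(
    [1.8, 2.1, 1.9, 2.4, 2.2, 2.6],
    index=["Jan", "Feb", "Mar", "Apr", "May", "Jun"],
)

# Time series on the horizontal axis, values on the vertical axis.
monthly.plot(kind="bar")
plt.ylabel("Revenue (EUR m)")
plt.title("Monthly revenue trend")
plt.tight_layout()
plt.show()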

Excel dominates, but has its limits

The most frequently used tool for short-term analyses is still Microsoft Excel. In theory, around one million records (one row per record) can be held in a simple table and then summarised into a clear matrix using a pivot table. With this amount of data, however, crashes must be expected because of the computing power required.
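The same pivot step can be sketched outside Excel. Assuming a hypothetical CSV export with customer, order_date and revenue columns, pandas builds an equivalent summary matrix in memory and copes with a million rows far more gracefully:

import pandas as pd

# Hypothetical ERP export with up to a million rows, one row per record.
df = pd.read_csv("erp_export.csv", parse_dates=["order_date"])
df["month"] = df["order_date"].dt.to_period("M")

# Equivalent of an Excel pivot table: revenue per customer and month,
# condensed into a clear matrix.
pivot = pd.pivot_table(
    df,
    values="revenue",
    index="customer",
    columns="month",
    aggfunc="sum",
    fill_value=0,
)
print(pivot)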

In addition, there are analyses, such as reading machine data or movement data in logistics, that may comprise well over one million records. In such cases, tools such as Qlik, Power BI, Tableau, IBM Cognos, SAP BusinessObjects, Looker or Domo are ideal: they specialise in such applications and can evaluate large amounts of data in real time. These tools can also be set up as a “cockpit” and used permanently to read out key figures “at the touch of a button” for corporate management purposes.

A qualified consultant can provide useful support in precisely this situation, with expertise in all areas of the company, especially operations and finance. Thanks to a high level of methodological expertise, they should also be able to analyse extensive volumes of data and provide management with recommendations for further action in the form of an action plan. Important points, such as annual cost or revenue effects, should already be identified for each measure in the implementation plan.
