Author: Wiebke Apitzsch

From an interpersonal perspective: these are the six most common mistakes made in (big) data initiatives

12.07.2021

Although big data has been a subject of discussion for almost a decade now, many companies in Germany still struggle with the introduction and application of data analyses. Generally, this is not down to technical conditions or a lack of expertise: most mistakes are made at the human level. Wiebke Apitzsch, data expert and managing director of the management consultancy TTE Strategy, provides an overview of the six biggest failures in the implementation of data initiatives. To do so, she moves away from a purely functional perspective and adopts an interpersonal view.

  1. Technical complexity is overrated – the human factor is underrated

Companies looking to build data expertise for the first time often recruit external experts as project managers. The focus of their recruitment drive is on the greatest possible technical expertise in the field of data management. “And in doing so, they are focusing on the wrong thing,” says Wiebke Apitzsch. “Corporate management must bear in mind the fact that unfamiliar people will bring unfamiliar things to the company. Expertise helps here. But, above all, it is about the ability to approach people, understand their position, and win them over. Those who are unable to build trust will fail in the face of internal structures.” For this reason, the expert advises first scouting in a targeted way for potential candidates whose strengths lie in non-directive management, who possess a high degree of agility, and who already offer change expertise in a technical and/or IT environment. Providers of relevant software generally offer detailed expertise anyway when it comes to data. Apitzsch says: “Many companies conduct their search externally despite already having the right candidate in the company. Identifying this person is the key to a successful data introduction.”

  2. The data system is established using pressure – so no followers will be recruited

Data initiatives often come from corporate management – frequently cheered on by finance and accounting, who want better management figures. Wiebke Apitzsch says: “If this is packaged tactlessly, the data initiative can feel like a new means of control from above. A number of brightly colored dashboards will be provided, but their significance will be rather low.” The reason for this: if managers such as divisional or plant managers are not sufficiently involved in the process, they will continue to find reasons later for the values in the dashboard not reflecting reality. With their detailed knowledge of their field, they can always boycott activities based on a data initiative. “I have frequently witnessed management attempting to combat this with pressure, which only leads managers to find ever more ‘creative’ ways to impede making the data usable. In the worst-case scenario, they feel that the changes threaten their leadership authority and become intransigent.” What companies should do differently: “Big data needs followers,” says Wiebke Apitzsch. All relevant managers should be included in the initiative at an early stage. It must be made clear how the new data initiative benefits, first and foremost, those who generate the data. Apitzsch: “If I explain to plant managers how the new data management system will let them significantly reduce downtimes in the future, it becomes interesting for them. If they and their teams have control over these data, can test things carefully, and see success, this will build trust. This forms a basis that can be built on. If I were to explain that I want to measure the performance of their plant in the future – then the opposite can be expected.”

  3. The right questions are not being asked – because of misplaced humility in the face of data experts

“You may have witnessed this too: a data team presents its agenda and the next steps. In the process, it uses an array of technical terms, like ‘APIs’ and ‘data cleansing’, and asks if all that is clear so far. Hardly anyone from management really understands what is being discussed. But no questions are asked. As a result, questionable initial decisions are made that are almost impossible to undo further down the line,” says Wiebke Apitzsch. “When I question this, it often becomes apparent that even top managers didn’t have the confidence to ask the crucial questions because they felt that they couldn’t keep pace with the experts. This misplaced humility can come at a high price. It is like building a house quickly and realizing later that it has no basement.” Apitzsch recommends asking for a definition of each unfamiliar term, and having the confidence to admit when technical correlations have not been understood. “There is absolutely nothing wrong with asking your experts to provide the appropriate translation. In most cases I am aware of, it isn’t a lack of receptiveness on the part of decision makers when things are not clear. It is generally down to the complicated phrasing of those speaking. And they, in turn, can’t be expected to know what was unclear if they keep being told it all makes sense. My advice: be courageous; if in doubt, ask for correlations to be explained several times. Everyone will thank you for asking questions.”

  4. If no specific problems are solved with the new data, the benefit of the initiative as a whole will be questioned

Many initiatives begin with the required data points being defined and subsequently collected from above. They are presented on dashboards in meetings. “But they don’t go to those who would really benefit from them,” says Wiebke Apitzsch. Data then become an end in themselves and an initiative that has just launched will quickly be queried from a number of quarters. “The best thing is to speak to those who will need to provide the data – and listen to their gut feeling regarding which data should be collected and what should then be done with them,” says Apitzsch. “In this way, the data will quickly find their way back to an operational application. Plant managers, for example, often have a feeling about where they are lacking relevant information. This should be the starting point. Shared goals should be targeted, thus building motivation for extending data initiatives.”

  5. IT and IT security are integrated too late, if at all – and the new tools no longer fit the system infrastructure

Many data initiatives come from operational business and are implemented with the aid of external advisers, who build tools for data collection and analysis that are then piloted in operational use. And, during the next system update, the tools suddenly stop working or violate security standards. Wiebke Apitzsch says: “It happens more often than you’d think: because management or individual business divisions wanted to drive their own initiative as quickly as possible, they simply forgot to integrate key internal stakeholders like IT into the process. It isn’t as though their own people would have held them back.” The result is always the same: a lot of money has been spent and the initial result is pleasing. Then the IT department takes over and, despite the best of intentions, the tools can’t be integrated into the existing system infrastructure. They have to be changed or, in the worst-case scenario, redeveloped. “This is why IT and IT security have to have a place at the table right from the very start of a data initiative,” says Apitzsch. “Anyone who believes they can circumvent their own organization will almost invariably be shown the error of their ways when it comes to IT, after a few weeks at the very latest.”

  6. Data don’t lie – actually, they do!

Companies often labor under the misconception that they will understand the whole truth in the future – just because they have set up a data management system. “But that doesn’t help at all if the data are plain wrong,” says Wiebke Apitzsch. “And they are – to a significant degree.” Sensors that are not correctly attached can measure things incorrectly in plants; financial figures are transferred incongruously from the system; data are not entered consistently. But above all, their interpretation is not thought through in detail. For example, data from test runs for new products or during major maintenance work have to be excluded. “In my experience, ensuring the quality of the data is often around three times more laborious than establishing data management processes. Data must be continually scrutinized, requalified, and cleansed. View your data with suspicion. Always. Time and again. Ask the people who work where the data are collected whether they reflect reality. If they agree, you are heading in the right direction. That is my closing advice.”
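The cleansing step Apitzsch describes (excluding records from test runs, and scrutinizing values that cannot reflect reality) can be sketched in a few lines of Python. This is a minimal illustration, not her method: the field names (`value`, `is_test_run`) and the plausibility threshold are assumptions invented for the example.

```python
# Illustrative sketch of a data-cleansing step: drop test-run records
# and hold back implausible sensor values for manual review.
# Field names and the plausible range are assumed, not from the article.

PLAUSIBLE_RANGE = (0.0, 200.0)  # assumed valid sensor range

def cleanse(readings):
    """Split readings into usable records and records needing review."""
    kept, flagged = [], []
    for r in readings:
        if r.get("is_test_run"):  # exclude test runs / maintenance work
            continue
        lo, hi = PLAUSIBLE_RANGE
        if lo <= r["value"] <= hi:
            kept.append(r)
        else:
            flagged.append(r)     # possibly a miscalibrated sensor
    return kept, flagged

readings = [
    {"value": 85.2,  "is_test_run": False},
    {"value": 999.0, "is_test_run": False},  # implausible reading
    {"value": 90.1,  "is_test_run": True},   # test run, excluded
]
kept, flagged = cleanse(readings)
# kept holds one usable record; flagged holds the implausible one
```

The point of keeping flagged records separate, rather than silently dropping them, echoes the advice above: the people who work where the data are collected should be asked whether the suspicious values reflect reality.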
