Measuring data quality - 3 tools for evaluation

10.10.2022
Master data management
An article by:
Romana White

Poor data quality is a constant concern for companies. The topic comes up again and again, yet hard figures on the problem are scarce. Three tools help to measure data quality and its costs.

The topic of data quality in master data management is a bit like the film Groundhog Day. Much like Phil Connors, played by Bill Murray, you get the feeling of being stuck in a time loop, figuratively reliving the same day over and over again. The importance of data quality, and what companies should do about it, is emphasized again and again, while in the same breath the widespread mediocre-to-poor state of data quality is lamented. It is not that nothing is being done about the topic; a lot is already happening. Yet the impression remains that the importance of data quality has not really sunk in where it matters most, especially not at the executive level.

Costs of poor data quality

This could be due to the fact that many companies fail to make the costs of poor data quality transparent. According to the 2017 Gartner Data Quality Market Survey, nearly 60 percent of organizations do not measure the annual financial cost of poor quality data. "Failing to measure this impact leads to reactive responses to data quality issues, missed business growth opportunities, increased risk and lower ROI," says Mei Yang Selvage, Research Director at Gartner.

"Leading information-driven organizations actively measure the value of their information assets as well as the cost of poor quality data and the value of good quality data," Selvage continues. "Most importantly, they link this directly to key business performance metrics." Poor data quality hits organizations where it hurts - the money. According to the Gartner survey, the average annual financial cost per company is 15 million US dollars. These are the direct costs. But companies are not only affected financially. Poor data quality practices undermine digital initiatives, weaken their competitiveness and sow distrust among customers, emphasizes Salvage.

Want more facts? Gladly. Thomas C. Redman, founder of Data Quality Solutions and known in the community as "the Data Doc", estimates that bad data costs most companies 15 to 25 percent of turnover. This estimate from the end of November 2017 was not published just anywhere, but in the MIT Sloan Management Review. It is based on studies by Experian (bad data costs companies worldwide 23 percent of turnover) and on figures from the consultants James Price of Experience Matters (20,000 US dollars per employee for bad data) and Martin Spratt of Clear Strategic IT Partners (16 to 32 percent unnecessary expenditure on data).

The total cost to the US economy: an estimated 3.1 trillion US dollars per year, according to IBM. That's a lot of money. And the costs incurred by companies due to angry customers and bad decisions are not even measurable - but they are enormous in any case. So much for the bad news. The good news is that, according to Redman, an estimated two thirds of measurable costs can be identified and permanently eliminated.

One might object that these are primarily figures from the USA and that the situation in Germany, as far as data quality is concerned, is much better. Unfortunately, the author's many years of experience do not bear this out. Moreover, all known empirical studies on data quality in Germany point in the same direction: there is an urgent need for action.

To help you make data quality visible and measurable in the future, we present three methods for analyzing data quality and evaluating the follow-up costs. The first two are manual analyses; the third is a software-based solution for data in SAP systems.

  1. Status survey: the "Friday Afternoon" measurement method
  2. Showing costs: the "rule of ten" measurement method
  3. Comprehensive analysis: the zetVisions Data Quality Analyzer (DQA)

1. Status survey: the "Friday Afternoon" measurement method

If you want to get to the bottom of the problem in your own company, we recommend the "Friday Afternoon" measurement method developed by Thomas Redman ("Friday Afternoon" because the method can be applied in an hour or two on a quiet Friday afternoon without much effort). The first step is to collect the records of the last 100 work units completed in a group or department: if the group processes purchase orders, then the last 100 purchase orders, and so on. Then 10 to 15 critical data attributes are defined, i.e. attributes that must be complete, error-free and consistent for the records to be usable at all.

Everything is entered into a table with 100 rows for the data records and 10 to 15 columns for the data attributes. In the next step, a small group of people goes through the data record by record and color-codes the obvious errors in the corresponding table cells. At the end, each record is marked as either perfect (no color markings) or flawed (markings present), and the number of perfect records is added up. This number, which can be between 0 and 100, is also the percentage of correctly created data.
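Counting out the score by hand is perfectly feasible for 100 records, but the tally is also easy to automate once the color markings have been captured. The following is a minimal Python sketch of that scoring step; the attribute names and the way errors are recorded are purely illustrative assumptions, not part of Redman's method.

    # Minimal sketch of the "Friday Afternoon" scoring step.
    # Assumption: the color-coded review has been captured as a list of records,
    # each a dict that maps a critical attribute to True (error marked) or
    # False (no error). Attribute names are illustrative only.
    records = [
        {"customer_name": False, "postal_code": False, "payment_terms": False},
        {"customer_name": False, "postal_code": True,  "payment_terms": False},
        {"customer_name": False, "postal_code": False, "payment_terms": False},
        # ... up to 100 reviewed records in a real run
    ]

    # A record counts as "perfect" only if none of its critical attributes is marked.
    perfect = sum(1 for record in records if not any(record.values()))

    # With exactly 100 records, the number of perfect records is also the
    # percentage of correctly created data, i.e. the data quality score (0-100).
    score = perfect / len(records) * 100
    print(f"{perfect} of {len(records)} records are error-free "
          f"(score: {score:.0f})")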

Redman, together with Tadhg Nagle and David Sammon, conducted this survey over a two-year period with 75 executives from various companies and industries, government agencies and departments such as customer service, product development and human resources. The results were staggering: only 3 percent of the data sets examined fell into the "acceptable" error range (at least 97 correct records out of 100 counted as "acceptable").

Almost 50 percent of the newly created data records contained at least one critical error. The exercise also showed that no industry, government agency or department is immune to the distortions of poor data quality. The conclusion: data quality is in far worse shape than most executives realize. Anyone without clear evidence to the contrary, warns Redman, has to assume that their own data is no better.


2. Showing costs: the "rule of ten" measurement method

The "Friday Afternoon" measurement method provides certainty. And to illustrate the monetary consequences of your own findings, the so-called "rule of ten" is used. Of course, the costs cannot be measured exactly, but the rule of ten, according to Redman, helps to at least get a realistic idea of them.

The rule states that "it costs ten times as much to complete a unit of work when the data is flawed in any way as it does when the data is perfect". As an example, take the group that processes purchase orders again and assume it has to process 100 of them per day. Processing one purchase order costs 5 euros if the data is perfect. If the data for all 100 purchase orders is perfect, the total daily cost is 100 x 5 euros = 500 euros. If only 82 purchase orders have perfect data records and 18 contain errors, the calculation becomes: (82 x 5 euros) + (18 x 50 euros) = 410 euros + 900 euros = 1,310 euros. A cost increase of 162 percent.
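The same back-of-the-envelope calculation can be written out as a short Python sketch so it can be rerun with your own figures; the numbers below are simply the illustrative values from the example above, not benchmarks.

    # "Rule of ten" example: 100 purchase orders per day, 5 euros per order
    # with perfect data, ten times that (50 euros) per order with flawed data.
    cost_per_unit = 5.0                          # euros, with perfect data
    factor = 10                                  # rule of ten
    total_units = 100                            # purchase orders per day
    perfect_units = 82                           # e.g. from the Friday Afternoon score
    flawed_units = total_units - perfect_units

    baseline = total_units * cost_per_unit
    actual = perfect_units * cost_per_unit + flawed_units * cost_per_unit * factor
    increase = (actual - baseline) / baseline * 100

    print(f"Baseline cost: {baseline:.0f} euros")    # 500 euros
    print(f"Actual cost:   {actual:.0f} euros")      # 1310 euros
    print(f"Cost increase: {increase:.0f} percent")  # 162 percent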

The rule of ten does not take into account non-monetary costs such as lost customers, poor decisions or reputational damage.

3. Comprehensive analysis: the zetVisions Data Quality Analyzer (DQA)

The zetVisions Data Quality Analyzer (DQA) is a software-based solution for obtaining a reliable statement about the current status of data quality in SAP systems. It gives companies a clear picture of how good or bad their data quality actually is: the "perceived data quality" can be verified and backed up with facts.

Once DQA has been installed on the SAP system, the individual master data domains, such as customer/supplier or business partner, product and material master data, are inspected. DQA scans the data for inconsistent, duplicate, incomplete and outdated data records according to individually defined parameters.

KPIs are used to rate the data records as good, sufficient or poor. The DQA checks the defined rule sets at fixed intervals and visualizes the improvement trend on a dashboard. When a record with deficiencies is detected, a correction task is automatically triggered and sent to the responsible employees by email.
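How such rule-based checks work in principle can be illustrated with a short, generic Python sketch. It is not the DQA implementation and says nothing about its actual rule engine; the field names, rules, thresholds and rating logic are made-up examples of the kind of completeness, format and freshness checks described above.

    # Generic illustration of rule-based master data checks; NOT the DQA's
    # actual logic. Field names, rules and thresholds are assumptions.
    from datetime import date

    def check_record(record: dict) -> list[str]:
        """Return the rule violations found in a single supplier record."""
        issues = []
        if not record.get("name"):
            issues.append("missing name")
        if not record.get("vat_id"):
            issues.append("missing VAT ID")
        postal_code = record.get("postal_code", "")
        if not (postal_code.isdigit() and len(postal_code) == 5):
            issues.append("postal code is not a 5-digit number")
        last_updated = record.get("last_updated")
        if last_updated and (date.today() - last_updated).days > 730:
            issues.append("record not updated for more than two years")
        return issues

    def classify(issues: list[str]) -> str:
        """Map the number of violations to a simple quality rating."""
        if not issues:
            return "good"
        return "sufficient" if len(issues) == 1 else "poor"

    record = {"name": "Example Supplier GmbH", "vat_id": "",
              "postal_code": "6917A", "last_updated": date(2019, 3, 1)}
    issues = check_record(record)
    print(classify(issues), issues)   # -> poor, plus the list of violations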

Before purchasing master data management software, management and stakeholders must be convinced of the necessity of this investment - no easy task. A fact-based understanding of data quality supports the proposal or even the decision to purchase software.

The upcoming, unavoidable conversion of all SAP ERP systems to SAP S/4HANA poses challenges for many companies. Poor data quality leads to delays in the migration, and rising costs and disrupted business processes are the inevitable result. Improving data quality should therefore be at the top of the agenda today to ensure that the changeover to SAP S/4HANA runs smoothly.


