How understanding and trust in data informs business decisions
There is a disconnect between how well organizations understand and trust their data and how that data informs business decisions, a Syncsort survey reveals, even though most respondents rated their organization’s data quality as either “good” (38%) or “very good” (27%).
Sixty-nine percent of respondents said their leadership trusts data insights enough to inform business decisions, yet only 14 percent said stakeholders have a very good understanding of the data. And of the 27 percent who rated their data quality as sub-optimal, 72 percent said it negatively impacted business decisions.
Multiple data sources, governance and volume are top data quality challenges
- The top three challenges companies face in ensuring high-quality data are multiple sources of data (70%), applying data governance processes (50%) and the volume of data (48%).
- More than three-quarters (78%) have challenges profiling or applying data quality to large data sets.
- Twenty-nine percent say they have a partial understanding of the data that exists across their organization, while 48 percent say they have a good understanding.
Data profiling tool adoption low, leading to lack of visibility into data attributes
- Fewer than half of respondents use a data profiling tool or data catalog.
- Instead, respondents rely on other methods to understand their data: more than 50 percent use SQL queries and over 40 percent use a BI tool.
- Of those who reported partial, minimal or very little understanding of their data, the top three attributes respondents lacked visibility into were: relationships between data sets (63%), completeness of data (56%) and validation of data against defined rules (56%), checks illustrated in the sketch after this list.
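The three attributes respondents lack visibility into are exactly what basic data profiling surfaces. Below is a minimal sketch in Python with pandas showing what each check looks like in practice; the file names, column names and the e-mail validation rule are illustrative assumptions, not anything from the survey or a particular tool.

```python
import pandas as pd

# Hypothetical data sets; file and column names are illustrative.
customers = pd.read_csv("customers.csv")
orders = pd.read_csv("orders.csv")

# Completeness of data: share of non-missing values per column.
completeness = 1 - customers.isna().mean()
print(completeness)

# Validation against a defined rule: e-mail values must match a simple pattern.
valid_email = customers["email"].str.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", na=False)
print(f"valid e-mail rate: {valid_email.mean():.1%}")

# Relationships between data sets: every order should reference a known customer.
orphaned = ~orders["customer_id"].isin(customers["customer_id"])
print(f"orphaned orders: {orphaned.sum()}")
```

A data profiling tool or data catalog automates checks like these across many sources and at scale; the sketch only shows the kind of visibility such checks provide.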
The consequences of poor data quality are wide-ranging, from customer dissatisfaction to barriers to emerging technology adoption
- Of those who reported fair or poor data quality, wasted time was the number one consequence (92%), followed by ineffective business decisions (72%) and customer dissatisfaction (67%).
- Twenty-five percent of respondents who reported sub-optimal data quality say it has prevented their organization from adopting emerging technology and methods (such as AI, ML and blockchain).
- Only 16 percent of respondents are confident they aren’t feeding bad data into AI and ML applications.
- Seventy-three percent are using cloud computing for strategic workloads, yet 48 percent of them have only a partial understanding, or none at all, of the data that exists in the cloud. Twenty-two percent rate the quality of their data in the cloud as fair or poor.
“This survey confirms what we’ve been seeing with our customers – that good data simply isn’t good enough anymore,” said Tendü Yoğurtçu, CTO, Syncsort.
“Sub-optimal data quality is a major barrier, especially to the successful, profitable use of artificial intelligence and machine learning. The classic phrase ‘garbage-in, garbage-out’ has long been used to describe the importance of data quality, but it has a multiplier effect with machine learning – first in the historical data used to train the predictive model, and second in the new data used by that model to make future decisions.”