Why Good Data Goes Bad

 
 
By Dennis McCafferty  |  Posted 01-27-2016

Only a minority of top executives express high confidence in their organization's data quality management (DQM), according to a recent survey from Blazent. The resulting report, titled "The State of Enterprise Data Quality: 2016 Perception, Reality and the Future of DQM," indicates that nearly all organizations expect data volumes to keep growing. That should be welcome news, since these executives believe data initiatives can increase revenues, reduce costs and improve customer satisfaction. But doubts about data integrity, accuracy and consistency keep organizations from maximizing the value of this asset.

Data migration and conversion projects account for many of the difficulties, but the biggest problems trace back to the employees who perform data entry. "There is a disconnect between those persons held accountable for data quality and those that are responsible for its capture and use," according to the report. "While the IT department is mainly held accountable, the originators of data (e.g., employees, cross-functional teams, others) are not responsible for data quality upon capture or entry. IT departments are burdened with the task of employing multiple cleansing technologies to compensate. Some of those means are rudimentary and manual in nature, and apparently oblivious to the originators or curators of data … The gap between those held accountable for data quality and those responsible for its capture and use is opaque and problematic. It leads to a lack of empathy between the two constituencies."

Some 200 C-level, senior IT and key business decision-making executives took part in the research, which was conducted by 451 Research.
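To make the capture-versus-cleansing gap concrete, here is a minimal sketch (not drawn from the report) of moving simple quality checks to the point of data entry, so that downstream cleansing tools in IT have less to compensate for. The record fields, validation rules and country whitelist are hypothetical, chosen only for illustration.

    # A minimal sketch (not from the Blazent report) of validating data at the
    # point of capture instead of cleansing it downstream in IT. The record
    # fields, rules and country whitelist here are hypothetical.
    import re

    EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

    def validate_at_capture(record: dict) -> list[str]:
        """Return a list of problems; an empty list means the record may be saved."""
        problems = []
        if not record.get("customer_name", "").strip():
            problems.append("customer_name is required")
        if not EMAIL_RE.match(record.get("email", "")):
            problems.append("email is malformed")
        if record.get("country") not in {"US", "CA", "GB", "DE"}:  # hypothetical whitelist
            problems.append("country code is not recognized")
        return problems

    # Usage: reject bad input at entry, so cleansing never has to find it later.
    issues = validate_at_capture({"customer_name": "Acme", "email": "not-an-email"})
    if issues:
        print("Rejected at capture:", issues)

Checks like these will not catch every defect, but they place responsibility for quality where the report says it is currently missing: with the originators of the data.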

 
 
 
 
 
Dennis McCafferty is a freelance writer for Baseline Magazine.