First published online 29th March 2005
As data sets get larger and software systems more complex and powerful, data owners are being forced further and further away from their data.
Whether you hold data as an information resource, or for marketing, CRM (Customer Relationship Management), database marketing, direct marketing or any other purpose, the success of your projects and your ability to comply with the Sarbanes-Oxley corporate governance act depend on the quality of your data.
Integrated data systems and the software layers on top of your data often hide both the data and its quality problems, and rarely resolve them. Technical aspects of large data projects, such as the use of SQL, conspire to isolate data from the user. Very few people now have the luxury of being able to browse through their data and let their brains, the best data quality tool that exists, work on that data.
This is bad news for data quality. My own analyses of international address data, even after cleaning, show that sometimes more than 50% of records can be incorrect. If a company is analysing its sales, for example, on that basis, it could be making disastrous business decisions.
Data and information quality cannot be assured simply by adding new system or software layers. The distance between data owners and their data needs to be reduced again. A few moments of their time with their data can save enormous amounts of time and money in data quality work, and show immediately the data quality issues present in that data.
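To illustrate how little effort that first look can take, here is a minimal sketch of a quick profiling pass over address records. The field names and sample records are invented for illustration, and the checks (empty fields, non-two-letter country values) are deliberately simple stand-ins for real validation rules:

```python
# Hypothetical address records with typical quality problems:
# a missing city, a country name instead of a code, a missing name.
records = [
    {"name": "A. Jansen", "city": "Amsterdam", "postcode": "1012 AB", "country": "NL"},
    {"name": "B. Smith",  "city": "",          "postcode": "SW1A1AA", "country": "UK"},
    {"name": "C. Dupont", "city": "Paris",     "postcode": "75001",   "country": "FRANCE"},
    {"name": "",          "city": "Berlin",    "postcode": "10115",   "country": "DE"},
]

def profile(records):
    """Return (record index, field, problem) tuples for simple quality checks."""
    issues = []
    for i, rec in enumerate(records):
        # Flag any field left empty or containing only whitespace.
        for field, value in rec.items():
            if not value.strip():
                issues.append((i, field, "empty value"))
        # Flag country values that are not two-letter ISO-style codes.
        if len(rec["country"]) != 2:
            issues.append((i, "country", "not a 2-letter code"))
    return issues

for row, field, problem in profile(records):
    print(f"record {row}: {field}: {problem}")
```

Even this toy pass surfaces three problem records out of four, which is the point: the issues are visible the moment someone actually looks.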
(c) 2005 Graham Rhind. Reproduction only allowed with permission. Comment and dialogue welcome.
http://www.grcdi.nl