They call this the age of “big data,” but sometimes it seems an awful lot like the age of “bad data.”
According to a recent study by data scientists at IBM, one in three business leaders does not trust the data they use to make decisions. Further, the study claims that the annual cost of poor data quality is $3.1 trillion in the U.S. alone.
Poor data quality isn’t just an enterprise IT problem. It affects organizations of all shapes and sizes. By undermining fundamental decision-making processes, poor data impacts all business operations — from sales and marketing to budgeting and distribution. A big part of the problem is that we’re being inundated with data but haven’t developed the processes or organizational structure to properly verify its accuracy or value.
Research commissioned by Veritas Technologies finds that data stores are growing by as much as 65 percent annually, but a good chunk of this data is just wasting valuable time and resources. In fact, the report finds that 85 percent of all data being processed and stored is functionally useless.
The report claims 52 percent of this data is considered “dark,” meaning its value is unknown. Another 33 percent is considered redundant, obsolete or trivial (ROT), meaning it is known to have no value. A separate study finds that 41 percent of data hasn’t been accessed or modified in three years.
These studies illustrate the dangers of the “save everything” mentality that most organizations take with their data. They tend to hoard data even though they have no visibility into what they’re storing or what it’s worth. That is an expensive strategy, considering the cost of storing 1,000TB of non-critical data is estimated at more than $650,000 per year.
Storing and managing data that has questionable or no value not only wastes money and resources, but also makes it more difficult to manage data that does have value. This makes organizations less responsive, slows decision-making, and increases risks related to security, regulatory compliance and e-discovery. The problem is only getting worse as mobile, cloud and big data initiatives drive an explosion in the number of data sources.
For these reasons, it is imperative that businesses of all sizes develop a comprehensive data governance framework to ensure the availability, accuracy, integrity and security of their information assets. It is important to understand up front that this involves more than installing some software and watching it work.
A sound data governance program requires a framework for identifying who owns and is accountable for data assets. The program should include procedures for storing, archiving, backing up and securing data. It should also clarify who should have access to data and for what purpose, and ensure that compliance requirements are being met.
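As a rough illustration only, the ownership-and-access model described above can be sketched in a few lines of code. All names and fields here are hypothetical, not taken from any particular governance tool: each data asset gets a named, accountable owner, a classification, a retention period, and an explicit list of roles allowed to use it.

```python
from dataclasses import dataclass, field

# Hypothetical registry entry for a governed data asset. A real data
# governance program would back this with cataloging and policy tooling;
# this sketch only shows the shape of the accountability model.
@dataclass
class DataAsset:
    name: str
    owner: str                       # who is accountable for this asset
    classification: str              # e.g. "public", "internal", "restricted"
    retention_days: int              # how long to keep it before archiving
    allowed_roles: set = field(default_factory=set)

def can_access(asset: DataAsset, role: str) -> bool:
    """Grant access only if the asset is public or the role is
    explicitly on the asset's allowed list."""
    return asset.classification == "public" or role in asset.allowed_roles

# Example: a restricted customer dataset with a named owner and a
# seven-year retention requirement (an assumed compliance rule).
customers = DataAsset(
    name="customer_records",
    owner="sales_ops",
    classification="restricted",
    retention_days=365 * 7,
    allowed_roles={"sales", "compliance"},
)

print(can_access(customers, "sales"))      # prints True
print(can_access(customers, "marketing"))  # prints False
```

The point of the sketch is that access and retention decisions become answerable questions ("who owns this?", "who may read it?", "when can it be deleted?") rather than defaults inherited from wherever the data happened to land.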
Working with a managed services provider (MSP) can prove beneficial for organizations looking to establish a data governance framework. MSPs often have expertise with data management, access controls and regulatory compliance requirements. A third-party provider also brings an objective perspective, and can help develop a strategic approach that cuts across organizational boundaries.