Data Quality: Is It Up to IT Alone?
Data quality is a priority across all industries trying to harness best practices, yet most companies don't even have initiatives in place, according to Ted Friedman, vice president of research at Gartner. During his Tuesday keynote, "Data Quality Trends and Best Practices," at Firstlogic's iSummit LIVE in New York, Friedman also said that many IT professionals know their companies have a data quality problem, but that they believe others within their organization assume the problem is for IT to solve.
"This is a business issue, this is not an IT issue, and the only way you're going to be successful is if you put some accountability on business and engage the business side of the house," Friedman said. "The typical rhetoric you hear is 'Can't we just throw some technology at it?' But data is a viable business asset. You can't do it all with technology--that's the key point. You're trying to make this leap across the chasm. We've made some progress to get some movement going to make this leap, but there's too much denial... the push [is from] the IT side, rather than...from the business side, which is what you want," Friedman said.
A simple but significant starting point for companies is to define what data quality means--anything from the validity of data to structural consistency to qualities that generally can't be assessed objectively, such as how believable the data is. Realizing that people are the biggest negative influence on data quality success is also important, according to Friedman. "People don't treat data as an asset, they treat it as a necessary evil. But you can do something to change the culture, use examples of success stories, deploy controls. Simply by beginning to measure, things will, magically, get better."
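Friedman didn't prescribe any particular tooling, but as a rough sketch of how a definition like that can be turned into something measurable, the Python below scores a few quality dimensions (completeness, validity, consistency) over a handful of customer records. The field names, validation rules, and sample data are illustrative assumptions, not anything specified in the keynote.

```python
# Minimal sketch: scoring a few data quality dimensions over customer records.
# Field names ("email", "country") and the rules below are illustrative assumptions.
import re

RECORDS = [
    {"email": "ann@example.com", "country": "US"},
    {"email": "bad-email", "country": "usa"},
    {"email": None, "country": "US"},
]

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
VALID_COUNTRIES = {"US", "CA", "GB"}

def completeness(records, field):
    """Share of records where the field is present and non-empty."""
    return sum(1 for r in records if r.get(field)) / len(records)

def validity(records, field, is_valid):
    """Share of present values that pass the validation rule."""
    present = [r[field] for r in records if r.get(field)]
    return sum(1 for v in present if is_valid(v)) / len(present) if present else 0.0

scores = {
    "email_completeness": completeness(RECORDS, "email"),
    "email_validity": validity(RECORDS, "email", lambda v: bool(EMAIL_RE.match(v))),
    "country_consistency": validity(RECORDS, "country", lambda v: v in VALID_COUNTRIES),
}

for dimension, score in scores.items():
    print(f"{dimension}: {score:.0%}")
```

Even a crude set of checks like this gives an organization the baseline numbers Friedman says are needed before "things will, magically, get better."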
What to measure and where to start are the questions. Friedman suggested developing a rating scheme: it's a good way to report to the business side, alerting C-level executives that the organization's data quality sits at a C, or at 70 percent, so they know where the company stands. They also need to know the value of that quality gap, to help them figure out how much the business is losing. Friedman asked, "Are you losing money? Are you losing customers? This is the process by which you begin to build [and put] the data quality process in place."
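As an illustration of how such a rating scheme might be put together, the sketch below rolls individual dimension scores into an overall percentage, maps that to a letter grade for the business side, and attaches a rough dollar figure to the quality gap. The grade bands, record count, and cost-per-bad-record figure are assumptions made up for the example, not numbers from the keynote.

```python
# Minimal sketch of a data quality rating scheme: overall score -> letter grade,
# plus a rough cost estimate for the quality gap. All constants are assumptions.

def overall_score(dimension_scores):
    """Unweighted average of dimension scores (0.0 - 1.0)."""
    return sum(dimension_scores.values()) / len(dimension_scores)

def letter_grade(score):
    """Map a 0-1 score to a letter grade; the bands are illustrative."""
    for grade, floor in [("A", 0.90), ("B", 0.80), ("C", 0.70), ("D", 0.60)]:
        if score >= floor:
            return grade
    return "F"

def cost_of_gap(score, record_count, cost_per_bad_record):
    """Very rough estimate: records likely affected times an assumed unit cost."""
    return (1.0 - score) * record_count * cost_per_bad_record

scores = {"completeness": 0.67, "validity": 0.50, "consistency": 0.67}
score = overall_score(scores)
print(f"Overall: {score:.0%} (grade {letter_grade(score)})")
print(f"Estimated cost of the gap: ${cost_of_gap(score, 100_000, 5):,.0f}")
```

The point of the exercise is the conversation it enables: a grade tells executives where the company stands, and the cost estimate tells them what the gap is worth fixing.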
One of the most common mistakes businesses make is the one-and-done approach to data quality, according to Friedman. Not only is it something that should be monitored all the time, but it should also be monitored in all places. It is extremely important to have a data quality "firewall" to keep out data that might damage internal information (this can be done largely through technology); a sketch of the idea follows below. Also, data that is good enough at one stage of the process may not be sufficient farther down the road. "Data flows and what might be good enough for someone upstream may not be good enough for people downstream. Embed data quality controls in a very pervasive way," Friedman said. He also suggested putting data stewards in charge of key chunks of information, empowering them to make changes, and perhaps compensating them for their successes.
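To make the "firewall" idea concrete, here is a minimal sketch of an inbound check that quarantines suspect records before they reach internal systems. The specific rules, field names, and quarantine handling are assumptions for illustration; in practice, controls like these would be embedded at multiple points in the data flow, as Friedman suggests, with data stewards reviewing what gets held back.

```python
# Minimal sketch of a data quality "firewall": incoming records are checked
# before loading, and failures are quarantined for a data steward to review.
# The rules and field names are illustrative assumptions.
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def check_record(record):
    """Return a list of rule violations; an empty list means the record passes."""
    problems = []
    if not record.get("customer_id"):
        problems.append("missing customer_id")
    email = record.get("email")
    if email and not EMAIL_RE.match(email):
        problems.append("malformed email")
    return problems

def firewall(incoming):
    """Split incoming records into ones safe to load and ones to quarantine."""
    accepted, quarantined = [], []
    for record in incoming:
        problems = check_record(record)
        (quarantined if problems else accepted).append((record, problems))
    return accepted, quarantined

incoming = [
    {"customer_id": "C-1", "email": "ann@example.com"},
    {"customer_id": "", "email": "bob@example.com"},
    {"customer_id": "C-3", "email": "not-an-email"},
]

accepted, quarantined = firewall(incoming)
print(f"loaded {len(accepted)} record(s), quarantined {len(quarantined)}")
for record, problems in quarantined:
    print(record, "->", ", ".join(problems))
```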
"I fear many organizations are going to fall into the water and get eaten by the sharks if they're not taking time to understand how important data quality is."
Related articles:
What Data Quality Means to CRM Practitioners
Data Stewards Define Data Quality--Who's in Charge Here?