LAS VEGAS — As part of its 2008 Ideas conference here Tuesday, data quality solutions provider DataFlux unveiled a new data management platform built on a technology partnership with its parent company, SAS Institute. "Project Unity," explained Tony Fisher, president and chief executive officer of DataFlux, will be, as the name suggests, a single platform to handle all aspects of data management. Just as the mechanical clock revolutionized the world by helping its population get in sync, he told attendees, Project Unity aims to give the enterprise a single frame of reference around data. It's the absence of that synchronization, he said, that plagues nearly every corporation today.
According to DataFlux, Project Unity meshes SAS's industry-tested technology with DataFlux's offerings to address multiple concerns around data management, including:
- data quality;
- data integration;
- master data management;
- data federation;
- business process integration;
- unstructured data support; and
- application integration.
The technology, however, is only one leg of the tripod that supports an enterprisewide data management initiative. In fact, here at DataFlux Ideas, technology seemed to take more of a behind-the-scenes role compared to the people and process sides. Fisher himself emphasized the importance of having a culture that breaks down the walls dividing the technology from the business. Although DataFlux organizers reported that the audience had a 65/35 lean toward technology professionals, they called that a notable shift from previous gatherings: The wholly tech-focused discussions of the past have given way to more "business-user-friendly" solutions. But it's not a question of a pendulum swinging to the opposite extreme, Fisher told the audience. No longer should data management be in the hands of either the information technology staff or the line-of-business staff alone; rather, he said, the only way the methodology can ever succeed is if the two departments work together.
The process component requires companies to maintain corporate focus on managing data according to a "quality culture." Fisher cited one particularly severe example, a company that posts a weekly "Wall of Shame" listing the names of employees who fail to abide by its data management rules. A good process for ensuring data quality, he said, follows this general evolution:
- discovery and exploration of data;
- design and modeling of data;
- enablement and use of data;
- maintenance of data; and
- archiving of obsolete or irrelevant data.
In his Tuesday-morning keynote presentation, "The Data Integration and Data Quality Imperative: Delivering Information Your Organization Can Trust," Ted Friedman, a vice president at Gartner, echoed Fisher's message, accusing companies of becoming too enamored of the joys of what he called "plumbing" -- that is, merely moving data from one place to another. Unfortunately, he said, "there is no business value in efficiently distributing garbage."
What may get lost in the mix, he added, is the understanding that while companies may excel at the first half of data management -- the data itself -- many fall short when it comes to the second half: the actual management of that data. "In an organization, nothing happens until we give the user valuable information at the right place, the right time, and [of] the right quality," he said. "Nothing happens until someone sells something."
While tools and technology are certainly key enablers of data management, it takes a business-oriented mind to maximize those tools' potential. Still, the fundamental competency that businesses need to develop is organizational and managerial discipline, Friedman said. According to Gartner, 70 percent of companies report having no standards around how data integration and data quality are performed across the business organization. Consequently, suborganizations feel free to pursue whatever data efforts they see fit, making it increasingly difficult to maintain consistency and to leverage and reuse data in a way that promotes efficiency. This set of circumstances fosters what Friedman called a "brittle" environment -- one that will make it difficult for a company to compete in today's market, where rapid change and advancement favor the agile. "If you think that it's painful now, it's only going to get worse," he warned the crowd.
Making matters worse, the playing field is about to require a new level of sophistication. Business competency, Friedman predicted, will soon begin to rely on metadata management -- the handling of higher-level, multifaceted information that provides context around existing data, such as location, format, structure, quality, and meaning. That, in turn, is part of the industry's move toward what Friedman identified as semantic technology. "[Semantic technology] is about making meaning more explicit and understandable at the abstract level," he said. Focusing on metadata elevates and expands the concept of data management from a narrow, technology-centric objective to one that encompasses the business-process perspective. For the companies that can successfully make that transition, Friedman promised, the revelations that await will translate into real business value.
News relevant to the customer relationship management industry is posted several times a day on destinationCRM.com, in addition to the news section Insight that appears every month in the pages of CRM magazine.