
Your Business is Only as Clean as Your Data

A version of this article first appeared in Customer Strategy, a magazine published 10 times a year in London by TBC Research. Through its comprehensive portfolio of magazines, events and research, TBC Research is dedicated to helping senior business professionals make more informed technology decisions.

The premise of one-to-one marketing as a management theory has captured the imagination of the business community to a degree not seen since the '80s, when pundits were advocating the benefits of business process re-engineering and downsizing with missionary zeal. The one-to-one theory holds out the alluring promise of real customer loyalty, with seamlessly integrated sales, marketing and service as the means to that end. However, for those who can see beyond the hype, there comes a realisation that fundamental questions about the quality of customer data need to be addressed if one-to-one is to have the impact its advocates claim.

For example, few CRM vendors address the basic question of how an enterprise can use database resources in a manner that is consistent across the enterprise, regardless of which department demands new applications. In day-to-day usage, a marketing database can contain errors that will not hurt the business. Results may be skewed and filtering might need to be applied, but a ten per cent error rate is not earth-shattering. Nor is it a serious issue if the analytics pull in John A Smith and JA Smith at the same address. However, if that same database is used for an Internet application, or if a sales or marketing campaign is linked to service information held at a contact centre, the situation becomes trickier. Where there are errors, someone, somewhere is making decisions based on suspect data or, worse still, wasting resources on futile calls or mailings.
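
To make the duplicate problem concrete, here is a minimal sketch, in Python, of the kind of crude matching that lets John A Smith and JA Smith collapse into one record. Everything in it, the field names, the abbreviation table and the sample records, is invented for illustration and is not any vendor's algorithm; note that the same crude key also sweeps in a hypothetical Joan Smith, which is exactly the kind of skew marketing can tolerate but sales and service cannot.

    # Illustrative only: a crude match key of first initial + surname + normalised
    # address. Not a vendor algorithm; records and abbreviations are made up.
    import re
    from collections import defaultdict

    ABBREV = {"STREET": "ST", "ROAD": "RD", "AVENUE": "AVE"}

    def match_key(name: str, address: str) -> tuple:
        parts = re.sub(r"[^\w\s]", " ", name).upper().split()
        initial, surname = parts[0][0], parts[-1]
        addr_words = re.sub(r"[^\w\s]", " ", address).upper().split()
        return (initial, surname, "".join(ABBREV.get(w, w) for w in addr_words))

    records = [
        {"name": "John A Smith", "address": "12 High Street, Leeds"},
        {"name": "JA Smith",     "address": "12 High St Leeds"},
        {"name": "Joan Smith",   "address": "12 High Street, Leeds"},
    ]

    buckets = defaultdict(list)
    for rec in records:
        buckets[match_key(rec["name"], rec["address"])].append(rec["name"])

    print(dict(buckets))
    # {('J', 'SMITH', '12HIGHSTLEEDS'): ['John A Smith', 'JA Smith', 'Joan Smith']}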

Information is power

Where enterprises have data warehousing initiatives, these might help marketing with data mining or campaign planning. However, they are unlikely to assist sales or service because the data warehouse is designed to provide a repository for analysis of past activity. In today's sophisticated world, where marketers require ever-increasing amounts of peripheral, unstructured data and lifestyle information, the chances of getting that single view of the customer are poor. There is also evidence that, despite enterprises' best efforts, data warehouses are riddled with the kind of inconsistencies that could blow a sales campaign out of the water. For example, there is little point in mailing a customer who is deceased, or offering loan facilities to a person who is being pursued by debt collection agencies. But it happens. And when it does, it tends to blow up in the company's face. Last year a well-known financial services organisation repeatedly sent a credit card to the deceased wife of a man who became so fed up that, at the umpteenth mailing, he went out and used it. The result was a court case, and egg on the face of the company.

This problem strikes at the heart of businesses that plan to move towards automating sales and marketing because it brings into question the validity of information in the underlying databases. If the enterprise believes it has to be customer-centric, then questions of data quality become central to long-term success. It will not be enough to look back at the marketing database and assume data used in those situations can be lifted directly and consolidated into a customer database. Customer management is about memory: what you know about the customer and how that can be used to sell goods and services profitably. The reality is that, when viewed from this perspective, most enterprises show distinct signs of memory loss: they cannot remember a customer interaction from one touch point to another.

Much of the work required to bring disparate databases together is usually the domain of extract, transform and load (ETL) vendors. Transaction-based or legacy systems are rarely cleaned up; the only time this tends to happen is when a customer notifies the enterprise that he or she is being double-billed. Data quality specialists argue, with some justification, that in the CRM sphere this is not satisfactory. The problem is that most data extraction tools are designed to do just that: extract data and put it into a common format that can be readily loaded. Duplication errors will be removed up to a level of around 90 per cent accuracy. But even those databases designed to achieve a single customer view are rarely squeaky clean, and they certainly do not meet the quality demands of sales organisations.
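
As a rough sketch of what an ETL pass does and does not catch, the toy pipeline below pulls records from two imaginary source systems, transforms them into a common format and de-duplicates on exact matches. The system names, schemas and rules are all assumptions made for illustration; the point is that exact-key de-duplication of this kind is what leaves the residual near-duplicates behind the 90 per cent figure.

    # Toy extract/transform/load pass; source systems and field names are invented.
    billing = [{"cust_no": "A1", "cust_name": "SMITH, JOHN", "postcode": "LS1 4AP"}]
    crm     = [{"id": 901, "full_name": "John Smith", "post_code": "ls1 4ap "}]

    def transform(record: dict, source: str) -> dict:
        """Map each source schema onto one common, loadable format."""
        if source == "billing":
            surname, first = record["cust_name"].split(", ")
            return {"name": f"{first.title()} {surname.title()}",
                    "postcode": record["postcode"].upper().strip()}
        return {"name": record["full_name"].title(),
                "postcode": record["post_code"].upper().strip()}

    staged = ([transform(r, "billing") for r in billing] +
              [transform(r, "crm") for r in crm])

    # Exact-key de-duplication: identical records collapse, but "J Smith" or a
    # mistyped postcode would slip through -- the residue behind the 90 per cent.
    loaded = {(r["name"], r["postcode"]): r for r in staged}
    print(list(loaded.values()))   # a single merged record in this toy case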

There are many vendors involved in cleaning data, but few have a demonstrable track record of achieving the level of cleanliness that data quality specialists bring. Some of these specialists also take the single customer view a step further, identifying relationships between customers based on information such as shared surname and address, and extending it to recognise joint bank accounts, e-mail addresses and phone numbers, in order to avoid duplication. The theory is that by building household or network links, enterprises gain access to wealth that is passed down through the generations.
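
A simplified sketch of household linking follows; the customer records and fields are made up, and real products use far richer keys (joint accounts, e-mail addresses, phone numbers) than the surname-and-address pairing shown here.

    # Illustration only: group customers into households on surname + address.
    from collections import defaultdict

    customers = [
        {"id": 1, "surname": "Patel", "address": "4 Mill Lane", "account": "JT-100"},
        {"id": 2, "surname": "Patel", "address": "4 Mill Lane", "account": "JT-100"},
        {"id": 3, "surname": "Shah",  "address": "9 Park Road", "account": "SV-221"},
    ]

    households = defaultdict(list)
    for c in customers:
        households[(c["surname"].upper(), c["address"].upper())].append(c["id"])

    print(dict(households))
    # {('PATEL', '4 MILL LANE'): [1, 2], ('SHAH', '9 PARK ROAD'): [3]}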

With the exception of Harte-Hanks, which owns Trillium, the vendors in this round-up can be categorised as "small". However, each has been around a long time and can show impressive customer references and strong partnerships that carry weight. What hampers them all is that data quality is not a boardroom issue, which means they are usually brought into projects that are tactical rather than strategic to the enterprise. This will change as enterprises accept the need to be more stringent in their customer-facing implementations, putting the specialists in a position to exploit the potential that networked views of the customer offer.

Harte-Hanks is well known in database marketing circles and originally built its business, 70 years ago, on producing shopping catalogues, an activity that still accounts for much of its revenue in the US.

Its long life is reflected in earnings that showed an increase of 17.8 per cent overall, to $35.8 million, a healthy 15 per cent margin on revenues. It has a market capitalisation of around $1.8 billion and plenty of cash. Trillium has been part of the Harte-Hanks portfolio since the Seventies, when the parent company recognised the value of customer data integration. The Trillium division in the UK is restricted to sales and support personnel, but has a 24-hour support and consultancy service operating out of its Boston office.

Trillium solves data quality problems using a variety of techniques wrapped up in a toolbox-style approach. Customers can use the Trillium system in batch or online environments, running the data cleansing operations directly against a swathe of applications. This is achieved through selectable modules that plug directly into applications and/or databases, allowing customers to work with existing systems directly rather than resorting to the costly scrapping and reworking of projects that is typical in this area.

Functions such as data frequency analysis, word counts, custom recodes, domain range checks and data length modifications are just some of the techniques that can be selectively deployed in the data conversion process. One of the critical aspects of large systems is the ability to match customer records correctly. This is usually a problem when the customer appears in different systems within the organisation but with different account numbers and slightly different names. Trillium employs a sophisticated matching algorithm that brings customer records together from many touch points. Although Trillium punts the required e-business story in its latest product version, the reality is that enterprises have yet to integrate Internet-generated data. In the e-business world, security takes on a new urgency, and Trillium has gone to great lengths to ensure that customer validation is given a high priority.
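
The profiling checks listed above can be illustrated in a few lines of Python. The column names, business rules and sample rows below are assumptions made for the example and do not reflect Trillium's actual modules.

    # Assumed columns and rules, for illustration of profiling-style checks.
    from collections import Counter

    rows = [
        {"title": "MR",  "age": 34,  "postcode": "LS1 4AP"},
        {"title": "MR",  "age": 212, "postcode": "LS1 4AP"},   # out-of-range age
        {"title": "MSR", "age": 29,  "postcode": "LS1"},       # bad code, short postcode
    ]

    # Frequency analysis: unexpected codes stand out immediately.
    print(Counter(r["title"] for r in rows))            # Counter({'MR': 2, 'MSR': 1})

    # Domain/range check: flag values outside an agreed business rule.
    print([r for r in rows if not 0 < r["age"] < 120])  # the 212-year-old customer

    # Length check: a full UK postcode runs to 6-8 characters including the space.
    print([r for r in rows if not 6 <= len(r["postcode"]) <= 8])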

Tom Scampion, Harte-Hanks business development manager, says "implementations will take as long as it takes to decide the business rules that will apply." Because customers have little understanding of the issues, and different departments either do not see the relevance or have different needs, their goals are generally not aligned. That might sound like an excuse for poor execution but in reality it isn't. The business processes that govern a marketing campaign may lead to a sales campaign. Ideally, marketing should share its results, including individual customer information, with the sales teams. But what marketing analysis can tolerate and what a sales department needs are different, which is where the question of business rules becomes important. It is conceivable that the marriage of data will trigger the potential for smoothing or creating new business processes that flow across both activities. Companies that really think it through may also need to incorporate different selling channels. Where these include the call centre, Trillium will want to ensure the call centre's service data is as much a part of the marketing, sales and service cycle as any other source. "Customer interaction demands corporate memory from as many touch points as the company can provide," adds Scampion.

With a long history and a proven record of success, one would think Trillium has a clear run at the enterprise. Not so. There is competition and, as Scampion acknowledges, data quality is not exactly on the boardroom agenda, which it needs to be if the company is to gain full leverage. Nor does Harte-Hanks put a lot into marketing. Finally, its choice of Informix as a key partner may be misplaced. Informix is going through difficulties and is no longer the force it once was. Trillium is database-independent, so in theory this is not a problem. But if the company wants to take the subject to the next level it needs to cast its net wider.

Although Innovative Systems remains a private company and chooses not to disclose financial information, it has been in the data quality business for more than 30 years and in the UK since 1989. Innovative has worked with some of the UK's largest organisations.

But Mike Healey, senior vice-president of European operations, is the first to admit that although the company "is loved by its customers, on the whole it has been an under-sell, over-deliver experience." He adds that the company tends to get drawn in on tactical rather than strategic projects. "We work on large projects but we're really more like a SWAT team," he says. Even so, those projects are often critical to the business. At Bradford and Bingley Group, for example, Innovative provided the data quality audit required for de-mutualisation. At the time, the company's records indicated 85 to 90 per cent accuracy. Innovative improved this to 99.5 per cent, and as a result Bradford and Bingley discovered more than 200,000 shareholders who would otherwise have remained lost in a data black hole. Initial estimates suggested it would take approximately 100 man-years to complete the initial checking; Innovative did the job in a matter of weeks.

Such services do not come cheap. Although the company can undertake a project designed to achieve a quick win, where initial costs are around £25,000, a typical deal will be in the £250,000 to £500,000 range. Unusually, Innovative charges for usage rather than on a per-seat basis and looks to secure each deal for a five-year period. The logic is simple: "Most systems are legacy operations within five years. Moving data to new systems has to be a continuous process," says Healey. This means that very little of Innovative's software is shelfware. Despite the apparently high price, the company claims that return on investment is typically 100 per cent in the first six months. The financial services industry has proved a useful hunting ground for the company. Competitive pressures to get closer to customers, and an acceleration in mergers and acquisitions, mean the large financial institutions regularly face the kind of problems Innovative solves.

Its technology comes in three main modules. Core to the offering is a dictionary that allows pattern matching across 2.5 million names and 100,000 words; the dictionary has a number of configuration options that ensure customer data conforms to business rules. A data analyser looks for inconsistencies across the information, effectively carrying out an audit. Completing the data accuracy circle is a data editor, which reviews input at the point of entry, checking that data is in the enterprise's standardised form. This reduces the volume of errors usually found in systems where data is entered manually.
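
A minimal sketch of what point-of-entry editing might look like is shown below. The dictionary, rules and function are invented for illustration and are not Innovative's actual software; the idea is simply that bad input is caught while the operator can still fix it.

    # Hypothetical point-of-entry editor: standardise or reject data as it is keyed in.
    TITLE_DICT = {"MR": "Mr", "MRS": "Mrs", "MS": "Ms", "DR": "Dr"}

    def edit_on_entry(title: str, surname: str) -> dict:
        key = title.strip().upper().rstrip(".")
        if key not in TITLE_DICT:
            raise ValueError(f"Unrecognised title {title!r}; operator must correct it")
        if not surname.strip().isalpha():
            raise ValueError(f"Surname {surname!r} contains unexpected characters")
        return {"title": TITLE_DICT[key], "surname": surname.strip().title()}

    print(edit_on_entry("mr.", "smith"))   # {'title': 'Mr', 'surname': 'Smith'}
    try:
        edit_on_entry("mgr", "Jones")
    except ValueError as err:
        print(err)                         # caught at entry, not months later in a mailing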

Innovative covers the main data quality bases, but the company recognises that it needs to move away from being a technology player. Its partnerships are not as comprehensive as those of its rivals, and Healey says the company is putting a big drive into alliances so the technology can be closely embedded into other solutions, such as call centre and database marketing systems. Like the others, Innovative has yet to find a way of making data quality a boardroom issue. This is always difficult for technology-led companies, but it has to be done if the market is to be adequately exploited.

Founded in 1987 and first coming to Europe in 1998, Vality is another private company that does not disclose its financial position. John Canavari, its vice-president for EMEA, says it has ongoing plans to list on the New York Stock Exchange but declined to provide further information. Canavari says enterprises are beginning to realise the benefits of improving data quality, but it has yet to become an issue where Vality can take a lead.

Unlike the others, Vality prefers to talk about data re-engineering and tends to offer a consulting service as an integral part of the offering rather than simply selling a software solution. This means Vality is not restricted to talking about quality issues around customer data but broadens its scope to include product data. "Customer data is definitely a key focus, but there are many opportunities to manage product data as well," he adds. He points to work at Xerox, where product information was standardised because the proliferation of versions of the same product had become unmanageable.

In the CRM field, Vality likes to talk about understanding all the ways a customer can have a relationship within the context of the enterprise's records. For example, while ensuring there is a single view of the individual customer, Vality relates the individual to other customer records where there may be a relationship, so that the enterprise has a "family" style view. This is a novel approach, providing the opportunity to market to small groups of individuals in a manner that treats them as a single entity but with individual characteristics.
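
One plausible way to build such a family-style view is to link any two records that share an identifying attribute and then read off the connected groups, as in the sketch below. The records, fields and union-find approach are illustrative assumptions, not a description of Vality's Integrity product.

    # Illustrative family grouping: records sharing an address or phone are linked.
    from collections import defaultdict

    people = [
        {"id": "C1", "name": "A Khan", "address": "7 Elm Close", "phone": "0113 111"},
        {"id": "C2", "name": "S Khan", "address": "7 Elm Close", "phone": "0113 222"},
        {"id": "C3", "name": "R Khan", "address": "22 Oak Way",  "phone": "0113 222"},
        {"id": "C4", "name": "T Ward", "address": "9 Fir Grove", "phone": "0113 333"},
    ]

    parent = {p["id"]: p["id"] for p in people}

    def find(x):                      # union-find root with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    shared = defaultdict(list)        # ids that share any identifying attribute
    for p in people:
        shared[("addr", p["address"])].append(p["id"])
        shared[("phone", p["phone"])].append(p["id"])
    for ids in shared.values():
        for other in ids[1:]:
            union(ids[0], other)

    groups = defaultdict(list)
    for p in people:
        groups[find(p["id"])].append(p["name"])
    print(list(groups.values()))
    # [['A Khan', 'S Khan', 'R Khan'], ['T Ward']] -- one family group, one lone customer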

Vality relies heavily on its position as an IBM tier one partner, and in particular its relationship with IBM Global Services. IBM has undergone something of a renaissance in the past three years and has become the lead player in many new CRM and e-business projects. This benefits Vality but means it is usually brought in to solve a specific problem rather than at the start of a project. Vality's offering can be embedded as modules into other applications but this means its Integrity product is invisible as a discrete application.

Canavari invokes the fear factor when he talks about the potential cost of wrong customer information, and there is nothing wrong with this approach as a general principle. As the company points out, the effect of compounding errors when the data is 95 per cent correct is proportionately higher when considered from the 80/20 rule standpoint. Anomalies usually cluster around the largest customers because their accounts carry the largest number of records, so the cost of misunderstanding the relationships that exist for those customers is proportionately much higher than for the remainder.
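
A back-of-the-envelope calculation makes the compounding point: if each record is 95 per cent likely to be correct, the chance that a customer's whole file is clean falls off quickly with the number of records held about them, and it is the biggest customers who hold the most records. The figures below are purely illustrative.

    # Illustrative arithmetic: probability a customer's file has no bad records
    # when each individual record is 95 per cent likely to be correct.
    for n_records in (1, 5, 20, 50):
        p_all_clean = 0.95 ** n_records
        print(f"{n_records:>3} records -> {p_all_clean:.0%} chance the file is error-free")
    # 1 -> 95%, 5 -> 77%, 20 -> 36%, 50 -> 8%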

Vality regards itself as a one-stop data quality shop that will perform cleansing operations regardless of the environment and regardless of the requirement. From a competitive standpoint, this puts Vality in front of many problem areas, suggesting a broad footprint for its Integrity Environment. In the UK it has made an impression in a short time, scoring successes in both financial services and retail. However, its dependence on IBM is a double-edged sword. As long as IBM pops up everywhere, opportunity will exist, but that is only for today. Vality needs to expand its reseller and channel base so that it has more options and is in a better position to directly influence the data quality agenda.
