Posted Feb 3, 2003
Well, it's 2003, and it's no secret that return on investment (ROI) is what drives technology spending. If you're in marketing and you have a pulse, you're probably pretty astute about Web site log analysis packages and their results. You probably know that a metric called a "hit" is pretty useless to anyone besides your IT team, and that weekly and monthly stats like "page views" and, to some extent, "number of visits," while interesting, don't equate to anything especially tangible once you really drill down. So in a perfect world, what would be a valuable piece of analysis companies could use to measure their investment in developing and marketing a Web site or application?
It depends on your business. Obviously, if you run an online store, you'd want to know how your marketing campaigns specifically drive sales. For example, which ads caused which people to buy which products? You'd probably want to create targeted lists and campaigns based on that analysis, both to drive sales and to inform purchasing decisions.
In another example, let's say your approach is a softer sell; perhaps you're a software or pharmaceutical company that sponsors a destination content site -- one that preaches the value and merits of your products but doesn't sell them directly, instead gently pointing readers to another site or company that actually closes the sale. You'd probably have specific content groups made up of key topics of interest to those readers. You'd likely send email broadcasts to the subscribers of those sites and perform online surveys with that subscriber base to get a better read on what's important to them and when they're planning to make buying decisions, and to gain a better understanding of the demographics and psychographics of that constituency.
In either scenario, ideally, you'd like to know:

- How many visitors were registered users versus not?
- Of the registered users, what did they read/buy/think?
- Of the registered users, who has been reading certain content on my site? Based on their survey answers or buying habits, are my assumptions about their interests aligned with the content groups?
- Is there corroboration between their actual buying habits and the content they're reading?

The usual troubles

In either of these scenarios, the data you need to analyze is very likely spread across disparate sources. Your Web site traffic is in the Web logs; the customer purchase history is in a CRM or custom solution; and the subscriber email response and survey data is in yet another application. You may even need to go back to a content management system to determine content groups. Reporting is done one source at a time, and the connection between the data sources is nonexistent, putting a tremendous burden on the marketing manager, who must attempt to corroborate the data manually -- often producing little more than a "best guess" at what is really going on. You've probably used tools such as standard Web traffic analysis software, but these tools are decidedly disconnected from the far more interesting and valuable customer or subscriber data.

In addition to the challenges presented by this disparate data collection and analysis, lower-end log analysis packages may not even provide the capability to group data-driven pages into logical content groups, making many business-critical content management systems a major analysis speed bump for marketing managers.
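To make the content-grouping idea concrete, here is a minimal sketch of what a log analyzer (or a script of your own) would need to do: map data-driven URLs onto logical content groups. The patterns and group names below are illustrative assumptions, not part of any particular product.

```python
# Map data-driven URLs onto logical content groups.
# Patterns and group names are hypothetical examples.
import re

CONTENT_GROUPS = [
    (re.compile(r"^/articles/\d+"), "editorial"),
    (re.compile(r"^/products\?cat=drugs"), "pharma-catalog"),
    (re.compile(r"^/case-study"), "case-studies"),
]

def content_group(url):
    """Return the first content group whose pattern matches the URL."""
    for pattern, group in CONTENT_GROUPS:
        if pattern.search(url):
            return group
    return "other"

print(content_group("/articles/8831?ref=email"))   # editorial
print(content_group("/products?cat=drugs&id=7"))   # pharma-catalog
print(content_group("/home"))                      # other
```

Even a short mapping table like this turns thousands of distinct data-driven URLs into a handful of reportable groups.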
Solving the problem
Unless you've spent dot-com-era levels of venture money building incredibly complex and robust software systems for your site or store, you probably haven't built all of your systems to capture and report both the Web site traffic and your customer or subscriber behavior. And even if you have, adding new applications and features over time will be far more complex and expensive, because analytics requirements must be considered in every modification.
The key to making analysis elegant and manageable is to abstract the applications from one another, but give them a common thread with which to connect the customer/subscriber to the Web site logs.
Higher-end business intelligence products provide facilities for this purpose. Typically, custom cookies containing the customer or subscriber ID are planted on the client machines of visitors who are logged in to the Web site or store. The Web server is configured to log this customer or subscriber ID with every hit to the site or application. This "dialogue" between client machine and Web server is key to corroborating the session data with the other systems.
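The mechanics described above can be sketched in a few lines. This is a hedged illustration, not a vendor's implementation: the cookie name `cust_id` and the customer IDs are assumptions. One function builds the Set-Cookie header planted at login; the other recovers the ID from the Cookie field that the Web server has been configured to log with each hit.

```python
# Sketch of the cookie "dialogue": plant a customer-ID cookie at login,
# then pull it back out of the Cookie header logged with each hit.
# The cookie name "cust_id" is a hypothetical choice.
from http.cookies import SimpleCookie

def set_customer_cookie(customer_id):
    """Build the Set-Cookie header value sent after a successful login."""
    cookie = SimpleCookie()
    cookie["cust_id"] = customer_id
    cookie["cust_id"]["path"] = "/"
    return cookie["cust_id"].OutputString()

def customer_id_from_log(cookie_header):
    """Recover the customer ID from the logged Cookie field of a hit."""
    cookie = SimpleCookie()
    cookie.load(cookie_header)
    return cookie["cust_id"].value if "cust_id" in cookie else None

print(set_customer_cookie("C1042"))                        # cust_id=C1042; Path=/
print(customer_id_from_log("cust_id=C1042; session=xyz"))  # C1042
```

On the server side, this usually amounts to one configuration change -- for example, adding the Cookie request header to the access-log format so every logged hit carries the ID.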
Once the cookie capture is developed and deployed, the business intelligence application is typically configured to aggregate the data from each of the sources. Generally speaking, one of the most valuable functions that the software vendor provides is the high-performance "sessionizing" of Web traffic into an aggregation database.
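"Sessionizing" itself is conceptually simple, even if doing it at high volume is the hard part the vendors solve. As a toy illustration (field names and the 30-minute timeout are common conventions, assumed here rather than taken from any product), the idea is to group logged hits by customer ID and split each customer's clickstream wherever a long gap of inactivity occurs:

```python
# Toy sessionizer: group hits by customer ID, then split each customer's
# clickstream into sessions at gaps longer than 30 minutes of inactivity.
from collections import defaultdict

SESSION_TIMEOUT = 30 * 60  # seconds of inactivity that ends a session

def sessionize(hits):
    """hits: iterable of (customer_id, unix_timestamp, url) tuples.
    Returns a list of (customer_id, [(timestamp, url), ...]) sessions."""
    by_customer = defaultdict(list)
    for cust, ts, url in hits:
        by_customer[cust].append((ts, url))
    sessions = []
    for cust, events in by_customer.items():
        events.sort()
        current = []
        for ts, url in events:
            if current and ts - current[-1][0] > SESSION_TIMEOUT:
                sessions.append((cust, current))  # gap too long: close session
                current = []
            current.append((ts, url))
        sessions.append((cust, current))
    return sessions

hits = [
    ("C1042", 0,    "/home"),
    ("C1042", 600,  "/case-study"),
    ("C1042", 7200, "/pricing"),   # 110 minutes after the last hit: new session
    ("C2001", 100,  "/home"),
]
for cust, pages in sessionize(hits):
    print(cust, [url for _, url in pages])
```

A production system does the same thing over millions of hits and writes the result into an aggregation database keyed by customer ID, which is what makes the later joins possible.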
Once Web traffic has been "sessionized" in such a system, it can be linked with the customer or subscriber ID information in your external systems. Depending on the site traffic and the volume of data in the external systems, reporting on "everything" can then be done in one of two ways.
First, if the data is relatively small (say, less than 1GB in aggregate), custom browser-based reports can be written "by hand" using back-end SQL and a standard front-end technology (JSP, .NET, ASP, ColdFusion, etc.). It's quite simple to program reports on traffic to a certain page or group of pages, filtered by people who've answered a survey question a certain way (for example: how many people read a case study on your site and answered that they plan to buy a related product within three months?). With these reports in place, it's a very quick jump to create a list of those subscribers and run highly specialized marketing and sales programs accordingly.
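The case-study example above reduces to a single join once the data sits in one place. Here is a self-contained sketch using sqlite3; the table layout, column names, and sample rows are hypothetical stand-ins for your sessionized traffic and survey systems:

```python
# Hand-rolled report: count registered users who read the case-study page
# AND answered the survey that they plan to buy within three months.
# Schema and data are hypothetical examples.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE page_views (cust_id TEXT, url TEXT);
    CREATE TABLE survey (cust_id TEXT, question TEXT, answer TEXT);
    INSERT INTO page_views VALUES
        ('C1042', '/case-study'), ('C2001', '/home'), ('C3003', '/case-study');
    INSERT INTO survey VALUES
        ('C1042', 'buy_timeframe', 'within 3 months'),
        ('C3003', 'buy_timeframe', 'no plans');
""")
(count,) = db.execute("""
    SELECT COUNT(DISTINCT v.cust_id)
    FROM page_views v
    JOIN survey s ON s.cust_id = v.cust_id
    WHERE v.url = '/case-study'
      AND s.question = 'buy_timeframe'
      AND s.answer = 'within 3 months'
""").fetchone()
print(count)  # 1
```

Swapping COUNT for the cust_id list is all it takes to turn the same query into a targeted mailing list.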
If the volume of data is higher, an OLAP solution can further aggregate the data, and the vendors that provide business intelligence solutions can supply a standard set of Web-based OLAP analysis tools as a start. It is sometimes easier to get more granular results using the first method, if the data set is small enough. If you are dealing with a site that serves up huge amounts of traffic, and you are more interested in trends than in the specific results needed for one-to-one marketing, it may make more sense to go the OLAP route. Of course, every business is different, as are the goals of every project; size is not the sole consideration.
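The essence of the OLAP route is pre-aggregation: instead of querying row-level hits, you roll traffic up along dimensions (here, month by content group) and report against the much smaller cube. A minimal sketch, with made-up dimension values:

```python
# OLAP-style rollup sketch: pre-aggregate views along (month, content group)
# so trend queries hit a small cube instead of row-level traffic.
# The months and group names are illustrative assumptions.
from collections import Counter

views = [
    ("2003-01", "case-studies"), ("2003-01", "case-studies"),
    ("2003-01", "pricing"),      ("2003-02", "case-studies"),
]
cube = Counter(views)  # one cell per (month, content_group) pair
for (month, group), n in sorted(cube.items()):
    print(month, group, n)
```

The trade-off is exactly the one described above: the cube answers "how is case-study readership trending?" instantly, but the individual subscriber IDs needed for one-to-one marketing are aggregated away.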
What can I do now to make sure I can analyze later?
It's important to note that you can start capturing the necessary data in your Web site traffic immediately, in anticipation of connected analysis down the road. You may not have looked at or procured any business intelligence software yet, but you can still be capturing that all-important cookie information in your site traffic today. If you don't, there will be no way to connect the historical site traffic later.
Most important, as is true in any data aggregation and analysis project, it is critical that the party involved in the implementation is extremely business-focused and experienced in the technologies that power the solution. If your internal resources or external partners are doing their jobs correctly, they'll be working with you to define a list of business-focused reports before they recommend a product or start programming. Ultimately, it's critical to have a clear vision of what information is truly of value -- before you start playing with software and burning development dollars.