Facing the Challenges of Fraudulent Web Traffic
It's no secret that fraudulent traffic plagues the Web. Bots imitate customers online, inflating traffic to Web pages and, as a result, the rates publishers can charge for advertisements. No site is immune; according to the Interactive Advertising Bureau, roughly one-third of online traffic is fake. Marketers are working to address the issue as it pertains to advertising, but fake traffic can also have a major impact on customer service analytics when bots reach online self-service channels.
The Light and Dark Sides of the Force
In addition to creating fake traffic and cheating advertisers, bots can infiltrate self-service channels and attempt to post spam or ask questions. These bots create problems for companies trying to draw insight from customer service analytics. When legitimate customers ask questions through online self-service channels, they create a valuable source of real-time information that can ultimately shape marketing strategy. For example, analyzing data collected from virtual agent technology deployed on a bank's Web site might reveal that a majority of customers are confused about how to perform a particular action, such as transferring money. The bank can then proactively make changes so the experience becomes easier for customers. But if the data from the bank's customer service channels is skewed by bot traffic, it becomes difficult to draw actionable insights, and the bank misses a valuable opportunity to improve service.
Not all bots are bad. Some search the Web to identify new sites and content. Google's crawler, Googlebot, for example, crawls billions of Web pages to discover new pages to add to Google's index. This makes Google's database more comprehensive and improves the accuracy of page rankings, and it can benefit companies by increasing their visibility on the search engine. For this reason, companies often do not want to block Web crawlers from visiting their sites altogether. In fact, a company might even run its own Google Search Appliance to index its site content and help improve its ranking on Google. However, companies still need to be wary of these bots. As they crawl the Web to index content, they may click on individual questions in a company's FAQ section, which can also skew reporting data.
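For companies that want crawlers to keep indexing their sites while steering them away from click-tracked FAQ links, one standard mechanism is a robots.txt file. A minimal sketch follows; the Disallow path is hypothetical, for illustration only, and well-behaved crawlers such as Googlebot honor these directives while spam bots generally ignore them.

```
User-agent: *
Allow: /
# Keep compliant crawlers away from the click-tracking endpoint
# (the path below is hypothetical).
Disallow: /faq/track-click
```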
Bots are a challenge even for major players such as Google, which works continually to combat the issue. But there are strategies companies can use to distinguish a customer's clicks from a bot's. To fight the bad bots, companies can manually block them by IP address. This is an ongoing process, however, as new bots with new IP addresses continue to appear. Because blocking all bots can hurt a site's ranking in search engines, this strategy is typically reserved for spam bots, which attempt to post content on Web sites, rather than Web crawlers, which simply index sites. In addition to blocking individual IP addresses, companies are also starting to discern patterns in bot activity, allowing them to confidently and safely block these attempts.
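The IP-blocking approach described above can be sketched in a few lines. This is a minimal illustration, not a production implementation: the blocklist entries are documentation-range example addresses, and `is_blocked` is a hypothetical helper a request handler might call. Using whole subnets reflects the pattern-based blocking the article mentions.

```python
import ipaddress

# Hypothetical blocklist: individual addresses and whole subnets where
# a pattern of spam-bot activity has been observed. These are
# documentation-range example IPs, not real bot addresses.
BLOCKED_NETWORKS = [
    ipaddress.ip_network("203.0.113.7/32"),   # a single known spam bot
    ipaddress.ip_network("198.51.100.0/24"),  # a subnet showing a spam pattern
]

def is_blocked(client_ip: str) -> bool:
    """Return True if a request from this IP should be rejected outright."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in BLOCKED_NETWORKS)
```

As the article notes, this is an ongoing chore: new bot IPs appear constantly, so the blocklist has to be maintained over time rather than set once.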
In the case of Web crawlers on an FAQ page, companies can use filters that allow bots to crawl the site but categorize them as crawlers so that their clicks do not affect analytics. To accomplish this, organizations can examine their Web site visitor location data, look up the IP addresses asking the greatest number of questions, determine which of those belong to Web crawlers, and exclude them from their data.
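The filtering step above can be sketched as follows. Everything here is illustrative: the event log, the set of confirmed crawler IPs, and the helper names `top_askers` and `exclude_crawlers` are all hypothetical stand-ins for a real analytics pipeline.

```python
from collections import Counter

# Hypothetical event log: (ip, question) pairs captured from a
# self-service channel. IPs are documentation-range examples.
events = [
    ("198.51.100.10", "How do I transfer money?"),
    ("198.51.100.10", "How do I reset my password?"),
    ("198.51.100.10", "What are your hours?"),
    ("203.0.113.5", "How do I transfer money?"),
]

# IPs confirmed to be crawlers, e.g. after reviewing user agents
# or reverse-DNS records for the heaviest askers.
known_crawler_ips = {"198.51.100.10"}

def top_askers(events, n=5):
    """Rank IPs by number of questions asked, to shortlist crawler candidates."""
    return Counter(ip for ip, _ in events).most_common(n)

def exclude_crawlers(events, crawler_ips):
    """Drop crawler traffic before computing customer service analytics."""
    return [(ip, q) for ip, q in events if ip not in crawler_ips]
```

Ranking IPs by question volume surfaces crawler candidates (real customers rarely ask dozens of FAQ questions in one session); once an IP is confirmed as a crawler, excluding it keeps the remaining data representative of actual customers.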
Web crawlers skewing analytics is a problem that has existed for years. However, with advances in technology, companies are now able to capture more data than ever before, and can use this data to gain valuable insight from Web analytics. By taking the proper steps to proactively filter Web crawlers, companies can reliably use data derived from customer service channels to inform future marketing strategy and ultimately improve customer service.
Mike Hennessy is the vice president of marketing and alliances at IntelliResponse.