Sir Francis Bacon published Novum Organum in 1620 and with it staked his claim as the father of modern science. The work kicked off a centuries-long quest to gain knowledge through controlled experimentation. Bacon's concept of the scientific method encompasses the idea that truths should be proved by the testing of a single variable against a control; analyzing the results of such experiments paves paths to understanding the physical world. So, what do a 17th-century philosopher and 21st-century marketers have in common? Bacon's methodology, developed nearly 400 years ago, is remarkably similar to that of A/B testing, the Web optimization process used by many of today's online marketers.
Recently, however, A/B testing has come up against some criticism and some competition in the form of multivariable (or multivariate) testing. Both methods can help marketers develop better Web sites and improve customer click-through rates. A/B testing does so through an apples-and-oranges strategy of testing two different Web pages (or offerings) and finding how they perform against each other. For example, a marketer could show half of its customers a landing page with a teddy bear on it and show the other half a picture of a unicorn, and could uncover which image caused more people to stay on the site and make purchases. Multivariate testing uses more complicated algorithms to test many variations of one page. For example, a multivariate test could compare the picture of the teddy bear and unicorn while also testing a number of background colors, text sizes, and content arrangements, testing every combination these design factors can create. The marketer may therefore find that the unicorn, placed at the top of the page on a blue background with a large font, was the best performer.
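The mechanics of the split described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration (all variant names are invented): each visitor is hashed deterministically into a bucket, and the multivariate case enumerates every combination of the factors under test.

```python
import hashlib
from itertools import product

def assign_variant(visitor_id: str, variants: list) -> str:
    """Deterministically map a visitor to one variant (an even split)."""
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# A/B test: two discrete pages.
ab_variants = ["teddy_bear_page", "unicorn_page"]

# Multivariate test: every combination of several factors.
images = ["teddy_bear", "unicorn"]
backgrounds = ["blue", "red"]
font_sizes = ["large", "small"]
mv_variants = ["-".join(combo) for combo in product(images, backgrounds, font_sizes)]

print(len(mv_variants))  # 2 x 2 x 2 = 8 combinations to test
print(assign_variant("visitor-42", ab_variants))
```

Hashing on the visitor ID, rather than picking randomly on each request, ensures a returning visitor always sees the same variant for the duration of the test.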
Because of A/B testing's simplicity, those marketers in the multivariate camp have begun to see the apples-and-oranges method as a testing dinosaur. Why would you test two factors when you can test 200? Why would you compare features of your Web site when the whole thing can be optimized? And how could Bacon's bits be reasonably applied to online marketing in the first place?
Even in the face of this flak, however, A/B testing methods have endured. Although both analytical approaches have advocates with strong opinions, each has its place in marketplace analytics. A/B testing should not be regarded as multivariate testing's cruder precursor, nor should multivariate testing be understood as A/B's overhyped, untrustworthy counterpart. Instead, marketers must learn to see both methods as useful instruments, each with its own advantages and drawbacks, to be used in separate scenarios to reach different kinds of goals.
Why Testing Is Popular
Gauging the success of campaigns is not new behavior for marketers, but with the rise of Web marketing and increasing pressures for marketing initiatives to deliver clear ROI, testing has risen from a hidden back-office activity to the status of a buzzword. Emily Riley, an analyst at Jupiter Research, says, "When people buy things in stores it's hard for advertisers to really decide what the outcome of an ad needs to be. Many [marketers] are used to television, where all you're looking for is reach and frequency, and where there really isn't any response at all." Online marketing, on the other hand, is almost completely measurable. Unlike with TV spots, newsletters, or billboards, marketers can track consumers' interactions with the marketing campaign: how they look at it, how their interest travels across the screen, what particular pieces they respond to, and how many make purchases in response to the campaign.
The ability to measure online responses has heightened the need for marketers to demonstrate performance with numbers, rather than with flashy ad campaigns. Because the tools Web marketers use are inherently measurable, in a post-Enron world marketers must have the means to ensure that their efforts will deliver not only in creativity but also in dollars. Testing provides a way for marketers to optimize the look, feel, and content of online offerings and existing campaigns, and to try new ideas on a very small scale so that losses will be minimal should the attempts fail. Additionally, testing affords marketers a degree of freedom from choice: Instead of being blamed for poorly performing campaigns due to bad creative choices, marketers can let their audience vote through clicks, thereby making the decisions for them.
Phil Fernandez, CEO of Marketo, an on-demand B2B lead generation firm, says, "If you put 10 marketers in a room and show them some creative material, you'll get 15 answers about what is best. I mean that as a joke, but it's really quite serious." Despite deep experience and good instincts, marketers often make the wrong decisions. If Fernandez's hypothetical dilemma were settled by testing, all of the marketers could agree to go along with the top-performing creative according to consumer feedback.
Easy Does It
For Web-related marketing, A/B testing compares discrete variables, such as two separate Web-page designs. Although its name might be misleading, an A/B test can actually be an A/B/C test, or even an A/B/C/D test, experimenting with a handful of distinctly different things measured against each other. Marketers have used this method in the past, for example, by mailing two different offers or putting two different logos on product labels. However, the Internet has made these types of tests much more practical.
A/B testing may in fact be a better long-term fit for companies whose budget and technology cannot support a multivariate approach, or for those whose sites do not generate enough conversions to test hundreds of different permutations. Fernandez is an advocate of A/B testing with the understanding that sometimes the basics are more important than the details. "You have to understand that you don't need to boil the ocean to cook the fish. We believe that there's already so much complexity to all of the testable pieces [on a Web site] that there really needs to be a very, very simple method for the marketer." He explains that for Marketo's clients, which are strictly B2B, multivariate testing is not as effective because of those sites' lower conversion rates. Unlike retail companies that sell many products at lower price points, B2Bs generally make far fewer sales, but their price tags are much higher. Therefore, if a company's goal is to convert more consumers into buyers and the firm is trying to figure out which version of its Web page best makes that happen, the company will have many fewer positive data points if it is a B2B.
The same holds true for many smaller companies. Logoworks, a provider of consumer graphic design services, sells services in the hundreds-of-dollars range; the company sees many fewer conversions than Amazon.com or Verizon would. Logoworks uses A/B testing to improve marketing performance online. Craig Scribner, Logoworks's marketing analytics manager, explains that this makes sense because the traffic level cannot support a test with many different variables. "The more variations you run at a single time, the longer you need to run it to get statistical significance. You don't want to act on an anomaly. You don't want to make decisions on a test that is well under the radar." Another plus of A/B testing is that a company can test entirely separate offerings, rather than similar details. Scribner cites a second reason for Logoworks's A/B testing approach: "We want to test ideas." (See the sidebar "An A/B Advocate" for more on Logoworks.)
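Scribner's point about statistical significance can be made concrete. The sample needed per variant grows sharply as the baseline conversion rate and the detectable lift shrink, which is why low-traffic sites favor fewer variants. The sketch below uses a standard two-proportion normal approximation (roughly 95 percent confidence, 80 percent power); it is an illustrative back-of-the-envelope calculation, not any vendor's formula.

```python
import math

def sample_size_per_variant(base_rate: float, lift: float,
                            z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Approximate visitors needed per variant to detect a relative lift
    in conversion rate (two-proportion z-test, normal approximation)."""
    p1 = base_rate
    p2 = base_rate * (1 + lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# A 2 percent baseline rate with a 10 percent relative lift demands tens of
# thousands of visitors per variant; multiply that by hundreds of variants
# and a low-traffic site simply cannot finish the test.
print(sample_size_per_variant(0.02, 0.10))
```

Running the same calculation with a higher baseline rate or a larger lift shows why high-volume retailers can afford far more ambitious tests than a low-conversion B2B site.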
Multivariate testing allows marketers to explore the best ways to express these ideas. The approach is often a better fit for companies with high-traffic Web sites looking to optimize these sites and marketing material at an advanced level. Jupiter's Riley says that multivariable testing "is better for a much more complex advertiser that has many different creatives and potentially is looking for very wide scale media buying, so it may only find that different versions work in different places." Multivariate testing can be a more difficult and expensive process; however, Riley says that it is becoming more automated and therefore easier for companies to manage. Although the multivariate approach may not be the best fit for every marketer, it does have notable benefits.
"A wise man will make more opportunities than he finds," Sir Francis said. Multivariate testing helps marketers to do much more specific testing than the A/B style allows. Consider a marketer testing two completely different Web pages, page A and page B. The marketer finds that page B resulted in a 6 percent higher conversion rate than page A. It is possible that page A had some factors that worked 3 percent better than page B's, but that these advantages were masked by other high performing factors on page B. With the high performers of page A and page B combined, the conversion rate could have been 9 percent higher, rather than 6 percent. A marketer could isolate these different performers with A/B testing, but would have to test these factors two at a time. Multivariate testing enables the marketer to simultaneously test them.
When Stamps.com, an online stamp design services company, made the change from A/B to multivariate testing, one of the greatest improvements the company saw was in time saved. Sebastian Buerba, marketing director at Stamps.com, says, "Switching from A/B to multivariable testing really increases the speed of testing amazingly. What might take you three months with A/B testing would take you two weeks." (See the sidebar "Testin' the Night Away" for more on Stamps.com.)
Multivariate testing lends greater specificity both to what is on the Web page and to who is looking at the page, and when. Companies can, for example, test how various images and offers succeed at various times of the day and night. Additionally, companies can start looking at consumers on a more segmented level by using the technology that powers multivariate testing. Instead of seeing what most customers respond to--the blue background compared to the red background, for example--and then delivering that most popular background to all customers, marketers can try to measure individual preferences and deliver the red background to those who will most likely respond to the red and the blue background to those who will most likely respond to the blue.
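In its simplest form, that kind of per-segment delivery is just a lookup from a measured segment preference, with the overall test winner as the fallback. The segment and variant names below are hypothetical.

```python
# Hypothetical per-segment winners learned from a multivariate test.
segment_winner = {
    "returning_customer": "red_background",
    "new_visitor": "blue_background",
}

def pick_background(segment: str, overall_winner: str = "blue_background") -> str:
    """Serve the segment's measured preference, falling back to the
    overall test winner for unknown or unsegmented traffic."""
    return segment_winner.get(segment, overall_winner)

print(pick_background("returning_customer"))   # red_background
print(pick_background("unsegmented_traffic"))  # blue_background (fallback)
```

Real targeting systems replace the static table with a predictive model, but the principle is the same: the test output becomes a delivery rule rather than a single sitewide winner.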
Beyond the Battle
No matter which method your company chooses, in today's online world it is crucial to test. Fernandez says, "Doing any testing is more important than arguing about what kind of testing." However, companies should not treat testing as an end in itself. It is critical to first establish clear goals and processes before any kind of technology is introduced. Riley says that a lack of clear objectives can be disastrous. "The biggest problem for small advertisers today is knowing where to start before they get themselves in too deep and can't find any good results."
It's also important for marketers to think about testing as one component of the larger effort to improve conversion rates and online business as a whole. Tests and results must be analyzed and acted on for these efforts to be effective. Additionally, companies must not rely on the numbers completely, but continue to analyze business practices once a testing program has commenced to make their marketing smarter. Steps such as behavioral targeting and segmentation can further help a marketer in the quest for conversion. Also, by constantly outlining and refreshing goals, marketers will not only implement best practices, but will identify whether A/B or multivariate testing is most appropriate for their endeavors. Adam Sarner, principal analyst at Gartner, says, "[It depends on] the goal that you're going for. If it's the red shoe or the blue shoe, you'll be able to do it through A/B testing. If the complexity starts getting larger and larger, you use the statistical method to help you along a little bit."
Contact Editorial Assistant Jessica Sebor at firstname.lastname@example.org.
An A/B Advocate
Logoworks, a custom graphic design services company that helps match freelance designers with small businesses, uses Omniture Web Analytics to power A/B testing on the firm's site. Logoworks finds A/B testing (over multivariate) works best for it because a high price point on services means a lower conversion rate, and, therefore, a relatively small amount of click-through data.
Logoworks now tests marketing initiatives, keywords, and different creatives with Omniture. Craig Scribner, marketing analytics manager, remembers a time before optimization. Scribner recalls that before the company used Omniture on all creative changes, an executive there was dissatisfied with pay-per-click conversion rates, and so designed a new landing page. When Scribner asked if the exec wanted to test the page, he was told it couldn't perform worse than the existing page. "Immediately that channel just tanked," Scribner says. "I learned the important lesson of making a data-driven decision."
Logoworks's A/B testing now allows the company to see which channels are performing best, to isolate which channels should receive the most investment, and to measure the long-term effects of campaigns. Although the company has considered multivariate testing, at this point it would be impractical and unmanageable. Scribner says, "Having the data and being able to speak from the data has been really empowering." --J.S.
Testin' the Night Away
Stamps.com, which creates personalized stamps for businesses and consumers, can tell the story of making the change from A/B to multivariate testing. Before implementing Offermatica to power multivariate testing, Stamps.com was running into problems with its in-house A/B testing approach. Speed of testing development, inability to look at consumer segmentation, and heavy reliance on the IT department made testing more difficult than it had to be, according to Sebastian Buerba, marketing director.
After implementing Offermatica, Stamps.com saw a 13 percent increase in conversion on its Web site. Instead of testing one or two alternatives to a landing page, Stamps.com can now test more than 100 permutations to find the most effective combination. With the new multivariate approach, Buerba says, "We're taking more risks and we're more open to testing new ideas. Before we were more conservative." He says that although the site can now support multivariate testing, for simple tests Stamps.com still uses an A/B approach. "It really just depends on how many elements you are thinking of testing," Buerba says. "We are running multiple tests right now at the same time; we are testing all the time." --J.S.