What if there were simple changes to your website that could increase online revenue dramatically? You would make those changes, right? Well, a site is never complete, and there is always room for improvement. All you need to do is follow a process that helps determine which changes will improve your site’s performance. By implementing A/B testing, marketers can experiment with various changes to a site and measure the impact of each change.
A/B Testing Success Story
One of Blue Magnet’s clients, a hotel in Florida, was experiencing a period of low occupancy. As their Internet marketing manager, I was tasked with the challenge of increasing reservations during the hotel’s need period. To begin, I took a step back and examined the hotel’s website as a consumer, rather than a marketer. Working with a website day in and day out can sometimes make marketers overlook issues obvious to visitors. Since the hotel’s challenge dealt with online reservations, I decided that a closer look at the hotel website’s booking widget was probably a valid starting point.
The original reservation widget contained a white and green call-to-action button reading “Check Availability.” Since the entire site’s color scheme is white and green, it seemed like the button was getting lost in the background. As a visitor to this website, I would expect the most important button on the site to command much more attention. As a marketer for this site, I wanted to see if a different color button might prove more compelling.
After some initial research, I chose red for the alternate version of the button. From what I gathered and inferred, red is a highly visible color. That’s why stop signs, fire trucks, and other things that need to be noticed quickly are painted that color. Contrasted with the green background, a red “Check Availability” button seemed like it would pop off the page, catch the eye of visitors quickly, and draw more clicks, but I needed to be sure before I made such a significant change to the site. Enter A/B testing.
Original White Button (A)
Red Button Variation (B)
I set up an A/B test to show the original button to some users and the alternate red button to others, and let me tell you, the results were exciting. After just a few weeks, the new red button was receiving 13% more clicks than the white button. After reaching the 95% confidence threshold, I switched the button over to the red version permanently and monitored the next month’s performance. Over the next 30 days, the total number of visitors clicking the button to check rates for this hotel improved 38% year over year (YOY), which translated into a 65% increase in booked revenue YOY.
Percentage of visitors that checked rate on variation A vs. variation B during the experiment.
This simple color change contributed to thousands of dollars of revenue for the hotel. If you want to make some quality changes to your own website, read on, and I’ll teach you how to implement A/B tests to improve your site’s performance.
What is A/B Testing?
A/B testing is a randomized experiment that takes two (or more) variants of a web page (A and B), presents them both to different members of the audience, and then tracks the differences in performance.
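The core mechanic can be sketched in a few lines of code. This is an illustrative sketch, not any particular tool's implementation: the hash-based bucketing scheme, the visitor IDs, and the 50/50 split are all assumptions chosen for the example.

```python
import hashlib

def assign_variant(visitor_id: str, split: float = 0.5) -> str:
    """Hash the visitor id into [0, 1) so the same visitor always
    sees the same variant across sessions."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0x100000000
    return "A" if bucket < split else "B"

# Simulate 1,000 visitors and tally how many land on each variant.
counts = {"A": 0, "B": 0}
for i in range(1000):
    counts[assign_variant(f"visitor-{i}")] += 1

print(counts)  # roughly even split between A and B
```

Hashing (rather than a coin flip on every page load) matters because a returning visitor should keep seeing the same variant; otherwise the experience is inconsistent and the data gets muddied.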
How to Implement A/B Testing
Before you decide what feature to change on a specific page, you need to determine how you want the site to improve. Think about the goal of that particular page. What purpose does it serve for your website as a whole, and what metrics indicate how the page is performing? For example, if you are trying to generate more revenue from a specific special offer, you may look at how many clicks that particular offer has compared to other offers in the same time period. If the offer has fewer clicks than you’d expect, start thinking about different page elements that could be affecting its results. Is the copy compelling? Can users clearly see where to click? Is the page layout confusing or cluttered, making it difficult to find the offer? Once you have determined what aspect of the page you want to improve, follow this simple five-step A/B testing process to produce a higher-converting site:
- Make an educated change to the page
- Set up an A/B test
- Track how the change impacts visitor behavior
- Implement the improved version
- Repeat the process
1. Making an Educated Change to the Page
Once you determine what the goal of your page is, decide what the “B” in the A/B test will entail. This decision requires some thought. Don’t make a rash judgment, but don’t let this step bog you down either. Spend some time considering what changes on the page will fuel changes in site performance. Try to view your website from a visitor’s perspective. How would this page look to you if you landed on it from a specific search query? Is the information you’re searching for easy to find? If you were directed to this particular page from another page in the site, what would you expect to see? As a visitor, is there an action that you can take to accomplish your goals (e.g. contact for more information, click a button to check out, etc.)? If this exercise does not help you discover an element to change, it may be useful to get a fresh set of eyes on the page and hear from an outside perspective. Ask a friend, family member, or some of your top customers for their valuable feedback.
As a fellow marketer, I know your time is valuable. I wouldn’t want you to read this article and then set up A/B tests that make little to no improvement to your site’s performance. Do your initial research and use your best judgment to determine your “B.” Whatever you decide to change, it doesn’t always have to be huge, but it should be purposeful.
2. Setting Up an A/B Test
Google Analytics makes setting up A/B tests simple. First, you need to create a duplicate page to test against the original. This duplicate page should contain your variation(s). While your duplicate page is still being tested, I would recommend that you set it up with a meta noindex, nofollow tag so it doesn’t appear in search results.
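The noindex, nofollow recommendation is a standard robots meta tag placed in the duplicate page’s `<head>` (exactly where and how you add it depends on your CMS and templates):

```html
<!-- Inside the <head> of the duplicate test page -->
<meta name="robots" content="noindex, nofollow">
```

This tells search engines not to index the test page or follow its links, so visitors only reach it through the experiment.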
Once you have the duplicate page ready:
- Go to the Behavior tab in your Google Analytics account
- Select “Experiments”
- Click “Create New Experiment”
- Follow along with the questions
The objective you select to measure for the experiment should be a metric that indicates the page’s strength. Google Analytics allows you to pick from your present goals, some site usage statistics such as bounces, pageviews, and session duration, or you can create a new goal for the experiment (e.g. contact form completions, check availability clicks, wedding form RFP submissions, etc.).
There are a few other options to determine how your experiment will be run and measured. You can choose the percentage of traffic you want to participate in the experiment, and you can also choose how the two pages will be distributed to your site’s traffic. Google’s default will show the page that is performing better more often, but if you wish, you can choose to show the two pages evenly. Finally, select the confidence threshold you want to reach before the experiment is stopped. The higher the threshold, the more confident you can be that the changes you are making will produce improved results.
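Picking a confidence threshold also implies something about how long the experiment must run: small differences need a lot of traffic to detect. A rough back-of-the-envelope sketch is below; the 5% baseline rate, the 1% minimum detectable effect, and the z constants (1.96 for 95% confidence, 0.84 for 80% power) are illustrative assumptions, not values Google prescribes.

```python
import math

def sessions_per_variant(baseline_rate: float,
                         min_detectable_effect: float,
                         z_alpha: float = 1.96,   # two-sided 95% confidence
                         z_beta: float = 0.84) -> int:  # 80% power
    """Approximate sessions needed per variant to detect an absolute
    change of `min_detectable_effect` in the conversion rate."""
    p = baseline_rate
    delta = min_detectable_effect
    return math.ceil(2 * (z_alpha + z_beta) ** 2 * p * (1 - p) / delta ** 2)

# e.g. a 5% baseline conversion rate, hoping to detect an absolute 1% lift
n = sessions_per_variant(0.05, 0.01)
print(n)  # thousands of sessions per variant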
Next, paste the URL of the original page and the URL of the test page into the provided boxes. I suggest naming them something that reflects the variation of that page, such as White Button vs. Red Button.
Finally, Google will provide you with a snippet of code that you need to paste into the header of your original page. This code will redirect certain visitors to page B, your test page. You may need to have some technical knowledge or work with your developer to implement these changes into your Content Management System, such as Joomla or WordPress.
Once you have set up your A/B test, you can sit back and analyze the visitor behavior!
3. Tracking How the Site Change Impacts Visitor Behavior
Google does a great job of clearly laying out the experiment statistics. The data will include the number of sessions (visits) for each page version, the number of conversions, the conversion rate, the difference in conversion rate compared to the original page, and the probability of outperforming the original page (that’s your confidence threshold). It’s fun and useful to monitor this fairly frequently. If your test page is not producing improved results, you may want to end the experiment earlier than you had intended and try implementing a different change instead of holding out hope for months on end. If you changed the color of a “Check Availability” button and didn’t see an increase in the number of clicks within two weeks, maybe you want to keep the original button color and try different verbiage, such as “Book Now.”
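Google computes all of these statistics for you, but it helps to know what they rest on. A common building block behind the “probability of outperforming” figure is a two-proportion z-test; the sketch below uses hypothetical session and conversion counts (not the hotel’s actual data) and a normal approximation.

```python
import math

# Hypothetical experiment data: sessions and conversions per version.
sessions = {"A": 4000, "B": 4000}
conversions = {"A": 200, "B": 260}

rate_a = conversions["A"] / sessions["A"]   # conversion rate of A
rate_b = conversions["B"] / sessions["B"]   # conversion rate of B
lift = (rate_b - rate_a) / rate_a           # relative improvement of B

# Two-proportion z-test: is B's conversion rate significantly higher?
pooled = (conversions["A"] + conversions["B"]) / (sessions["A"] + sessions["B"])
se = math.sqrt(pooled * (1 - pooled) * (1 / sessions["A"] + 1 / sessions["B"]))
z = (rate_b - rate_a) / se

# One-sided probability that B truly outperforms A (normal approximation).
prob_b_better = 0.5 * (1 + math.erf(z / math.sqrt(2)))

print(f"lift: {lift:.0%}, z: {z:.2f}, P(B better): {prob_b_better:.1%}")
```

With these made-up numbers the test clears the 95% threshold comfortably; with smaller samples the same observed lift might not, which is exactly why the dashboard reports a probability rather than declaring a winner immediately.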
4. Implementing the Improved Version
This is a no-brainer. Once the A/B experiment is finished, pick the version that performed better (depending on what your goals are) and make that the default page. Voila! You have just made your site better and will soon have the measurable results to prove it!
5. Repeating the Process
It is awesome that your site is performing better, but the job of an online marketer is never done. Now that your page is producing improved results, set up another test to beat the new version, or take what you have learned from this experiment and apply it to a similar A/B test for another page. Winston Churchill said it best: “To improve is to change; to be perfect is to change often.”
Have you tried A/B testing on your website? Tweet us at @Blue_Magnet and tell us about your experiments.