Direct response advertising is much more directly related to street salesmanship – think of an insurance agent closing new policy customers, a book seller hand selling copies of the latest mystery novel, or even a vacuum salesman going door to door. These efforts only succeed if there’s a measurable outcome – in other words, a sale.
As internet marketing has evolved, direct response has strongly influenced the way marketing online is practiced. This mostly comes down to the fact that online marketers and businesses growing online channels are doing so with a clear goal: to attract new customers, make new sales, or grow their prospect/lead list. When the goal is that clear, it’s easy to know whether or not your efforts were a success.
Every online marketer should approach his or her blog with this kind of ruthless scrutiny.
The problem, of course, is that these things seem hard to measure. You write some copy and throw it up online. You send an autoresponder series out to new subscribers. You change the design of your website and hope that it reduces your bounce rate. But is there a way to actually tell how successful any of those changes are?
Absolutely. This is where A/B testing comes in.
A/B Testing Gives You Valuable Info If Done Right
In A/B testing, you pit an existing design or set of copy (called the control) against a different version. The existing version is the control because you already know how well it performs. For example, you write a sales letter, create a squeeze page, and put it online. It sits there for three months, trying to sell a product. At the end of the trial, 10,000 people saw the page and exactly 4% converted (purchased).
In an A/B testing scenario, you pick one variable and you change it. This could be the headline. It could be the price. It could be the positioning of the video introduction. It could be the bonus you give away.
But whatever variable you choose, you make the change and then you test how it performs. Let’s say you change the headline, and after a period you find out only 3% are converting. Then you clearly should stick with the control.
But perhaps when you change the bonus, you find out 6% are converting. Your new, higher converting letter becomes the control and you begin to test other aspects to see if you can push conversions higher.
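Before you crown a new control, it’s worth checking that the difference isn’t just random noise. The sketch below (a minimal illustration using only Python’s standard library; the function name and the visitor counts are hypothetical, chosen to match the 4% vs. 6% example above) runs a standard two-proportion z-test on the control and the variant:

```python
import math

def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Compare two conversion rates; return (z statistic, two-sided p-value)."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no real difference).
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Control: 400 of 10,000 visitors converted (4%).
# Variant: 600 of 10,000 visitors converted (6%).
z, p = two_proportion_z_test(400, 10_000, 600, 10_000)
print(f"z = {z:.2f}, p = {p:.6f}")
```

With samples this large, a 4% to 6% jump yields a vanishingly small p-value, so you can confidently promote the variant. With only a few hundred visitors per version, the same percentages could easily be chance, which is why sample size matters as much as the conversion rates themselves.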
With A/B testing, you can test a virtually unlimited number of variables in pursuit of that optimally successful sales letter. Let’s take a close look at case studies by some of the top minds in A/B testing today:
Case Study 1: ICoupon Blog
The geniuses over at Visual Website Optimizer were working with a customer, ICoupon Blog. They tested a single variable: would a site that featured a prominent security badge (indicating secure transactions) convert better? It turns out the page without the security badge increased conversions by 400%. Read the full case here.
Case Study 2: ScandanavianOutdoorStore.com
In a simple headline test also orchestrated by the team at Visual Website Optimizer, they tested (in Finnish!) the difference in conversions between “Men’s Clothing” and “Buy Men’s Clothing at Bargain Prices.” Conversions increased by 127%.
Case Study 3: Dustin Curtis
In a well-known test case, Dustin Curtis wanted to see which headline would get people to follow him on Twitter. It turns out “Follow me on Twitter” was more effective at 7% than “I’m on Twitter” at just 4%.
So if you’re ready to start improving your conversion rates, maybe it’s time to take a look at how to integrate A/B testing more effectively into your own sales, marketing, and design process.