Direct response marketing’s real power lies in your ability to test it. Pretty obvious when you think about it, but something most people miss or get wrong when they try it.
How odd… people screwing up by dabbling in things they don’t take the time to understand.
Most advertising and marketing is completely untestable, meaning the advertiser or marketer has no idea whether it's actually working. And this is true whether they're using direct response or any other kind of advertising. If you run identical ads in different publications and have no way of tracking the results each of them generates, then you have no idea if the ad is working at all. And even if you notice a bump in business, you have no idea which publication the ad is working in (it could be working in both).
“If you run identical ads in different publications and have no way of tracking the results each of them generates, then you have no idea if the ad is working or not”
This also applies to direct mail and all other forms of direct response marketing, not just advertising.
And of course probably the most powerful test of all is the…
Direct Response Split Test
In short, you get two different ads and send them to a homogeneous audience with a random distribution and see which one gets the best response. Then you do it again, using the best-performing one and a new one; lather, rinse, repeat until you’ve got a stupendous ad that brings home the bacon as predictably as a trip to Tesco.
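The mechanics above can be sketched in code. This is a hypothetical simulation (the list size and response rates are made-up numbers, not real campaign data): randomly assign each prospect to ad A or ad B, count responses, then use a two-proportion z-test to check the winner isn't just noise.

```python
import random
from math import sqrt

random.seed(42)

def run_split_test(prospects, rate_a, rate_b):
    """Simulate a 50/50 random split over a homogeneous list.

    rate_a / rate_b are the (hypothetical) true response rates."""
    results = {"A": [0, 0], "B": [0, 0]}  # [responses, mailed]
    for _ in range(prospects):
        ad = random.choice("AB")  # random distribution, per the recipe above
        results[ad][1] += 1
        rate = rate_a if ad == "A" else rate_b
        if random.random() < rate:
            results[ad][0] += 1
    return results

def z_score(results):
    """Two-proportion z-test: is the gap bigger than chance would produce?"""
    ra, na = results["A"]
    rb, nb = results["B"]
    pa, pb = ra / na, rb / nb
    pooled = (ra + rb) / (na + nb)
    se = sqrt(pooled * (1 - pooled) * (1 / na + 1 / nb))
    return (pa - pb) / se

results = run_split_test(10_000, rate_a=0.02, rate_b=0.03)
z = z_score(results)
# Rule of thumb: |z| > 1.96 means the gap is significant at roughly the
# 95% level; anything less and the "winner" may simply be luck.
print(results, round(z, 2))
```

Then "lather, rinse, repeat": keep the winner, write a new challenger, and run it again.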
This is sometimes called an ‘A/B test’ and it used to be easy to do even in printed publications. Few publications seem to offer it now, but you can get around that by using free-standing inserts.
But all is not lost, because it’s dead easy to do using an autoresponder service like Aweber, which has split testing built right into the dashboard. Dead simple, really.
But there are some pitfalls: if your test is badly designed, the results won’t tell you anything. For instance, putting one ad in the paper this week and a different one in the next isn’t a true test, because a lot can happen in a week – the weather changes, events happen to suppress or enhance response (like national budgets, bad news, etc.) – and so your conclusions can be skewed.
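That week-to-week confound can be shown with a toy simulation (again, the rates and the "bad week" factor are hypothetical numbers chosen for illustration): ad A is genuinely the better ad, but it runs in a week where bad news suppresses everyone's willingness to respond, so the sequential "test" crowns the wrong winner.

```python
import random

random.seed(1)

def response_count(readers, true_rate, week_factor):
    """Responses from a run of readers; week_factor models external
    events (news, weather) that suppress or boost that week's response."""
    rate = true_rate * week_factor
    return sum(random.random() < rate for _ in range(readers))

# Ad A genuinely pulls better (3% vs 2%), but it runs in a bad week
# where external events halve response across the board.
a_responses = response_count(20_000, 0.03, week_factor=0.5)  # ad A, bad week
b_responses = response_count(20_000, 0.02, week_factor=1.0)  # ad B, normal week

# A's true 3% is dragged down to ~1.5%, so B looks like the winner
# even though A is the better ad. A simultaneous random split would
# have exposed both ads to the same week, cancelling the confound.
print(a_responses, b_responses)
```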
But a really cool thing, when you get it right, of course, is…
Direct Response Testing Can Reveal “Gurus’” Truths Which Ain’t
There are lots of ‘everyone knows’ facts in marketing, yet few of them are really facts at all. Or, to be more accurate, most of them are true most of the time in most circumstances. But since many of the gurus are really just parrots, they’ve never tested any of this for themselves so don’t realise they’re talking bollocks.
- Everyone knows a squeeze page should be short and sweet, with the opt-in box ‘above the fold’ (meaning, visible without the visitor having to scroll down to see it). But my full-page print ad, when converted to a squeeze page with the opt-in several screens down, outpulled the traditional format by 12.5%. Will it always work this way? Probably not. But you’ll never know unless you test it both ways every time you do it.
- Everyone knows an opt-in page pulls better when you have a graphic showing an ebook cover or audio as a book or CD, so it looks like a ‘real’ thing they’re getting and not a download. This is true… except when it isn’t and the graphics actually reduce response. Will it always work this way? Probably not. But you’ll never know unless you test it both ways every time you do it.
- Everyone knows audio and video pull better than plain copy on opt-in pages because it’s more like being there in person. This is also true… except when it’s not. We tried it on Sarah’s blog and response didn’t just fall… it bombed. Will it always work this way? Probably not. But you’ll never know unless you test it both ways every time you do it.
The fact is, direct response is full of surprises and, despite what the gurus tell you, you won’t really know what works best until you actually test it.