We are constantly being told that we should split test. Compare A against B and pick whichever is better to proceed with. Then test the winner - either A or B - against C, and so on, constantly refining until we get the best headline, the best ad, the best email that we can possibly achieve.
But what do you do when the split test yields strange results? This was a situation in which I found myself recently. Let me explain.
As many of my readers know, I send out a weekly email newsletter detailing what I consider to have been the top articles on internet marketing and social media that have been published in the previous week. Like any email marketer, I am constantly trying to think of ways to get more of the recipients to open the emails. And I have written before on this blog about my thoughts as to what I should emphasise in the subject line.
Well, a week or two back, I decided to do a split test. I divided my mailing list into three fairly equal sections, based on the first letter of each recipient's first name, and sent out my newsletter with a different heading to each section.
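For anyone curious about the mechanics, a three-way split like this is easy to reproduce. Here's a minimal sketch in Python - the names and the A-H / I-P / Q-Z letter boundaries are my own illustrative assumptions, not the actual mailing-list data:

```python
# Bucket recipients into three roughly equal groups by the first
# letter of their first name. The boundaries (A-H, I-P, Q-Z) are
# assumptions for illustration; pick whatever split balances your list.

def split_by_first_letter(recipients):
    """Return three lists of names, bucketed A-H / I-P / Q-Z."""
    groups = ([], [], [])
    for name in recipients:
        initial = name[0].upper()
        if initial <= 'H':
            groups[0].append(name)
        elif initial <= 'P':
            groups[1].append(name)
        else:
            groups[2].append(name)
    return groups

a_h, i_p, q_z = split_by_first_letter(["Alice", "Mark", "Zoe", "Ivan"])
```

One caveat worth noting: splitting by first initial is convenient, but it isn't truly random - a random shuffle of the list would give cleaner test conditions.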
On the week I did the split test, two of the articles near the top of the newsletter were "102 Things I've Learned About Internet Marketing" by Rebecca Babicz, and "The 12-Month Path to Social Media Success". So I gave the first newsletter the subject line "102 Things You Should Know About Internet Marketing" and the second the subject line "Your 12-Month Plan for Social Media Success". The third newsletter had the subject line "Getting the Best Out of Facebook & Twitter; the Top SEO Apps; & Introducing Pheed", which drew together several of the articles.

My aim was to find out whether a specific subject would entice more people to open the email than a compilation. Now I long ago learned that raw numbers can be misleading, so I found an excellent and easy-to-use little calculator to determine the statistical significance of my results. And the outcome was that, although there were differences in the numbers of people who opened each version, these were not statistically significant. So it seems that a 'one story' subject line and a 'compilation' subject line are equally attractive.
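If you'd rather not rely on an online calculator, the same significance check can be done with a chi-square test of independence on the open counts. Here's a short sketch in plain Python - the open counts and list sizes below are invented for illustration, not my newsletter's real figures:

```python
# Chi-square test of independence: did the three subject lines produce
# genuinely different open rates? The numbers here are hypothetical.

def chi_square_stat(table):
    """Chi-square statistic for a contingency table (list of rows)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: opened / not opened; columns: the three subject-line variants.
opens = [52, 61, 57]          # hypothetical opens per variant
sends = [300, 300, 300]       # hypothetical section sizes
table = [opens, [s - o for s, o in zip(sends, opens)]]

stat = chi_square_stat(table)
# Critical value for df = (2-1)*(3-1) = 2 at the 5% level is 5.991.
print(f"chi-square = {stat:.2f}, significant at 5%: {stat > 5.991}")
```

With differences as small as these, the statistic comes out well below the critical value - which is exactly the "not statistically significant" outcome described above.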
But here's where we get to the strange part. The subject line seemed to have no effect on which articles the recipients clicked on. My expectation had been that those who opened the '102' newsletter, for example, would have clicked on the '102' story. But, in fact, far more didn't than did. And the same was true for the '12 month' story. In all three newsletters the most popular article was one about how to customize your Facebook page, and the numbers who clicked on the '102' and '12 month' stories were much the same across all three.
So this leaves me wondering what it was that drew people to open the newsletters. Clearly the majority of those who opened the '102' newsletter, for example, weren't interested in reading the '102' article, and the same goes for the '12 month' article. Which makes me think that there's a hard core of readers who open the newsletter week after week because they've found it valuable - and who will open it regardless of the subject line - while just a few open it because the subject line has caught their eye.
And, of course, this leaves me no nearer to discovering what sort of subject line is going to get more people to open the newsletter. I'm sure split testing is a great idea - but sometimes it just doesn't give you the answers you want!