Hello Indiana,

A topic many of you have asked me to write about is AB testing and how to get started. I have avoided it for a while because I covered it back in 2018, and that article is generally as relevant today as it was back then. In fact, I would encourage you to read it if this topic interests you.
Try listening to these emails. Perhaps you will prefer it. Consider it an AB test.
That said, I have finally decided to give in and cover it here for two reasons.
First, Google Optimize was the obvious starting point for anyone looking to get into AB testing in 2018 because it was free and easy to use. Unfortunately, Google has since retired the platform, which makes it much harder to get started in the field.
Second, the article I wrote previously didn't suggest a workflow to use when AB testing, and that is something I would like to explore more today.
If Not Google Optimize, Then What?
Let's start with the elephant in the room - with Google Optimize now out of the picture, what should you use? That is not an easy question to answer. There is no shortage of contenders, with my personal favorites being VWO, Convert, and Optimizely.
However, these platforms are expensive, with prices running into many hundreds of dollars per month. That is a big commitment if you haven't done much AB testing before. It can also be tough to work out which of the many options offers the best value for money.
You could start using VWO's free tier to try AB testing and make the case for paying for the service. However, you will be limited to 50,000 tracked users a month. More than that, and you will have to pay.
Alternatively, you could use Crazy Egg, which offers limited AB testing facilities and several other valuable services for a much more reasonable price than other platforms. However, you will encounter limitations such as only being able to run one test on a page at a time.
Whatever tool you choose, setting up and running tests is fairly straightforward. You create variations that you want to test against your control version (the version currently on the website) using a WYSIWYG editor. Then, you define a successful conversion by creating a goal. This might be something like clicking a button or filling in a form. It's all fairly intuitive.
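Whatever the tool, it can help to see roughly what is happening under the hood. Below is a minimal, tool-agnostic sketch in TypeScript: it buckets each visitor into a variant and records the goal when they click a call-to-action button. Everything in it, including the variant names, the cta-control/cta-variant-a element IDs, and the logConversion helper, is hypothetical and only for illustration; a real platform handles assignment and reporting for you.

```typescript
// Minimal, tool-agnostic sketch of an AB test: assign each visitor to a
// variant deterministically, then record when they complete the goal.
// Variant names, element IDs, and the reporting function are illustrative only.

type Variant = "control" | "variant-a";

// Deterministic bucketing: the same visitor always sees the same variant.
function assignVariant(visitorId: string): Variant {
  let hash = 0;
  for (const char of visitorId) {
    hash = (hash * 31 + char.charCodeAt(0)) >>> 0;
  }
  return hash % 2 === 0 ? "control" : "variant-a";
}

// Record a successful conversion (e.g. a button click or form submission).
// In a real platform, its tracking snippet does this for you.
function logConversion(visitorId: string, variant: Variant, goal: string): void {
  console.log(`conversion: visitor=${visitorId} variant=${variant} goal=${goal}`);
}

// Usage: bucket the visitor, wire up the matching button, track the goal.
const visitorId = "visitor-123";
const variant = assignVariant(visitorId);
document.querySelector<HTMLButtonElement>(`#cta-${variant}`)?.addEventListener(
  "click",
  () => logConversion(visitorId, variant, "cta-click"),
);
```

The important idea is that assignment is deterministic (the same visitor always sees the same variant) and that a conversion is simply an event you define and count per variant.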
So, let's presume you have settled on an AB testing platform. Now you have to decide what to test.
Where AB Testing Ideas Come From
Fundamentally, the decision to run an AB test can come about for four reasons.
Something in site analytics indicates a particular part of the site is underperforming, so you want to experiment with ways to improve that page.
You have identified an improvement but want to mitigate the risk of rolling it out by testing it with a segment of your audience first.
You or a stakeholder is unhappy with some aspects of the website and wants to gather data to support or refute this feeling.
There is disagreement about the best approach in a particular circumstance, and you want to resolve the situation by testing.
The first two are typically the best reasons to test, but do not underestimate the value of the latter two. AB testing can be an effective and fast way of resolving disagreements and reassuring stakeholders.
Of course, the danger is that if you open the floodgates for every wacky concern or idea stakeholders have, you will never get around to testing real problems identified through data. To address this issue, you will need a robust process.
How to Prioritize and Manage Your AB Testing
I favor creating a simple set of steps that any idea passes through, which I organize into a Kanban board.
Ideas for tests are organized into an "ideas" column. Each idea is then reviewed, and a short proposal is written. The proposal outlines:
the hypothesis, including the expected outcome. For example, "changing the button's color will encourage more clicks."
the supporting argument. Why do you believe the test will provide positive results? What data supports it? For example, heat map data indicates users are not seeing the button.
the nature of the test. For example, how many versions of the button color do you intend to test, and who are you showing the variations to?
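To make that proposal a little more concrete, here is a hedged sketch of how you might capture one as a simple data structure. The field names and example values are my own, not taken from any particular tool.

```typescript
// Illustrative shape for an AB test proposal; the field names are hypothetical.
interface TestProposal {
  hypothesis: string;           // the expected outcome and why it matters
  supportingEvidence: string[]; // analytics, heat maps, survey quotes, etc.
  variants: string[];           // the versions being tested, including the control
  audience: string;             // who will see the variations
}

// Example based on the button-color hypothesis above.
const buttonColorTest: TestProposal = {
  hypothesis: "Changing the button's color will encourage more clicks.",
  supportingEvidence: ["Heat map data indicates users are not seeing the button."],
  variants: ["control (current color)", "high-contrast alternative"],
  audience: "a random 50% of all visitors",
};
```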
Once a proposal is written, it moves to a "proposal" column on the Kanban board. There, it is reviewed by stakeholders, who assign it to the "approved," "rejected," or "later" column. Approval should primarily be based on the evidence. But running a test just to shut up an opinionated stakeholder is valid, too!
Proposed tests that reach the "approved" column are prioritized and eventually rolled out. When a test launches, it moves to the "live" column, and when it completes, the results are written up and the test moves to a "done" column for final review.
In that final review of the results, stakeholders can decide whether to implement the winning variation, propose new tests, or both.
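If it helps to see the workflow spelled out, here is a small sketch of the columns as a TypeScript type, along with the path an approved test typically follows. The code is purely illustrative; the board itself can live in whatever Kanban tool you like.

```typescript
// Illustrative sketch of the Kanban columns a test idea moves through.
type Column =
  | "ideas"
  | "proposal"
  | "approved"
  | "rejected"
  | "later"
  | "live"
  | "done";

// The path a test follows when it is approved and run to completion.
const happyPath: Column[] = ["ideas", "proposal", "approved", "live", "done"];

// Stakeholder review of a proposal can end in one of three outcomes.
const reviewOutcomes: Column[] = ["approved", "rejected", "later"];
```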
A Word of Warning
There is so much more I could say about AB testing, and one day I may write a more comprehensive guide, but hopefully this will get you started. That said, I am very conscious that I have left a lot out, so if you have questions, email me at paul@boagworld.com, and I will do my best to help.
But I want to end by saying it is crucial to understand that AB testing is not always the most appropriate testing method.
AB testing works great for simple changes, such as altering text or changing buttons. However, testing a prototype may prove more effective when making complex changes, such as introducing a wizard instead of a form. That is because you would effectively have to build a fully working version of the wizard to AB test it, which could prove a waste of resources if the wizard then turns out to underperform.
Ultimately, when it comes to testing, you have to use the right tool for the job and not presume AB testing is always the appropriate approach.