Conversion rate optimization (CRO) gives you a framework to incrementally improve every aspect of your funnel over time. By analyzing the results of benchmark tests at different touchpoints, CRO offers methods to cap your customer acquisition costs, lift retention, improve your ROAS, and more. It might seem obvious, but this represents a cultural shift for many companies. If you don’t understand what’s possible with CRO, it’s easy to see every new feature, site update, or campaign tweak as a self-contained project. Yet when companies view CRO as a vital aspect of digital transformation, they can embrace a culture of experimentation with the goal of continuous improvement.
Solutions like A/B Tasty, Optimizely, and VWO allow you to go beyond feeling confident in your choices. With tools like these, you can attach statistical significance to a wide range of decisions across your website, product, and audience. And perhaps even more importantly, you can recognize which tests fail to significantly improve the metrics you care about. This is the essence of a digital transformation, because if you can’t understand what your data is telling you (or worse, you ignore it), you’ll never see any of the benefits.
Maybe you’re already applying CRO to landing pages in your growth marketing campaigns. Or perhaps you’re thinking of using A/B testing in your product analytics practice to improve checkout conversion rates. Or you might be experimenting with different versions of your messaging within your lifecycle marketing campaigns. No matter how you approach this discipline, remember that conversion rate optimization involves far more than just the tools involved. For many teams, CRO represents a whole new perspective on their data and how to achieve their goals.
In this article, we’ll look at where CRO makes sense (and where it doesn’t), the elements of an effective conversion rate optimization program, and how you can introduce CRO to your team. In many cases, CRO is the bridge between the strategic planning required for full-scale digital transformation, and its day-to-day practical application. Take the guesswork out of decisions that can impact new user conversion, acquisition, and retention. It’s time to get serious about conversion rate optimization.
Conversion rate optimization and A/B testing can help remove a lot of doubt and uncertainty from the decisions your team makes every single day. But CRO and A/B testing aren’t magic, and they aren’t the answer to all your questions. You’ll see the greatest benefit from CRO only if you know when to use it, when not to use it, and how to interpret the results when you apply it to an appropriate situation.
The whole point of CRO is to improve your top KPIs. CRO works well when:
Conversion rate optimization is ideal for improving metrics related to tactical decisions, especially when it comes to growth marketing and product management initiatives. It can even be used to measure the impact of more strategic projects, for example, with global hold-out groups for email marketing and geo-testing for advertising campaigns. CRO works well across a wide range of initiatives when you’re interested in measuring a series of incremental changes over time. However, there are a few cases where CRO and A/B testing won’t cut it:
While CRO won’t work in these cases, the exceptions illustrate how powerful A/B testing can be. For every major strategic initiative, there are thousands of tactical decisions that must be made every day, and many of these can benefit from conversion rate optimization. And the results of these A/B tests, when compounded over time, can inform all kinds of long-term projects. But in order to get to this point, first you need to nail the basics.
If you build a strong foundation for your A/B testing program, you’ll be able to run the right kinds of experiments more often, and you’ll be more confident with the results of those tests no matter what those results might tell you. But before we discuss the elements of an effective conversion rate optimization plan, it’s worthwhile to think about what underlies this foundation. Because even if you ace all the points we’re about to discuss, it won’t matter if your team doesn’t trust your data in the first place.
Every digital transformation initiative can benefit from investments in data governance. If you’re just getting started with CRO, this is one of the best ways to ensure accurate results. A lot of companies get so caught up in the pursuit of continuous improvement that they forget data governance has to come first. If your team doesn’t trust the integrity of your data, it will be impossible for you to run meaningful experiments.
Neglecting data governance is a slippery slope, but you can correct course with clear guidelines and a commitment to transparency. The payoff for this kind of initiative is immense: better data governance leads to fewer errors and greater efficiency in every area of your company, including conversion rate optimization. Refer to this straightforward approach to data governance for recommendations on how to get started.
Whether you’re building a new data governance plan from scratch or updating your existing policies, the following factors are equally vital to an effective conversion rate optimization program.
We mentioned it earlier, and it will come up again and again: A/B testing is far more than which tool you select. It takes careful planning and buy-in from your leadership team to build an A/B testing program that will align with your business goals. Of course, the root of good planning is good data governance. But much of the success of your A/B testing program goes back to the principles of effective change management:
Before we proceed any further, can we be blunt? If you can’t get buy-in for an A/B testing program that involves multiple teams, then don’t do it! Throughout our work with almost 900 clients, we’ve been called on time and time again to fix poor implementations of a wide range of data infrastructure and growth marketing systems, including conversion rate optimization solutions. It’s never just the technology. Demonstrate to your executive team how your new conversion rate optimization and A/B testing program will change people’s behavior, not simply what experiments you’ll be able to run. Do this, and you’ll already be well ahead of most other companies.
And while this article cannot address every aspect of change management, it’s important to keep these steps in mind as we walk through the elements of an effective A/B testing strategy. Hint: good data governance always comes first.
Let’s consider some of the mechanics of CRO for a moment, because they can reveal a lot about what makes an experiment successful (or not). In the following examples we describe CRO and A/B tests as they apply to experiments conducted on webpages, though these points also apply to your digital products and apps.
How do you choose a goal for your CRO tests? Test design requires a deep understanding of your audience, funnel, and key business metrics. You need to understand all the details of the user action you’re attempting to change. Get clear on your current conversion rate, and also ask yourself:
This last point is especially important, as we’ve seen teams attempt to improve checkout conversion rates by changing something much farther up the funnel. In this case, there are too many factors at play between the variation and the goal of the test for the results to be meaningful.
Remember that A/B tests are best suited to short-term effects from tactical decisions. Questions like the ones we’ve shared here offer important guardrails for any test you decide to perform. Just make sure you’re testing a variation which can impact the metric you care about. In fact, anytime you plan different CRO experiments, take care that the scope of your ambitions matches the nature of the test.
We included a vital caveat while explaining the benefits of A/B testing above. Did you catch it?
Here it is again:
“If you build a strong foundation for your A/B testing program, you’ll be able to run the right kinds of experiments more often, and you’ll be more confident with the results of those tests no matter what those results might tell you.”
A/B tests aren’t meant to confirm your biases. They’re supposed to cut through your biases to reveal the likelihood that your audience will respond to specific changes in your product, website, or messaging in ways that will improve your target KPIs.
Just because you run an experiment and don’t get the results you expect, that doesn’t mean the experiment was a failure. On the contrary, you’ve learned something new that goes beyond gut feelings. And of course, recognizing insights like this becomes much easier if your executive team encourages a culture of curiosity while embracing the value of conversion rate optimization.
But how do you set up an A/B test to deliver reliable results? And how do you interpret those results once the test is complete?
To put it another way, how do you run A/B tests that meet the threshold for statistical significance?
There are entire courses dedicated to statistical significance, so we won’t delve too deeply into the topic in this article. (For a good intro to statistical significance as it relates to A/B testing, start here). In short, the statistical significance of any A/B test indicates how confident you can be that the outcome of the test is legitimate, and not a result of random chance.
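To make this concrete, here’s a minimal sketch of a two-proportion z-test, one common way to turn raw visitor and conversion counts into a p-value. (Platforms like Optimizely use more sophisticated sequential methods, so treat this as an illustration, not a reimplementation of any vendor’s math. The function name and example numbers are ours.)

```python
from math import sqrt
from statistics import NormalDist

def ab_test_p_value(control_visitors, control_conversions,
                    variant_visitors, variant_conversions):
    """Two-sided p-value for a two-proportion z-test."""
    p1 = control_conversions / control_visitors
    p2 = variant_conversions / variant_visitors
    # Pooled conversion rate under the null hypothesis of "no difference"
    pooled = ((control_conversions + variant_conversions)
              / (control_visitors + variant_visitors))
    # Standard error of the difference between the two rates
    se = sqrt(pooled * (1 - pooled)
              * (1 / control_visitors + 1 / variant_visitors))
    z = (p2 - p1) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# 1,000 visitors per variation: a 10% baseline vs. 13% on the variant
p = ab_test_p_value(1000, 100, 1000, 130)
print(f"p-value: {p:.4f}")  # below 0.05, so significant at the common 95% level
```

A p-value below your chosen threshold (typically 0.05) means the observed lift is unlikely to be random chance; a higher p-value means you can’t rule chance out.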
How do you increase the statistical significance of your A/B tests? To answer this question, let’s walk through an example with Optimizely’s A/B test sample size calculator. Feel free to tweak the values in the calculator as we explain each of the options below.
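For reference, the classic fixed-horizon sample size formula behind calculators like this one can be sketched in a few lines. The exact figures Optimizely reports may differ, since its Stats Engine uses sequential testing; this sketch simply shows how baseline conversion rate and minimum detectable effect drive the required traffic:

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_variation(baseline_rate, relative_mde,
                              alpha=0.05, power=0.80):
    """Visitors needed per variation for a fixed-horizon two-proportion test.

    baseline_rate: current conversion rate (e.g. 0.10 for 10%)
    relative_mde:  minimum detectable effect, relative (e.g. 0.20 for a 20% lift)
    """
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_mde)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance level
    z_power = NormalDist().inv_cdf(power)          # desired statistical power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_power) ** 2 * variance / (p2 - p1) ** 2)

# 10% baseline, hoping to detect at least a 20% relative lift (to 12%)
print(sample_size_per_variation(0.10, 0.20))  # just under 4,000 visitors per variation
```

Notice what happens when you shrink the minimum detectable effect: halving it roughly quadruples the traffic you need, which is why small expected lifts demand long-running tests.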
You can apply the lessons from this calculator to any A/B test that you decide to perform on any A/B testing platform. However, there are a few crucial factors to consider that don’t show up on the calculator above:
We would be remiss if we didn’t also discuss statistical significance in A/B testing as it applies to your email campaigns. Unlike a website test, where it can take multiple days to receive enough traffic before you can check your results, email moves fast: over 70% of the people who might interact with the variation will do so within the first day, and the vast majority of text messages are opened within minutes of receipt. So map out your A/B testing plan before you hit send, because you’ll get the results much more quickly than if you were updating your website.
There are many different factors to consider when you run A/B testing experiments within your lifecycle marketing program, and all of them will affect your baseline conversion rate and your sample size. As we mentioned earlier, when running different experiments and A/B tests, you need to make sure the scope of your ambitions matches the nature of the test. But that doesn’t mean you can’t aim high. See what’s possible when you enhance your lifecycle marketing program with crystal clear A/B testing, and learn how to adapt your culture of curiosity to measure the impact of your email marketing.
Now that we’ve shared an introduction to statistical significance, set all that aside for a moment. One of the best ways to ensure that your A/B tests are meaningful is to check that the test’s potential impact is large enough to make it worth running in the first place. For example, if you want to run an A/B test that affects multiple departments, what will be the practical implications of updating your product to reflect the results of the test? If implementing the results of the test throughout your company has the potential to create an extra $10k in revenue per month, but it will pull three of your best team members off their current projects for a week, is that worth it? And even before you plan a new test, think about whether building the test at all will be justified by the potential ROI. Considering A/B testing in this way can help you avoid misguided decisions. And it’s one more reason to get buy-in from your executive team for an A/B testing program that involves multiple departments.
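A quick back-of-the-envelope calculation can make that trade-off concrete. The revenue figure below comes from the scenario above; the $4k weekly loaded cost per team member is a hypothetical placeholder you’d swap for your own numbers:

```python
def payback_months(monthly_revenue_lift, num_people, weeks, weekly_cost_per_person):
    """Months until an experiment's implementation cost is paid back."""
    implementation_cost = num_people * weeks * weekly_cost_per_person
    return implementation_cost / monthly_revenue_lift

# $10k/month lift; three team members pulled for one week, at an assumed
# $4k/week loaded cost each (hypothetical figure -- use your own)
print(payback_months(10_000, 3, 1, 4_000))  # 1.2 months to break even
```

If the payback period is measured in weeks, the test is probably worth running; if it stretches into quarters, your team’s time is likely better spent elsewhere.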
Conversion rate optimization (CRO) gives you a framework to incrementally improve every aspect of your funnel over time. We’ve seen evidence of this over and over again on a wide range of projects with many different clients. But it’s what we found on the other side of this assertion that should give everyone pause:
Our intuition is often pretty terrible.
This sobering fact actually points the way forward, towards continuous improvement. The solution? Combine A/B tests with product analytics for a more well-rounded, quantitative check on our intuition. If you conduct an A/B test on a certain metric with all the right criteria in place beforehand, you’ll be able to tell whether the result was statistically significant. But if you want to see how the user journey differed between the control and the variant, or how users interacted with a specific feature on those two paths, you need product analytics.
Want a complete view of your user journey? Combine the quantitative results of CRO & product analytics with qualitative data from heatmap solutions, session recording tools, and user interviews. If you use any of these tools in isolation, that’s like trying to take a photograph of a majestic landscape and only focusing on the tree 20 feet in front of you, or the blade of grass six inches away. In these cases, it’s easier to see only what you want to see. If you want to see everything, combine CRO with your product analytics strategy.
For a deeper perspective on the relationship between CRO, A/B testing, and product analytics, please enjoy this podcast hosted by VWO. Though keep in mind as you listen, that it’s never just about the technology. The most effective CRO strategies elevate change management and data governance while recognizing the nuances of each individual solution.
By now it should be clear: spinning up a winning CRO program doesn’t happen overnight. There are many potential roadblocks on the way from implementation to adoption, and even before you identify use cases and research different platforms, you need to get buy-in from your executive team.
For organizations that want to update or improve their current A/B testing programs, many of these same obstacles apply. A lack of preparation in cross-functional collaboration, data governance, or any of the other elements of an effective CRO program is more than enough justification for a reset, and for many companies a ground-up rework is the best way to reinvigorate an existing A/B testing strategy.
Best case scenario: your data governance library is clear, straightforward, and accurate. You understand how a new or revamped A/B testing platform will affect stakeholders throughout the company. And you know which questions you want to A/B test to improve big KPIs through a series of tactical changes. Even within this best case scenario, how do you get started?
To guide companies through this process quickly and efficiently, Mammoth Growth developed our Analytics Roadmap program. This six-week project delivers a complete audit of a company’s tech stack, culminating in a roadmap for improving its CRO and A/B testing strategies. Along the way, we develop a deeper understanding of each client’s existing data governance, reporting, and CRO pain points, so together we can build a customized plan to address their A/B testing goals.
When approaching CRO projects, Mammoth Growth follows these steps within our Analytics Roadmap program:
The outcome of this process is a roadmap for the client’s conversion rate optimization strategy: how to plan it, what results they’re aiming for, and what benchmarks define success.
For every company, there’s a unique combination of CRO strategy, technologies, and process improvements that can streamline digital transformation while allowing for more consistent, targeted decision making. Here at Mammoth Growth, as we move through the steps of mapping your Analytics Roadmap and defining a new A/B testing strategy, we adopt an agile approach to deliver business value as quickly as possible. Contact one of our experts today, and let’s talk about your conversion rate optimization goals.