Max van der Heijden is a user experience and conversion specialist at Google who works with companies across Europe, the Middle East, and Africa. Max shares his thoughts about how companies can build a culture of growth and experimentation.
How many times have you launched new features or page designs on your website without testing first?
In an ideal world, companies should test everything before rolling out site changes. But some websites have too little traffic to generate credible results from experiments, and some bugs should just be fixed if they prevent users from achieving their goal. At the very least, analyze your analytics data and use qualitative methods such as user tests and surveys to validate any improvement ideas you have before implementing. If you have the traffic volume: Test!
I’m part of a team at Google that works with advertisers to identify opportunities for improving website user experiences through experiments and testing roadmaps. When our team of UX specialists begins consulting with a new company, the first three things I tell them are:
- The possibilities for improvement are enormous. Even if an experiment increases your conversion rate by “just 5%,” you can calculate the positive effect on your revenue.
- What works for one may not work for all. No matter how many times we have seen recommendations or “best practices” work on other — maybe even similar — websites, that does not mean it will work for your users or your business.
- Expect failures — and learn from them. Testing takes time, and it's hard to know which tests will pay off. Embrace failures and the lessons learned from them.
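To make the first point concrete, here is a minimal back-of-the-envelope sketch of how a "just 5%" conversion lift translates into revenue. All of the traffic and order-value numbers below are made up for illustration; plug in your own.

```python
# Hypothetical illustration: the revenue effect of a "just 5%" conversion lift.
# All numbers are made up for the example — substitute your own site's figures.
monthly_visitors = 200_000
conversion_rate = 0.02          # 2% of visitors convert
avg_order_value = 80.0          # revenue per conversion

baseline_revenue = monthly_visitors * conversion_rate * avg_order_value
lifted_revenue = monthly_visitors * (conversion_rate * 1.05) * avg_order_value

print(f"Baseline revenue: {baseline_revenue:,.0f}/month")
print(f"With a 5% lift:   {lifted_revenue:,.0f}/month")
print(f"Difference:       {lifted_revenue - baseline_revenue:,.0f}/month")
```

A relative lift of 5% on the conversion rate flows straight through to a 5% lift in revenue, which compounds month after month for as long as the winning variation stays live.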
Making the switch from “get-it-live” mode to a test-and-learn mindset takes time and effort. Leading companies are building a culture of growth: one where people focus on using data and testing to optimize the customer experience day by day. Below are some of the key lessons we've learned while working with teams embracing this growth mindset.
Get top-level support
When we first talk with customers, we insist that a decision-maker attend our meetings. If there's no support from the top, all of our testing ideas could end up on the shelf collecting dust. Chances are, the marketing executive or CEO won’t have an a-ha moment if you frame testing simply as a way to improve conversions. The trick is to show how testing impacts a business goal, such as revenue or, better yet, profit. Then the decision-maker will have an ohhh moment, as in: “Ohhh, I knew this was important, but I didn’t think about how a small change could have such a big impact on our bottom line.”
Top-level support will help you get the resources you need and unlock the potential of people already working on experiments. Typically, we see one or two people who start doing the optimizations. They are usually mid-level designers or data analysts who have an affinity for conversion rate optimization but are often working in a silo.
On the other end of the spectrum, we see companies that have fully bought into the power of experimentation. Multiple customers even have a group of product managers who work on projects with a group of specialists, including a data scientist, copywriter, designer, and even a design psychologist.

Tip: Look for these three types of people to jumpstart a culture of growth in your organization.
Prioritize, prioritize, prioritize
You can't test every idea at once. And prioritization should not be a guessing game.
When we surveyed a group of our EMEA advertisers at a conversion rate optimization event, 38% of the respondents said they use their gut or instinct to prioritize, while 14% allow the HiPPO (highest paid person’s opinion) to call the shots.1
Instead, try using a framework that takes into account past lessons learned and resource requirements. Map test ideas in a speed-versus-impact grid, and prioritize experiments that are quick to launch and likely to have the biggest impact. Keeping track of all prior test results is another way to ensure past learnings come into play when faced with a HiPPO.

Tip: Start with ideas that will be simple to test and look like they could have the biggest potential impact.
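One simple way to turn the speed-versus-impact grid into a ranked backlog is to score each idea for expected impact and for effort, then sort. The ideas and scores below are hypothetical, and a single impact-over-effort ratio is just one possible scoring rule:

```python
# A minimal sketch of speed-versus-impact prioritization.
# Ideas and 1-5 scores are hypothetical; higher effort = slower to launch.
ideas = [
    {"name": "Shorten checkout form",  "impact": 4, "effort": 2},
    {"name": "Redesign homepage hero", "impact": 3, "effort": 5},
    {"name": "Clarify shipping costs", "impact": 5, "effort": 1},
    {"name": "Add trust badges",       "impact": 2, "effort": 1},
]

# Quick wins first: high expected impact relative to effort.
ideas.sort(key=lambda idea: idea["impact"] / idea["effort"], reverse=True)

for rank, idea in enumerate(ideas, start=1):
    print(rank, idea["name"])
```

Even a rough scorecard like this beats gut feel or the HiPPO: the scores are written down, so they can be challenged, and updated as past test results come in.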
Turn fairweather fans into engaged experimenters
Over time, as you share testing insights and achieve a few wins, more people will jump on board and you’ll need to train them on a repeatable testing framework.
Testing is part of a cycle: What does the data tell you? Did the experiment succeed or fail for every user, or just for a specific segment of users? Analyze your test results, especially from failed experiments, and use those insights to improve the customer experience across your touchpoints. And then conduct another test.
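The "did it work for every user, or just a segment?" question can be answered with a standard two-proportion z-test run once on all users and again per segment. The sketch below uses only the standard library and entirely made-up conversion counts; it is a minimal illustration, not a substitute for a proper testing platform:

```python
from math import sqrt, erf

def z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test: did variant B convert differently from A?
    Returns (relative_lift, two_sided_p_value). Minimal sketch only."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b / p_a - 1, p_value

# Hypothetical results: flat overall, but a clear win in the mobile segment.
print("All users:", z_test(1000, 50_000, 1020, 50_000))
print("Mobile:   ", z_test(300, 20_000, 370, 20_000))
```

With these made-up numbers the overall result is statistically indistinguishable from noise, while the mobile segment shows a significant lift — exactly the kind of insight a segment-level analysis surfaces from an "unsuccessful" test.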
Just as important: How do you keep people excited and engaged in the process? Try using a shared document to invite everyone to submit their improvement suggestions for your website or app. You can even add gamification to this by keeping score of the most impactful ideas. Or, have people guess which test variation will win before you run the test. When you share the results, recognize or reward people who correctly predicted the winner.
Tip: Three ways to get your team engaged with testing and optimization
Feel good about failures
By its very nature, experimentation involves a lot of failure. A typical website might have 10 or 100 or even 1,000 things to test, but it might be that only a small percentage of those tests lead to significant, positive results. Of course, if that one winner leads to a 5% or 10% improvement in conversions, the impact on revenue can be enormous.
When we surveyed EMEA advertisers at one of our events, we found that companies running one to two tests a month had a 51% success rate. But for respondents who said they ran more than 21 tests a month, the success rate decreased to 17%.2
In the beginning, it’s easier to spot areas for improvement and “low-hanging fruit.” The more experiments you run, the more you’ll be focusing on smaller and smaller things. Then, the more you test, the less “successful” you will be. "Our test success rate is about 10%," says Jesse Nichols, Head of Growth at Nest. "But we learn something from all our tests."
1-2 Source: Conversions@Google 2016 - State of CRO event attendee survey, 145 respondents, EMEA, September 2016.