
Deliver more relevant experiences with Optimize and AdWords

Search is one of the most important acquisition channels in a marketer’s toolkit. But it’s not enough to just optimize search ads. It’s essential to consider the entire customer journey and keep people engaged once they reach your site. That’s why we introduced an integration between Optimize and AdWords to make it easy for marketers to test and create personalized landing pages.

How Spotify boosted conversions with Optimize and AdWords


Spotify, one of the world’s leading audio streaming services, is just one example of a company that has successfully used the Optimize and AdWords integration to drive more conversions from its search campaigns. Spotify discovered that the most streamed content in Germany was actually audiobooks, not music. So the team wanted to show German users Spotify’s wide selection of audiobooks, and that the experience of listening to them is even better with a premium subscription.

Using the AdWords integration with Optimize 360 (the enterprise version of Optimize), Spotify ran an experiment that focused on users in Germany who had searched for "audiobooks" on Google and clicked through on their search ad. Half of these users were shown a custom landing page dedicated to audiobooks, while the other half were shown the standard page. The custom landing page increased Spotify’s premium subscriptions by 24%. 

“Before, it was a fairly slow process to get all these tests done. Now, with Optimize 360, we can have 20 or more tests running at the same time. It’s important that we test a lot, so it doesn’t matter if we fail as long as we keep on testing,” said Joost de Schepper, Spotify’s Head of Conversion Optimization.

Watch Spotify’s video case study to learn more



Driving your own results

Today, we’re announcing three new updates to make it easier for all marketers to realize the benefits that Spotify saw from easily testing and creating more relevant landing pages:

1. Connect Optimize with the new AdWords experience

You can connect Optimize to AdWords in just a few steps. Follow these instructions to get started.

Not using the new AdWords experience yet? Make the switch to gain access to more actionable insights and faster access to new features.

2. Link multiple AdWords accounts at once

For advertisers that have many AdWords accounts under a manager account, individually linking each of those sub-accounts to Optimize can be time consuming.

Now, you can link your manager account directly to Optimize. This will pull in all your AdWords accounts at once, allowing you to immediately connect data from separate campaigns, ad groups, and more. To get started, switch to the new AdWords experience; you’ll then see an option to link your manager account under Linked accounts. Learn more.

3. Gain more flexibility with your keywords

You can now run a single experiment for multiple keywords, even if they’re across different campaigns and ad groups. For example, test the same landing page for users that search for “chocolate chip cookies” in your “desserts” ad group and for users that search for “iced coffee” in your “beverages” ad group.

With the Optimize and AdWords integration, driving results through A/B testing is fast and simple. Sign up for an Optimize account at no charge and get started today.

Happy Optimizing!


Test and Build for Mobile with Google Optimize

From buying new shoes to booking weekend getaways, mobile can make life more convenient for consumers — and create big wins for marketers. While 40% of consumers will leave a web page that takes longer than three seconds to load, 89% of people are likely to recommend a brand after a positive brand experience on mobile.1 That's why getting your mobile site in shape is more important than ever.

To create the seamless and responsive mobile site that consumers expect, you need the right tools, like Google Optimize. Optimize makes it easy to test different elements of your site to find the winning combination for the best mobile site possible. Now it’s even easier with our new responsive visual editor – and be sure to read on and learn how two of our clients found mobile success with Optimize 360, our enterprise version.

New! Preview your mobile site on any screen size 


While almost everyone has a mobile device, there are so many variations and screen sizes that it’s hard to take a one-size-fits-all approach to optimizing your mobile site. Now, once you’ve created your test page, you can use the new responsive editor to immediately preview what it looks like on any screen size. Or, if you want to see how it appears on a specific device, like a Nexus 7 or iPad, we’ve added more devices that you can select to preview. Learn more about the visual editor here.


Turn ideas to tests quickly 


The responsive visual editor in Optimize is just one solution to help marketers succeed on mobile. Our enterprise version, Optimize 360, makes it easy to make improvements to mobile sites efficiently and rapidly.

Dutch airline carrier Transavia Airlines turned to Optimize 360 to try out different ideas on its mobile site. In fact, the team runs about 10 A/B tests each month on the site, all without having to spend significant time or effort. And the best part? Time spent on analyzing the success of site tests has fallen by 50%. This allows Transavia to focus more on testing to improve its mobile site. Learn more in the full case study.

The path to mobile excellence starts with the customer journey 


Need some help determining what you should test on your mobile site? Google Analytics 360 is a great place to start. You’ll be able to analyze any customer interaction, from search to checkout, to figure out which points of your purchase process need help. Then, once you’ve determined where your site needs work, using Optimize 360 to take action is simple, since it’s natively integrated with Analytics 360.

This is exactly how fashion retailer Mango used Analytics 360 and Optimize 360 to tackle its mobile site: After discovering that mobile visits to its online store had skyrocketed 50% year over year, Mango decided to dig a little deeper. In Analytics 360 Mango discovered that while many consumers browsed product listing pages, few were taking the next step to add products to their shopping cart. To reduce steps to checkout, Mango used Optimize 360 to include an “Add” button to product listing pages. This increased the number of users adding products to their carts by 49%. Find out more in the full case study.

Ready to optimize your own mobile site? 


Start testing new mobile experiences with the responsive visual editor in Optimize. This update is one that can help marketers do more on mobile — because whether it’s changing a button or fine-tuning a homepage with quick A/B tests, we’ve learned that small tweaks can make a big impact.

And, if you haven’t already, sign up for a free Optimize account and give it a try.

1 Google / Purchased: "How Brand Experiences Inspire Consumer Action" April 2017. US Smartphone Owners 18+ = 2010, Brand Experiences = 17,726.

Better A/B Testing with Firebase

Earlier this year, the Google Optimize and Firebase teams worked together to bring A/B testing functionality to Firebase. Last week, at the Firebase Dev Summit, we announced that A/B testing is now available in beta to all app developers.

This post originally appeared on The Firebase Blog.







Announcing Better A/B Testing with Firebase 


If you're like most app developers, you know that small changes can often make a big difference in the long term success of your app. Whether it's the wording that goes into your "Purchase" button, the order in which dialogs appear in your sign-up flow, or how difficult you've made a particular level of a game, that attention to detail can often make the difference between an app that hits the top charts, or one that languishes. 


But how do you know you've made the right changes? You can certainly make some educated guesses, ask friends, or run focus groups. But often, the best way to find out how your users will react to changes within your app is to simply try out those changes and see for yourself. And that's the idea behind A/B testing; it lets you release two (or more!) versions of your app simultaneously among randomly selected users to find out which version truly is more successful at getting the results you want. 
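The mechanics of "randomly selected users" are worth making concrete. The Python sketch below is purely illustrative (it is not Firebase's implementation): it assigns each user to a variant by hashing the user ID together with an experiment name, which gives a stable pseudo-random split, so the same user always sees the same variant for a given experiment.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants: list) -> str:
    """Deterministically assign a user to one variant of an experiment.

    Hashing (experiment + user_id) yields a stable pseudo-random bucket:
    the same user always lands in the same variant, while different
    experiments bucket the same user independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Example: split users between two wordings of a "Purchase" button
variant = assign_variant("user-42", "purchase_button_copy", ["control", "treatment"])
```

Because assignment is a pure function of the IDs, no per-user state needs to be stored to keep the experience consistent across sessions.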


And while Firebase Remote Config did allow you to perform some simple A/B testing through its "random percentile" condition, we've now added an entirely new experiment layer in Firebase that works with Remote Config and notifications to make it quick and easy to set up and measure sophisticated A/B tests. Let's take a quick tour of how it works!


Getting to Know the New A/B Testing Feature 


With the new A/B testing feature, you can create an A/B test that will allow you to play with any combination of values that you can control through Remote Config. Setting up an A/B test allows you to define how the experiment will behave in a number of different ways, including determining how many of your users are involved with the experiment at first…


…how many variants you want to run, and how your app might behave differently for each variant…


...and what the goal of the experiment is.


Different experiments might have different desired goals, and A/B testing supports a number of common outcomes, like increasing overall revenue or retention in your app, reducing the number of crashes, or increasing the occurrence of any event you're measuring in Google Analytics for Firebase, such as finishing your in-app tutorial.

Once you've defined your A/B test, Firebase takes over by delivering these different variations of your app to randomly-selected members of your audience. Firebase will then measure your users' behavior over time, and let you know when an experiment appears to be performing better, based on those goals you've defined earlier. Firebase A/B testing measures these results for you with the same Bayesian statistical models that power Google Optimize, Google's free testing and personalization product for websites.
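To give a feel for the Bayesian idea behind "probability to beat baseline," here is a minimal Monte Carlo sketch (an illustration of the general Beta-Binomial approach, not the actual models used by Optimize or Firebase): each arm's conversion rate gets a Beta posterior, and sampling both posteriors estimates how often the variant truly beats the original.

```python
import random

def prob_to_beat_baseline(conv_a, n_a, conv_b, n_b, samples=100_000, seed=7):
    """Estimate P(rate_B > rate_A) under Beta(1, 1) priors via Monte Carlo.

    conv_*/n_* are conversions and visitors for the baseline (A) and the
    variant (B). Each posterior is Beta(conversions + 1, non-conversions + 1);
    we draw from both and count how often the variant wins.
    """
    rng = random.Random(seed)
    wins = 0
    for _ in range(samples):
        rate_a = rng.betavariate(conv_a + 1, n_a - conv_a + 1)
        rate_b = rng.betavariate(conv_b + 1, n_b - conv_b + 1)
        if rate_b > rate_a:
            wins += 1
    return wins / samples

# Hypothetical experiment: 120/1000 baseline conversions vs. 150/1000 variant
p = prob_to_beat_baseline(120, 1000, 150, 1000)
```

One appeal of this approach is that the result reads directly as "the variant has a p chance of being better," rather than requiring a p-value interpretation.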

Using A/B Tests for Better Onboarding: A Case Study 


Fabulous, a motivational app for building better habits, recently made improvements to its app's onboarding by using Firebase A/B testing. When the user first starts the app, Fabulous shows them how to complete a habit, presents them with a letter about forming better habits, and then asks them to commit to a simple routine. The team suspected that if they removed a few steps from this onboarding process, more people might complete it.


Some of the screens a typical user encounters when first using Fabulous.
 
So they ran an A/B test where some users didn't see the letter, others didn't see the request to commit to a simple routine, and others skipped both of those steps. The Fabulous team found that by removing both of these steps from the onboarding process, there was a 7% improvement in the rate of users completing the onboarding flow. More importantly, they confirmed that this shorter onboarding experience didn't have any impact on their app's retention.

Test Your Notifications, Too! 


You also have the ability to A/B test your app notification messages through the Firebase Notifications console. You can try out different versions of your notification message and see which ones lead to more users opening up your app from that notification, or which messages lead to users performing some intended goal within your app, like making a purchase.

Getting Started 


A/B testing is available in beta to all Firebase developers starting today. If you're excited to get started, you should make sure that your app is wired up to use Remote Config and/or Firebase Cloud Messaging, and that you've updated these libraries to the latest and greatest versions. You can always find out more about A/B testing in our documentation, or check out the A/B Test Like a Pro video series we've been building.

Then, head on over to the Firebase Console and start making your app better — one experiment at a time!


Google Optimize now offers more precision and control for marketers

Savvy businesses review every step of the customer journey to ensure they are delivering the best experience and to find ways to offer more value. Today, we’re releasing two new features that will make it easier for you to improve each of those steps with the help of Google Optimize and Optimize 360.

AdWords integration: Find the best landing page 


Marketers spend a lot of time optimizing their Search Ads to find the right message that brings the most customers to their site. But that's just half the equation: Sales also depend on what happens once people reach the site.

The Optimize and AdWords integration we announced in May gives marketers an easy way to change and test the landing pages related to their AdWords ads. This integration is now available in beta for anyone to try. If you’re already an Optimize user, just enable Google Optimize account linking in your AdWords account. (See the instructions in step 2 of our Help Center article.) Then you can create your first landing page test in minutes.

Suppose you want to improve your flower shop's sales for the keyword “holiday bouquets.” You might use the Optimize visual editor to create two different options for the hero spot on your landing page: a photo of a holiday dinner table centerpiece versus a banner reading "Save 20% on holiday bouquets." And then you can use Optimize to target your experiment to only show to users who visit your site after searching for “holiday bouquets.”

If the version with the photo performs better, you can test it with other AdWords keywords and campaigns, or try an alternate photo of guests arriving with a bouquet of flowers.

Objectives: More flexibility and control 


Since we released Optimize and Optimize 360, users have been asking us for a way to set more Google Analytics metrics as experiment objectives. Previously, Optimize users could only select the default experiment objectives built into Optimize (like page views, session duration, or bounces), or select a goal they had already created in Analytics.

With today's launch, Optimize users no longer need to pre-create a goal in Analytics: they can create the experiment objective right in Optimize:


Build the right objective for your experiment directly in the Optimize UI.

When users build their own objective directly in Optimize, we’ll automatically check whether what they’ve set up is correct.

Plus, users can also set their Optimize experiment to track against things like Event Category or Page URL.

Learn more about Optimize experiment objectives here.

Why do these things matter? 


It's always good to put more options and control into the hands of our users. A recent study showed that marketing leaders – those who significantly exceeded their top business goal in 2016 – are 1.5X as likely to say that their organizations currently have a clear understanding of their customers' journeys across channels and devices.1 Testing and experimenting is one way to better understand and improve customer journeys, and that's what Optimize can help you do best.

Check out these new features in Optimize now.


1Econsultancy and Google, "The Customer Experience is Written in Data", May 2017, U.S.

Sigma Sport spins up 28% higher revenue with Google Optimize 360

If you’re a road cyclist or triathlete, chances are you know Sigma Sport. This global retailer sells bikes, clothing, energy bars, anti-chafe balm and everything else you need to power your way through your next big event — or just enjoy your next friendly ride in the country.

Recently Sigma Sport set out to address a vital need: to find more customers with high potential lifetime value. “Growth with high-value customers is key to our success,” says Nik Hill, the company’s Head of Digital. “We knew we needed to change our website experience to better engage these customers.”

To reach its goal, Sigma Sport turned to its agency, the digital conversion specialists Merkle | Periscopix. And together they turned to Google Optimize 360, part of the Google Analytics 360 Suite.

Using Optimize 360, Merkle | Periscopix created an experiment where they replaced Sigma Sport’s homepage carousel with brand-specific images of the site’s three top-performing brands: Castelli, Specialized, and Assos. Then they targeted the experiment to the audiences they had already built in Analytics.

This allowed Merkle | Periscopix to serve personalized experiences to fans of each brand. “We used the Analytics audience targeting feature in Optimize 360 to serve bespoke experiences to subsets of users,” says Shahina Meru, Associate Analytics Lead at Merkle | Periscopix. “We created three distinct Analytics audiences who had earlier bought or interacted with the top three brands, then used these as targeting rules in Optimize 360. Anyone who had looked at or bought a Specialized bike in the past, for instance, now saw Specialized products in their carousel.”


When Sigma Sport tested its new personalized home page, it was a hit with users right away: the experiment drove a 28% rise in revenue and a 32% increase in e-commerce conversion rate. In fact, Sigma Sport saw uplift across the entire customer journey, with a 90%+ probability to beat the baseline.

The bottom line: Personalization worked, both for bike-shopping customers and for Sigma Sport. Now Merkle | Periscopix is looking for more ways to enhance user experience with personalization from Analytics and Optimize 360.


Why Your Testing and Optimization Team Needs a Data Storyteller

If a test happens on your website and nobody hears about it, does it make a sound?

Not to get too philosophical, but that's one of the big challenges of building a culture of growth and optimization: getting the word out. That's why a data storyteller is one of the key members of any testing team.

In fact, “communication and data storytelling” was cited as a critical skill for the person who leads testing and optimization efforts, according to a survey of marketing leaders who conduct tests and online experiments.1 Rounding out the top three must-have skills were leadership and, of course, analytics.



A data storyteller is part numbers-cruncher, part internal marketer, and part ace correspondent from the testing trenches. He or she is someone who can take the sheer data of testing — the stacks of numbers, the fractional wins and losses, the stream of daily choices — and turn it into a narrative that will excite the team, the office, and (especially) the C-suite.

Storytelling doesn't just mean bragging about successes. It can also mean sharing failures and other less-than-optimal outcomes. The point is not just to highlight wins: it's to reinforce a culture of growth, to generate interest in experimentation, and to explain why testing is so good for the company.

"Our test success rate is about 10%," says Jesse Nichols, Head of Growth for Nest. "We learn something from all our tests, but only one in 10 results in some kind of meaningful improvement." That means that a big part of the data storyteller's job is to keep people interested in testing and show them the value.

Watch our on-demand webinar “Test with success — even when you fail” to hear more testing and optimization tips.


If you're the data storyteller for your team, here are three points to remember:
  • Take the long view.  Gaining support for testing is like rolling a rock up a hill: slow going at first, but once you cross the summit the momentum will take over fast. It takes time, so lay the groundwork with lots of short reports. Don't wait to make formal presentations: Look for chances to drop your message into weekly wrap-ups and other group forums. In short, don’t be afraid to over-communicate. 
  • Be specific. It's better to present one great number than 10 so-so ones. Think mosaic rather than mural: Look for specific stories that can represent your larger efforts and broader plans. 
  • Keep your eye on the bottom line. In the end, that's what it's all about. You may be thrilled that a call-to-action change from "see more" to "learn more" increased clicks by .03%, but what will really get the CMO and other executives interested is moving the profit needle. As a litmus test, ask yourself, “So what?” If your story doesn’t clearly answer the question in terms the audience cares about, consider giving it a rewrite. 
And remember that it won't always be small victories. "The things you're so sure are going to work are the ones that go nowhere," says Jesse. "Then you do a throwaway test and it makes the company an extra $500,000." That's a story that everyone will want to hear.


Download our eBook How to Build a Culture of Growth to learn more best practices on testing and optimization.


1Source: Google Surveys, U.S., "Marketing Growth and Optimization," Base: 251 marketing executives who conduct A/B tests or online experiments, Oct. 2016.

Lessons Learned: Testing and Optimization Tales from the Field

Max van der Heijden is a user experience and conversion specialist at Google who works with companies across Europe, the Middle East, and Africa. Max shares his thoughts about how companies can build a culture of growth and experimentation.


How many times have you launched new features or page designs on your website without testing first?

In an ideal world, companies should test everything before rolling out site changes. But some websites have too little traffic to generate credible results from experiments, and some bugs should just be fixed if they prevent users from achieving their goal. At the very least, analyze your analytics data and use qualitative methods such as user tests and surveys to validate any improvement ideas you have before implementing. If you have the traffic volume: Test!

I’m part of a team at Google that works with advertisers to identify opportunities for improving website user experiences through experiments and testing roadmaps. When our team of UX specialists begins consulting with a new company, the first three things I tell them are:

  1. The possibilities for improvement are enormous. Even if an experiment increases your conversion rate by “just 5%,” you can calculate the positive effect on your revenue.
  2. What works for one may not work for all. No matter how many times we have seen recommendations or “best practices” work on other — maybe even similar — websites, that does not mean it will work for your users or your business.
  3. Expect failures — and learn from them. Testing takes time, and it's hard to know which tests will pay off. Embrace failures and the lessons learned from them.
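The first point is simple to put in numbers. The figures below are hypothetical, chosen only to illustrate how a "just 5%" relative lift in conversion rate flows straight through to revenue:

```python
def projected_revenue(visitors, conv_rate, avg_order_value, relative_lift=0.0):
    """Project revenue from traffic, conversion rate, and average order value.

    relative_lift is the relative improvement in conversion rate that an
    experiment delivered (e.g. 0.05 for a "just 5%" winner).
    """
    return visitors * conv_rate * (1 + relative_lift) * avg_order_value

# Hypothetical shop: 200,000 visits/month, 2% conversion, $60 average order
baseline = projected_revenue(200_000, 0.02, 60)         # ~$240,000/month
with_lift = projected_revenue(200_000, 0.02, 60, 0.05)  # ~$252,000/month
extra = with_lift - baseline                            # ~$12,000/month
```

At that (made-up) scale, a single modest winner pays for a lot of failed experiments, which is the point of the third lesson below as well.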

Making the switch from “get-it-live” mode to a test-and-learn mindset takes time and effort. Leading companies are building a culture of growth: one where people focus on using data and testing to optimize the customer experience day by day. Below are some of the key lessons learned as we work with teams embracing this growth mindset.

Get top-level support

When we first talk with customers, we insist a decision-maker attend our meetings. If there's no support from the top, all of our testing ideas could end up on the shelf collecting dust. Obviously, the marketing executive or CEO won’t have an a-ha moment if you frame testing as a way to improve conversions. The trick is to show how testing impacts a business goal, such as revenue or, better yet, profit. Then the decision-maker will have an ohhh moment: As in, “Ohhh, I knew this was important, but I didn’t think about how a small change could have such a big impact on our bottom line.”

Top-level support will help you get the resources you need and unlock the potential of people already working on experiments. The typical pattern we see starts with one or two people doing the optimizations. They are usually mid-level designers or data analysts who have an affinity for conversion rate optimization, but they are often working in a silo.

On the other end of the spectrum, we see companies that have fully bought into the power of experimentation. Multiple customers even have a group of product managers who work on projects with a group of specialists, including a data scientist, copywriter, designer, and even a design psychologist.

Tip: Look for these three types of people to jumpstart a culture of growth in your organization.

Prioritize, prioritize, prioritize

You can't test every idea at once. And prioritization should not be a guessing game.

When we surveyed a group of our EMEA advertisers at a conversion rate optimization event, 38% of the respondents said they use their gut or instinct to prioritize, while 14% allow the HiPPO (highest paid person’s opinion) to call the shots.1 Instead, try using a framework that takes into account past lessons learned and resource requirements.

Map test ideas in a speed-versus-impact grid, and prioritize experiments that are quick to launch and likely to have the biggest impact. Keeping track of all prior test results is another way to ensure past learnings come into play when faced with a HiPPO.
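One lightweight way to run a speed-versus-impact grid is to score each idea and sort. The sketch below is a hypothetical example of that kind of scoring; the field names, scales, and the impact/effort ratio are assumptions for illustration, not a Google framework:

```python
# Score test ideas on expected impact (1-10) vs. effort (1-10).
# A simple impact/effort ratio favors quick-to-launch, high-impact tests.
ideas = [
    {"name": "Shorten checkout form", "impact": 8, "effort": 5},
    {"name": "New hero image",        "impact": 4, "effort": 1},
    {"name": "Rebuild navigation",    "impact": 9, "effort": 9},
]

def priority(idea):
    """Higher is better: big expected impact for little effort."""
    return idea["impact"] / idea["effort"]

ranked = sorted(ideas, key=priority, reverse=True)
# "New hero image" ranks first (4/1 = 4.0): quick to launch, decent impact.
```

Keeping the scores and results in a shared sheet also gives you a record to point to the next time a HiPPO wants to skip the queue.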

Tip: Start with ideas that will be simple to test and look like they could have the biggest potential impact.


Turn fairweather fans into engaged experimenters

Over time, as you share testing insights and achieve a few wins, more people will jump on board and you’ll need to train people on a repeatable testing framework.

Testing is part of a cycle: What does the data tell you? Did the experiment succeed or fail for every user, or just for a specific segment of users? Analyze your test results, especially from failed experiments, and use those insights to improve the customer experience across your touchpoints. And then conduct another test.

Just as important: How do you keep people excited and engaged in the process? Try using a shared document to invite everyone to submit their improvement suggestions for your website or app. You can even add gamification to this by keeping score of the most impactful ideas. Or, have people guess which test variation will win before you run the test. When you share the results, recognize or reward people who correctly predicted the winner.
Tip: Three ways to get your team engaged with testing and optimization

Feel good about failures

By its very nature, experimentation involves a lot of failure. A typical website might have 10 or 100 or even 1,000 things to test, but it might be that only a small percentage of those tests lead to significant, positive results. Of course, if that one winner leads to a 5% or 10% improvement in conversions, the impact on revenue can be enormous.

When we surveyed EMEA advertisers at one of our events, we found that companies running one to two tests a month had a 51% success rate. But for respondents who said they ran more than 21 tests a month, the success rate decreased to 17%.2

In the beginning, it’s easier to spot areas for improvement and “low-hanging fruit.” The more experiments you run, the more you’ll be focusing on smaller and smaller things. Then, the more you test, the less “successful” you will be. "Our test success rate is about 10%," says Jesse Nichols, Head of Growth at Nest. "But we learn something from all our tests."

Download the guide How to Build a Culture of Growth to learn more about best practices for testing and optimization.

1-2 Source: Conversions@Google 2016 - State of CRO event attendee survey, 145 respondents, EMEA, September 2016.

‘All Killer, No Filler’: The Next Web finds the right message with Google Optimize 360

In a world where consumer behavior can shift on a dime, marketers constantly ask themselves: How can we be more useful to our customers? With all the data businesses collect, the challenge becomes tuning out the noise to focus on insights your team can act on.

Today’s most successful businesses have turned to a new approach: building a culture of growth and optimization. This is where everyone in an organization is using data to test and learn as a means to improve the customer experience every day.

The Next Web, a technology-media company and online publisher, has embraced this testing culture and turned to Google Optimize 360 to help them find just the right message to drive readers to their conference website.

The Next Web Case Study 


The Next Web’s conferences bring tech leaders, entrepreneurs, and marketers together to innovate, share, and look ahead. The first TNW conference was created in 2006 by Patrick de Laive and Boris Veldhuijzen van Zanten, when they couldn’t find the kind of event they needed to showcase their own startup.

That first event drew a respectable 280 attendees, but the founders knew they needed a better way to promote future TNW conferences. That’s when they launched thenextweb.com, a tech news and culture website that today attracts 8 million users a month. The Next Web’s two annual conferences in New York City and Amsterdam now draw over 20,000 attendees.

The Next Web’s marketing team uses promotional messages within articles on thenextweb.com to drive potential attendees to the conference website and sell tickets. To find out which combination of messages works best, they used Google Optimize 360, an integrated part of the Google Analytics 360 Suite.


"We want more people to read content on thenextweb.com as a first step," says Martijn Scheijbeler, who leads the marketing team's efforts. "If we can convince them to become a loyal user, we can try to interest them in different opportunities. In the end, we’d like them to join us at one of our events to experience what The Next Web is really about." 

With one of its conferences coming up, The Next Web's marketing team wanted to compare different headlines and descriptions to see which combination would drive more readers to its conference page. Using Optimize 360, The Next Web team ran a multivariate experiment to discover the combinations that worked best.


For The Next Web, the results were clear: The "All Killer, No Filler" headline with the "This one's different, trust us" description was the winner. During the experiment it performed 26.5% better than the original headline and description, with a 100% probability to beat baseline.

Today The Next Web team tests and optimizes its conference messages day by day. Better messaging means more traffic to The Next Web conference site, and that means more attendees. It also gives the marketing team extra wins like higher awareness and more newsletter signups.

“Optimize 360 and Analytics 360 make testing easy for us,” Martijn says. “They give us much better insights into how many clicks we’re getting for each message. We’re reaching more people who want to come to our conferences, and those better results are going right to our bottom line.”


For more, read the full case study with The Next Web.


What does a good website test look like? The essential elements of testing

"Test! Test! Test!" We've all heard this advice for building a better website. Testing is the heart of creating a culture of growth ― a culture where everyone on your team is ready to gather and act on data to make the customer experience better day by day.

But how do you run a good test? Is it just a matter of finding something you're not sure about and switching it around, like changing a blue "Buy now" button for a red one? It depends: Did you decide to test that button based on analytics, or was it a wild guess?

Assuming the former, a good test also means that even if it fails, you’ve still learned something. A bad test may leave your website performing worse than before, and it’s worse still if you don’t carry those learnings into future tests.

The key to running good tests is to establish a testing framework that fits your company.

Join us for a live webinar on Thursday, March 9, as Krista Seiden, Google Analytics Advocate, and Jesse Nichols, Head of Growth at Nest, share a six-step framework for testing and building better websites.

Frameworks vary from business to business, but most include three key ideas:

Start with an insight and a hypothesis.
A random "I wonder what would happen if …" is not a great start for a successful test. A better way to start is by reviewing your data. Look for things that stand out: things that are working unusually well or unusually badly.

Once you have an insight in hand, develop a hypothesis about it: Why is that element performing so well (or so badly)? What is the experience of users as they encounter it? If it's good, how might you replicate it elsewhere? If it's bad, how might you improve it? This hypothesis is the starting point of your test.

For example, if you notice that your mobile conversion rate is lower than your desktop rate, you might run tests to improve the mobile shopping or checkout experience. The team at The Motley Fool found that email campaigns were successfully driving visitors to the newsletter order page, but the conversions weren't following. That led them to experiment with ways to streamline the user experience.

Come up with a lot of small ideas.
Think about all the ways you could test your hypothesis. Be small-c creative: You don't have to re-invent the call-to-action button, for instance, but you should be willing to test some new ideas that are bold or unusual. Switching your call-to-action text from "Sign up now" to "Sign up today" may be worth testing, but experimenting with "Give us a try" may give you a broader perspective.

When in doubt, keep it simple. It's better to start with lots of small incremental tests, not a few massive changes. You'll be surprised how much difference one small tweak can make. (Get inspiration for your experiments here.)

Go for simple and powerful.
You can't test every idea at once. So start with the hypotheses that will be easy to test and make the biggest potential impact. It may take less time and fewer resources to start by testing one CTA button to show incremental improvement in conversion rates. Or, you may consider taking more time to test a new page design.

It may help to think in terms of a speed-versus-impact grid: how quickly can you run each test, and how big a difference could it make? You don't want quiet turtles (slow tests with little impact); you're looking for noisy rabbits (quick tests with big potential impact).
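One lightweight way to apply the grid is to score each candidate test for impact and speed and rank by the combination. The test ideas and 1-5 scores below are hypothetical illustrations, not recommendations from the post:

```python
# Score test ideas on a speed-versus-impact grid and surface the
# "noisy rabbits": quick to run and likely to move the needle.
ideas = [
    {"name": "New CTA button copy",   "impact": 3, "speed": 5},
    {"name": "Full page redesign",    "impact": 5, "speed": 1},
    {"name": "Checkout form cleanup", "impact": 4, "speed": 4},
    {"name": "Footer link reorder",   "impact": 1, "speed": 5},
]

def priority(idea):
    # Favor ideas that are both fast to test and high impact.
    return idea["impact"] * idea["speed"]

for idea in sorted(ideas, key=priority, reverse=True):
    print(f'{idea["name"]}: {priority(idea)}')
```

Any scoring scheme works as long as it's applied consistently; the point is to make the trade-off explicit rather than testing whatever comes to mind first.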


The best place to begin a rabbit hunt is close to the end of your user flow. "Start testing near the conversion point if you can," says Jesse Nichols, Head of Growth at Nest. "The further you go from the conversion point, the harder it gets to have a test that really rocks — where the ripple effect can carry all the way through to impact the conversion rate."

Stick with it.
A final key: Test in a regular and repeatable way. Establish an approach and use it every time, so you can make apples-to-apples comparisons of results and learn as you go.

A clear and sturdy framework like this will go a long way toward making your team comfortable with testing — and keeping them on the right track as they do.

Download the eBook How to Build a Culture of Growth to learn more about best practices for testing and optimization.

Why Building a Culture of Optimization Improves the Customer Experience

How can we be more useful to our customers today?

That's the simple question that drives any marketing organization focused on testing, improvement, and growth.

But answering the question is not always so simple in our data-rich world. The old challenge of gathering enough data has been replaced by a new one: gleaning insights from the mountains of data we’ve collected — and taking action.

In response to this flood of data, many of today's most successful businesses have turned to a new approach: building what's called a culture of growth and optimization.

This growth-minded culture is one where everyone is ready to:
  • Test everything 
  • Value data over opinion 
  • Keep testing and learning, even from failures 
Most companies have a few people who are optimizers by nature, interest, or experience. Some may even have a “growth team.” But what really moves the dial is when everyone in the company is on board and embraces the importance of testing, measuring, and improving the customer experience across all touchpoints.
"We refuse to believe that our customers’ experiences should be limited by our resources." - Andrew Duffle, Director of Analytics, APMEX
Why should marketers care?
Because they'll be leading the revolution. According to a recent survey from the Economist Intelligence Unit, 86% of CMOs and senior marketing executives believe they will own the end-to-end customer experience by 2020.1 And a culture of growth and optimization offers an excellent path to major gains in those experiences.

As testing and optimization prove their value, they tend to attract greater investments of support, talent, and resources. The payoff arrives in the form of more visitors, more sales, happier customers, and a healthier bottom line.

If you're curious about building a culture of optimization in your marketing organization, register for our Nov. 10 webinar, Get Better Every Day: Build a Marketing Culture of Testing and Optimization.

This webinar will cover:
  • The critical elements of a culture of optimization 
  • Tips for building that culture in your own company 
  • A case study discussion with Andrew Duffle, Director of Analytics at APMEX, a retailer that boosted revenues with continuous testing and optimization 
This kind of culture doesn't happen by command, but it is simple to start building.

We look forward to sharing tips on how you can get started. Happy optimizing!


  1. The Economist Intelligence Unit, "The Path to 2020: Marketers Seize the Customer Experience." Survey and a series of in-depth interviews with senior executives. Survey base: 499 CMOs and senior marketing executives, global, 2016.