May 28, 2020

Most CRO advice is misleading and here is the data to prove it


Vantages

GO tapped into CRED, the world's largest database of experimentation results, to see how the free advice performed

The CEO of the British grocery retailer Ocado said he thought the business was being hacked as its servers struggled to deal with the amount of traffic in March this year. But rather than a malicious attack, it was swathes of customers all rushing to place food orders as news of Covid-19 hit. By the time U.K. Prime Minister Boris Johnson had finished his announcement of a national lockdown, Ocado had booked three weeks’ worth of delivery slots in just under one hour.1

This story of rapidly changing consumer behavior has played out over the world, driven by imposed lockdowns and reaction to Covid-19. But it’s a tale of two halves. While some sectors have boomed, others are trying to stave off the very real risk of financial armageddon. Both scenarios have led organizations to grapple with how best to make improvements to their digital experiences – either to capitalize on the large influx of customers or to eke out every last potential sale.

Where does one start? A quick Google search on “easy CRO wins” reveals 940,000 results. Is all the free advice right? 

We tapped into GO’s centralized repository of experimentation data (CRED), which consists of over 33,075,792,035 data points from past experiments, to see how the advice stacks up against the data.2 The bottom line: some of it has merit; the rest is so contextual that it could be worthless to your organization.

While many organizations just need to start making customer-centric changes today, it’s only when digital experimentation is connected to an executive-approved business purpose that customer insights yield transformative revenue growth.

Looking for a win to bolster your program? Here’s what we found works.

Companies should start testing at the bottom of the funnel, right?

Wrong. The data says no. The higher up the funnel, the higher the win rate.

CRO practitioners often argue about where it’s best to focus experimentation: the top or the bottom of the funnel. To put the argument to rest, CRED data clearly says the top is best when looking for a quick win.

CRED found that tests on carts and checkout pages generated a significant winner only 26% of the time, well below the 32.2% average win rate.

Companies get a slightly better win rate of 34.4% with experiments on product detail pages, and 41.8% on product listing pages.

Generally, the further up the funnel you go, the more likely you are to get a winning outcome.

Graphic showing experimentation win rates by page type
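To make win rates like these concrete, here is a minimal sketch of how a single A/B test is commonly judged a “significant winner”: a two-proportion z-test comparing the variation’s conversion rate with the control’s. This is a standard textbook approach, not a description of CRED’s own methodology, and the traffic and conversion figures are invented for illustration.

```python
# Minimal sketch: deciding whether an A/B test variation is a "significant winner"
# using a one-sided two-proportion z-test. Illustrative only; not CRED's methodology.
from math import sqrt
from statistics import NormalDist

def is_significant_winner(control_visitors, control_conversions,
                          variant_visitors, variant_conversions,
                          alpha=0.05):
    """Return (lift, p_value, winner) for the variation vs. the control."""
    p_c = control_conversions / control_visitors
    p_v = variant_conversions / variant_visitors
    # Pooled conversion rate under the null hypothesis of "no difference"
    p_pool = (control_conversions + variant_conversions) / (control_visitors + variant_visitors)
    se = sqrt(p_pool * (1 - p_pool) * (1 / control_visitors + 1 / variant_visitors))
    z = (p_v - p_c) / se
    # One-sided test: is the variation better than the control?
    p_value = 1 - NormalDist().cdf(z)
    lift = (p_v - p_c) / p_c
    return lift, p_value, p_value < alpha

# Example: a checkout-page test with invented traffic numbers
lift, p, winner = is_significant_winner(20_000, 600, 20_000, 660)
print(f"lift={lift:.1%}, p={p:.3f}, significant winner={winner}")
```

A “win rate” such as the 26% or 41.8% above is then simply the share of tests at a given funnel stage that clear a significance threshold of this kind.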

The top of the funnel is where most consumers land on your site. It’s where you build a consumer’s motivation to take further action along the journey. “If it’s difficult to navigate or it’s hard to find an attractive product at this point, customers will drop off early,” says Alex Mason, an experimentation strategist at Widerfunnel.

In one experiment with a national sporting goods retailer, Widerfunnel added in-depth quick links to the site’s homepage. Strategists hypothesized that showing customers a menu of products early in their journey would improve browsability and lead to a higher conversion rate. “Not only did this result in a significant increase in engagement, but dramatic lifts in visits all the way down the funnel to transactions,” says Mason.

Two versions of a website's homepage navigation
In a winning experiment, the experimentation agency Widerfunnel introduced quick links early in the customer journey, e.g. on the homepage, of a sporting goods retailer’s website. The winning variation led to a dramatic, positive business impact. (Mobile versions not shown.)

Up-sell and cross-sell tests yield quick wins. Correct?

The data says yes, especially at checkout.

Suggesting to a customer that a pair of matching earrings will really set off ‘the look’ with the necklace they’re purchasing is a good call. CRED backs this up: cross-sell and up-sell experiments perform at a 50% win rate.

Manuel Heeg, a CRED data scientist at GO Group Digital, found something interesting in the data. If you are experimenting with up-sells and cross-sells, you’re more likely to get a winning outcome if the test sits between the cart and checkout, such as a pop-up after ‘add to cart’, rather than on the product detail page, where you might typically see these features.

Heeg suggests that to find the quick wins you’ll need to consider the customers’ internal motivations: “If a customer has added a product to their cart, they are showing a high motivation to purchase and thus are more likely to respond positively to cross-sells at this point in their journey. If a customer is still browsing products, then they have a lower motivation, so the addition of cross-sell products might trigger the paradox of choice – where customers get turned off by too many options,” says Heeg.

What’s important, if you’re looking at up-sells, is to consider the price of the add-ons. “You ideally want any up-sells to be not that much more expensive – relative to the cost of what’s already in a consumer’s basket. This is because your customers will be using the cost of what’s already in their cart as an ‘anchor’ or frame of reference to decide if it’s a good value or not,” says Heeg.

But don’t use price as the only consideration, warns Heeg. Ocado, for example, recommends groceries that complement items already in the customer’s basket. “It’s this kind of customer-centric thinking that delivers wins,” explains Heeg. 
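As an illustration of the anchoring rule of thumb Heeg describes, the sketch below filters up-sell candidates by their price relative to the current cart total and keeps the list short. The function name, the 30% threshold, the item fields and the limit are assumptions made for this example, not figures or logic from CRED.

```python
# Illustrative sketch of the price-anchoring rule of thumb: only propose up-sells
# that are cheap relative to what is already in the cart. The 30% threshold,
# function names and data shapes are assumptions, not CRED guidance.
from typing import Dict, List

def pick_upsells(cart_items: List[Dict], candidates: List[Dict],
                 max_relative_price: float = 0.30, limit: int = 3) -> List[Dict]:
    """Return up to `limit` up-sell candidates priced well below the cart total."""
    cart_total = sum(item["price"] for item in cart_items)  # the customer's price anchor
    affordable = [c for c in candidates if c["price"] <= max_relative_price * cart_total]
    # Keeping the list short avoids the "paradox of choice" Heeg warns about.
    return sorted(affordable, key=lambda c: c["price"])[:limit]

cart = [{"sku": "necklace-01", "price": 120.0}]
candidates = [
    {"sku": "earrings-02", "price": 35.0},
    {"sku": "bracelet-03", "price": 95.0},
    {"sku": "ring-04", "price": 15.0},
]
print(pick_upsells(cart, candidates))  # -> the ring and the earrings, not the bracelet
```

As Heeg notes, price shouldn’t be the only consideration; a real implementation would also score candidates on how well they complement what’s already in the basket.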

Where cross and up-sell tests win best

Average win rate at cart and/or checkout: 50%
Average win rate on the product detail page: 20%

CRED (April 2020). Global e-commerce retail business models. Rates vary for subscription and lead-generation.

If you want a win, then add social proof. 

Not true. The value of social proof is contextual. 

You’ll often see CRO articles and blogs touting social proof as the obvious psychological tool to produce a quick win. Following advice like this can cost your business dearly.

“We ran a social proof experiment for a financial services firm and found that adding social proof to an email signup actually decreased signups by 11.2%. We probed the data and discovered that by adding the proof at such an early stage in the customer journey, it implied that everyone already knew about ‘this great stock tip’,” says Mason from Widerfunnel. “That target audience wanted to find stock tips before anyone else; hence, the test loss.”

Yet, in another experiment for the same financial services firm, when Widerfunnel tested removing social proof from the order page, transactions dropped. “We have to remember that social proof is contextual,” says Mason. “In earlier stages of the journey, customers were not yet in a purchase state of mind and craved exclusivity. But when the customer was exposed to a purchase decision, they responded positively to social proof that reduced anxiety and increased trust and confidence in their decision.” 

This highly contextual nature of social proof is the likely reason our research found that tests relying on this psychological pattern are not very successful. Other principles of persuasion seem to be less affected by context and deliver higher win rates.

Win rate by psychological principle

Offering something for ‘free’: 58.3%
Reciprocity: 58%
Clarity: 50%
Trust and security: 46%

CRED (April 2020). Global data from all business models.

There are some great examples of these principles in practice right now. Take grocery retailers, who are using reciprocity by offering priority slots to the elderly and vulnerable, building customers’ desire to return the positive action with another positive action, such as brand loyalty. Or consider Covid-19 messaging around contactless deliveries, which builds trust.

Many of these principles add value to your customers at no cost to your business.

Is testing your value proposition going to get you positive results?  

Yes, but product information tests have a higher win rate.

Our data confirmed that tests of unique value propositions (UVPs), such as showcasing free delivery or highlighting award-winning customer service, are likely to be winners. Besides winning, these sorts of tests can take little effort to implement. Our data also found that optimizing your product information and CTAs could be even more effective.

Where copy testing delivers results

Average win rate for unique value proposition (UVP) tests: 45.7%
Average win rate for call to action (CTA) tests: 47.9%
Average win rate for product information tests: 48.3%

CRED (April 2020). Global data from all business models.

Despite the acronym, your UVPs don’t always need to be unique. Stephen Pavlovich, CEO of Conversion, points out: “When testing your UVP, make sure that you prove it. Anyone can say that they have great customer service, prices, delivery… But how would you prove it?” Pavlovich adds, “We worked with a furniture retailer and found that delivery was a priority for customers. Not just delivery cost and speed, but also how the goods were handled, where they’d be dropped off, and how careful the couriers would be in the customer’s home. So we filled the delivery page with details about how the company had tested multiple couriers to find the ones who were most reliable, then explained every detail of the delivery process to reassure the customer. The test was a huge success.”

If consumers aren’t seeing the right value proposition at the right point in the journey, they may break away from the purchase flow to hunt down the information. It’s best to test different messaging at different points in the journey, considering the mindset of the customer at each stage, says Pavlovich.

Is it worth fixing bad usability?

The data says yes. 

This is the best quick win out of all the best-practice advice we reviewed: improving usability shows a 60% success rate of generating a winning result. In most cases, however, you don’t even need a test. Before you dive headfirst into full heuristic evaluations, know that the best-performing usability quick wins come from fixing major issues, not slight annoyances or tweaks that only UX experts would pick up on.

If you have a problem or something is broken on your site that is preventing customers from completing any of their main goals, fix it now. It’s much more likely to have a positive impact on conversion than any other optimization tactic. 

Customer insights, not “winning,” should be the byproduct of a great experimentation strategy

When you speak with fast-growth companies, they do not talk about winners or losers in terms of testing; instead, they talk about how well and how broadly they can discover and act on customer insights across their entire organization.

To build executive buy-in, most teams still “need to win” to prove their worth, especially during disruptive times such as the Covid-19 crisis. If you’re in this position, consider working your way down this abridged list of data-backed CRO guidance. The last thing anyone can afford now is not to experiment with customer-centric, data-driven business evolutions, i.e. tests. As Tim Steiner, CEO of Ocado, says, “The biggest risk in business is not taking enough risk. Most companies are too frightened, but if you stand still it is a certainty you are going to be extinct.”3

Compare your company’s experimentation performance against GO Group Digital’s CRED benchmarks. Get in touch. We’re happy to share benchmarks on lift, ROI and more per industry, business model and country.

  1. This Is Money UK (March 19, 2020) Hacking fear as Ocado orders soar
  2. GO Group Digital (May 26, 2020) Global e-commerce query
  3. This Is Money UK (September 2019) It was a gamble. I had no idea how hard it would be.

About the GO Group

The world’s leading experimentation experts build global experimentation programs and solutions for the GO Group. Contact the Group to learn how its experience and international setup can build or revitalize your experimentation program. The GO Group operates at the intersection of consultancy and conversion, enabling its enterprise clients to unlock business growth and value through the power of experimentation.

Learn more about the GO Group at www.gogroupdigital.com

Copyright © 2019 GO Group Digital. All rights reserved.
