Ever wonder what the conversion rate optimization veterans have learned from spending years in the trenches? Well, you’re about to find out. We reached out to our friends, a group of world-renowned CRO and testing experts, and asked them about the most important lessons from their optimization experience. Here’s what they had to say:
Start with users and what they’re trying to do
“People who work on web sites tend to think about ‘web sites’ and ‘pages,’ but that’s backwards. We need to think about ‘people’ and ‘tasks.’
A lot of that comes from old-school SEO – thoughts like ‘this page is targeting the wrong keyword,’ or an ‘exact match domain’ isn’t pulling as much search weight as it used to. SEO in 2015 isn’t really like that anymore, but even if it were, it would still be a bad approach to web sites.
You need to start with users, and what they’re trying to do. If your questions start with what you can do for users and their tasks, you’re more likely to get pages and web sites right. If you start with pages, you’ll only ever be treating the symptom, not the disease.”
– Tim Ash | CEO, SiteTuners | Follow Tim on Twitter
Test strategies instead of elements
“One of the biggest challenges marketers face while optimizing is not being able to analyze and understand the results of their tests. The reason for this lies in the way people run tests. When you run tests on call-to-action buttons, titles, or other isolated elements on a website, it’s hard to break down the results and learn from them.
Even more importantly, a finding like “a blue call to action beat a red one” does nothing to help you scale the results and apply them to other parts of your business. To really learn from tests and scale the results, it’s crucial to test strategies. Each landing page, for example, should have its own strategy based on your target audience – different messaging, colors, and structure. Doing this will ensure that once a variation wins, you’ll know WHY it won – which strategy works better and what you should optimize next.
Bottom line: Don’t test things automatically and without a plan – test concepts, make big changes, and dare. This will ensure you increase your conversion rates not just once, but continuously.”
– Talia Wolf | CEO, Conversioner | Follow Talia on Twitter
CRO is a process, not a list of tactics
“People have read too many “I changed this one word and got a 4564% uplift” blog posts and tend to think optimization is just applying tactics – which is wrong.
Optimization is an ongoing, iterative process. The biggest part of it is discovering what matters. Tinkering with X might yield great results, while Y doesn’t matter at all. So how do you tell the elements that matter apart from the rest? And once you identify what matters, what do you change them to, and what do you test? That is the science of CRO.
It’s about asking the right questions, seeking out meaningful data (qualitative and quantitative) that might answer those questions and lead to insights which you can turn into test hypotheses. It’s not about button colors or magic words. It’s about the right process.”
– Peep Laja | Founder & Chief Conversion Architect, ConversionXL | Follow Peep on Twitter
Always accompany “What” with “Why”
“From headlines to CTAs to analytics reports, people generally care little about the “what” if the “why” remains unclear. If you want to continuously improve, it’s imperative to understand and harness the “why” reasoning that successfully drives results. Transform your features into benefits, your actions into value, and your data into insights.
I’ve found this applies not only to web content for products and services, but also to internal process steps, SOWs, and project proposals. “What” paired with “why” naturally paves a story that facilitates decision making for your customer, your team, and your business.”
– Angie Schottmuller | Director of Optimization, Three Deep Marketing | Follow Angie on Twitter
There is no such thing as a standard testing program
“The most important lesson we’ve learned from 10 years of helping clients optimize their web and mobile sites and apps is that there is no such thing as a standard testing program. Every company is different – its goals, strategies, tactics, hypotheses, audiences, product mix, etc.
That means the results seen by one client in one industry do not necessarily apply to other clients in other industries. But that’s okay as long as clients continue to “test-learn-repeat,” so that they learn what works for them specifically. This can be a hard lesson to learn for companies wanting quick answers, fast results, and impressive wins – but it’s an important thing to know.
Lots of campaigns aren’t going to yield positive results, and the significant thing is that even if you don’t gain huge leaps in conversion, you might learn something that mitigates risk, yields an incremental gain, or simply keeps you from making a huge mistake. We consider those to be wins. What we like to say at SiteSpect is that the only failure in testing is a campaign in which you don’t learn anything. So keep on testing! Like most things in life, it’s not a “set it and forget it” activity – you’ve got to keep at it, keep improving, and keep learning. Good luck!”
– Eric Hansen | CEO, SiteSpect | Follow Eric on Twitter
You have to remove your own biases and opinions
“There are no shortcuts! You can’t just look at a landing page or a site and know how to optimize it, even if you know all the conversion principles. Sometimes variations lose even though they’re applying elements that normally increase conversions. This is because industries, target markets, and individual businesses are all different. What wins in 99% of tests is not guaranteed to win in yours.
You have to remove your own biases and opinions on how you like to browse or what you think is aesthetically pleasing, and delve into the minds of your target market to understand what makes THEM act. Then you have to analyze how they’re behaving on the site using multiple analytics tools. You also have to consider how they came to your site; was it from an ad? Organic search? Social media, email, or another site? People have different expectations coming from each channel and you have to make sure your landing page delivers on the promise made in the originating source.
You need a solid understanding of conversion theory, too. Use that, along with your knowledge of your target market and data from analytics, to create hypotheses and run meaningful tests. Some tests win and some lose. You need to learn from both and apply that learning to your next round of tests. Optimization is an ongoing process.
Act on data, not opinion. And take the time to do it right – it WILL pay off!”
– Theresa Baiocco | Conversion Strategist, ConversionMax | Follow Theresa on Twitter
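Acting on data rather than opinion means checking whether a “winning” variation is statistically meaningful before declaring it the winner. As a minimal sketch (not part of any tool or method mentioned above), a two-proportion z-test on raw conversion counts could look like this; the sample numbers are made up for illustration:

```python
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates.

    conv_a / conv_b: conversion counts for control and variation.
    n_a / n_b: visitor counts for control and variation.
    Returns the z statistic and the two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both rates are equal.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical test: 5.0% vs 6.25% conversion on 4,000 visitors each.
z, p = two_proportion_z_test(200, 4000, 250, 4000)
print(f"z = {z:.2f}, p = {p:.4f}")  # z ≈ 2.43, p ≈ 0.015 → significant at 5%
```

A result like this still deserves the scrutiny the experts above describe – segment it, check it against a full buying cycle, and ask why it won before scaling it.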
Couple persuasion psychology with strong analytics
“The deadliest sin in optimizing websites is not testing at all. This is obvious. You don’t want to be a sinner, so you start testing. The button, the color, the headline, the button again. Now you’ve committed the second deadliest sin in optimization – shotgun testing. What do you test? Well, what the competitor tested, what the CEO liked, what you read in some blog post, and so on. Most likely you will end up with what we call “The Testing Blues”: disappointment, the blame game, scaling down – “this is not for us.”
Or you scale up. You invest in tools, people, and organization – whatever it takes to get a BIG testing machine in place. And you set KPIs for your testing, the most important one being “number of tests per month.” Someone higher up has decided that “THIS IS IMPORTANT – WE MUST BE TESTING!” So people further down the chain gear up to answer their question: “Are we testing? Yes we are. Look, we did 9 tests last month.”
I call this alibi testing – the third deadliest sin in optimization. You are just going through the testing motions. What both shotgun and alibi testing lack is a structured process for discovering and prioritizing tests. You have underinvested in your organization’s capability to find and decide WHAT to test. This is the cardinal sin of testing.
To fix it you need Ton and Bart. At the heart of one of the most successful testing organizations in the world – Online Dialogue in the Netherlands – you will find a web analytics ninja, Ton, and a passionate psychologist and “Chief Persuasion Officer,” Bart. Go find your own Tons and Barts. Couple persuasion psychology with strong analytics. Prioritize your tests for maximum business impact and see your testing results skyrocket.”
– John Ekman | Founder, Conversionista | Follow John on Twitter
Test results often lie
“Test results often lie, and it takes running a test properly, then deep-diving into the data, to get at the truth. After a few hundred tests, I concluded that the only viable way to test was with a “window”: only new visitors are included at first, and they are later eliminated after several full-week buying cycles as the test “ran out”.
It took even longer to learn how to factor in cross-device realities and variances between traffic sources and segments. It should take you longer to report out on your test than it does to set it up. Don’t let the lines fool you.”
– Keith Hagen | Co-Founder, ConversionIQ | Follow Keith on Twitter
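Keith’s “window” of new visitors, closed after several full-week buying cycles and then broken down by traffic source or device, can be sketched as a filtering step over session records. The field names, the four-week window, and the sample data below are illustrative assumptions, not his actual setup:

```python
from datetime import date, timedelta

def in_test_window(session, test_start, weeks=4):
    """Include only visitors who were NEW when the test started and who
    entered the test within `weeks` full weeks of the start date."""
    window_end = test_start + timedelta(weeks=weeks)
    is_new = session["first_seen"] >= test_start  # drop returning visitors
    return is_new and test_start <= session["entered_test"] < window_end

def conversion_by_segment(sessions, key):
    """Break windowed results down by a segment key, e.g. 'device' or 'source'."""
    totals = {}
    for s in sessions:
        n, conv = totals.get(s[key], (0, 0))
        totals[s[key]] = (n + 1, conv + s["converted"])
    return {seg: conv / n for seg, (n, conv) in totals.items()}

test_start = date(2015, 3, 1)
sessions = [
    # Returning visitor: first seen before the test started -> excluded.
    {"first_seen": date(2015, 2, 1), "entered_test": date(2015, 3, 2),
     "device": "desktop", "converted": 1},
    {"first_seen": date(2015, 3, 2), "entered_test": date(2015, 3, 2),
     "device": "mobile", "converted": 1},
    {"first_seen": date(2015, 3, 5), "entered_test": date(2015, 3, 5),
     "device": "mobile", "converted": 0},
]
windowed = [s for s in sessions if in_test_window(s, test_start)]
print(conversion_by_segment(windowed, "device"))  # rates from new visitors only
```

Running the same breakdown with a `"source"` key would give the per-channel view Keith describes; the point is that the segmentation happens after the window filter, not on the raw test population.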
Too much attachment won’t help your tests
“As someone who has run over a hundred experiments on our own website (Convert.com) over the last five years, one of the first lessons I am coming to accept is that too much attachment to one variation (or group of variations) doesn’t help. More than once I got emotionally invested in one variation, and it’s hard to see “my” variation – my baby, the one I thought would obviously win – get inconclusive results or even lose completely to another variation.
The second lesson is that service is hard to test. We win most deals on our service levels, and we have not found a great way to A/B test that – neither the wording nor whether it actually matters. It’s just what we keep hearing, but I have not had the heart to cut service for 50% of the trials to find out.”
– Dennis van der Heijden | Founder & CEO, Convert.com | Follow Dennis on Twitter
You need to be really diligent
“Diligence. It’s important to always be testing, and continue to run experiments even if you’re not getting the results you’re hoping for. There will be losers. There will be stalemates. But there will also be winners. And once I get a winner I try to scale the winning idea as quickly as I can.”
– Joe Weller | Senior Marketing Manager, SmartSheet