In this session, Shiva shares insights from his experience of running conversion rate optimization programs for the past several years. He talks about collaboration, how you can navigate the politics of experimentation, testing to learn and not win, and much more.
CRO Webinar: What You're Doing Wrong in Your CRO Program (shareable version)
1. What You're Doing Wrong
In Your CRO Program
Shiva Manjunath, CRO Guru
2. Hi!
I’m Shiva
I’m a CRO Program Manager at Gartner
I’ve been running CRO programs for 5+ years
across B2B and B2C
I make CRO memes to open conversations
about experimentation
3. What I want to talk about
I want to talk to you about not just running
experiments, but optimizing your CRO
program for maximum efficiency
4. Agenda
- Value of Experimentation
- Collaboration
- Test to Learn
- Always Be Testing
- Data Collection
- Politics
9. Value of Experimentation
- It’s not BCO (Button Click Optimization)
- It’s not always ‘Conversion Rate Optimization’
- It’s experimentation to optimize the customer journey
- I call it Customer Experience Optimization
- Yes. I’m a CEO
- It’s risk mitigation
11. Experiments Are Complicated To Run
- The visual editor over-simplifies the
experimentation process
- Experimentation involves… a lot
- Marketing/Brand approval
- Design
- Research
- ...
13. Collaboration
- The whole is greater than the sum of the parts
- It’s a mutualistic relationship (both benefit)
- You need them just as much as they need you
15. Collaboration With UX
- UX research gives you valuable data
- Experimentation is data (when you test to learn)
- Tips for Collaboration:
- Weekly syncs to loop them in on test updates,
and they update you on UX projects
- Use UX research as data for your experiments
- Validate UX by running their designs through
experiments
17. Collaboration With Engineers
- Mitigate risk in experiment/code rollout overlaps
- Run more compelling tests client side
- Tips for Collaboration:
- Weekly syncs to loop them in on test updates,
and they update you on dev projects
- Identify potential overlap with code
rollout/experiment breakages
- Identify more creative ways to run more
interesting tests based on your testing roadmap
18. Collaboration With Brand/Marketing
- CRO can verify (and support) marketing efforts
- CRO needs to be within brand standards to
create a unified customer journey
- Tips for Collaboration:
- Bi-weekly syncs to loop them in on test
updates, and they update you on
marketing efforts
- Partner to identify ways to ‘test to learn’
more about audience to feed data into
their marketing efforts
20. “I have not failed. I've just
found 10,000 ways that won't
work.”
—Thomas A. Edison
21. “Test to learn, not test to win. If
you’re testing to learn, you will
win far more than if you just
test to win”
—Shiva Manjunath
22. What Does ‘Test to Learn’ Mean?
- It means never ‘losing’
- You just paid for a learning
- It means actually winning more
- Compounding learnings = higher chance of
winning
- It means having specific winning concepts to
communicate to brand/UX
- Iterating off these learnings is far easier due to the
learnings achieved from each test
23. Test to Learn vs Test to Win

Test to Win
Hypothesis: We believe a video of our product in use is valuable for our
users in the middle of the product page. We believe the video will be most
helpful by putting the video in the middle of the product page.

Test to Learn
Hypothesis: We believe a video of our product in use is valuable for our
users in the middle of the product page. We don’t even know if the video is
engaging to our users - we will put the video above the fold on the
product page.
24. Test to Learn vs Test to Win

Test to Learn
Hypothesis: We believe a video of our product in use is valuable for our
users in the middle of the product page. We don’t even know if the video is
engaging to our users - we will put the video above the fold on the
product page.

This is better because:
- You will learn very quickly how your audience reacts to the video (good or bad)
- You can iterate from here!
25. How Do You Start ‘Testing to Learn’?
- Have strong hypotheses focused on ‘learning’
rather than winning
- Don’t be afraid to ‘lose’
- Iterate, iterate, iterate!
- Caveat: That doesn’t mean every test you run is
test to learn
- It’s a reframe that you should try to ‘learn’
with every test you can!
27. 1 in 7 A/B tests is a winning test
That stinks.
28. Always Be Testing
- If 1 in 7 tests ‘win’, it’s a numbers game to hit winners!
- You must balance quality with quantity of tests
- Bad: 50,000 button color tests
- Also Bad: 1 really cool, disruptive landing page test
- But it took you 6 months to build it.
- Also, it loses.
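The "numbers game" point can be made concrete. A minimal sketch, assuming each test is an independent trial with a 1-in-7 win rate (real tests are not independent coin flips, so treat this as intuition, not a forecast):

```python
# If roughly 1 in 7 A/B tests wins, the chance that a batch of n
# independent tests produces at least one winner is 1 - (6/7)^n.
# Independence is a simplifying assumption for illustration.

def chance_of_a_winner(n_tests: int, win_rate: float = 1 / 7) -> float:
    """Probability that at least one of n_tests 'wins'."""
    return 1 - (1 - win_rate) ** n_tests

for n in (1, 7, 20):
    print(f"{n:>2} tests -> {chance_of_a_winner(n):.0%} chance of at least one winner")
```

Even at a 1-in-7 hit rate, running 20 tests gives you roughly a 95% chance of at least one winner, which is why volume (balanced with quality) matters.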
29. Balance Quality vs. Quantity of Tests
- Big, disruptive tests will require more time to dev/design
- Simpler tests can run quicker, and you learn quicker
30. Always Be Testing
- If 1 in 7 tests ‘win’, it’s a numbers game to hit winners!
- You must balance quality with quantity of tests - with Parallelism
- Bad: 50,000 button color tests
- Also Bad: 1 really cool, disruptive landing page test
- But it took you 6 months to build it.
- Also, it loses.
31. Parallelism
- Parallelism means concurrently:
- Building tests
- Running tests
- Analyzing past tests
(Diagram: Building Test → Test Running → Analyzing Results, with each test starting only after the previous one fully finishes - this is bad)
32. Parallelism
- Parallelism means concurrently:
- Building tests
- Running tests
- Analyzing past tests
(Diagram: several tests staggered so that while one test runs, the next is being built and the previous is being analyzed - this is parallelism!)
33. Parallelism
(Diagram: the sequential and staggered pipelines side by side - double the amount of tests run in the same window with parallelism!)
35. Let’s do a quick thought exercise

Snapshot of Coronavirus Data in 2020

Country     Total Cases   New Cases   Total Deaths   New Deaths
Indonesia   2491          +218        209            +11
Thailand    2220          +51         26             +3
Serbia      2200          +292        58             +7
Finland     2176          +249        27             -1
México      2143          +253        94             +15
UAE         2076          +277        11             +1
36. Let’s do a quick thought exercise

Snapshot of Coronavirus Data in 2020

Country     Total Cases   New Cases   Total Deaths   New Deaths
Indonesia   2491          +218        209            +11
Thailand    2220          +51         26             +3
Serbia      2200          +292        58             +7
Finland     2176          +249        27             -1
Mexico      2143          +253        94             +15
UAE         2076          +277        11             +1

ZOMBIES!!!!!
37. Defining Your Metrics
- Your interpretation of the data hinges on your
understanding of how the metrics are set up
- Do you know how your metrics are defined/set up?
- Are your metrics working the way you are intending
them to work?
- Regular audits to ensure they’re still working
properly
- Collaboration with engineering
38. Look At The Right Metrics
- CRO ≠ Fixation on Conversion Rate
- Ensure you’re looking at lifetime value metrics
- e.g. AOV, repeat purchases, etc.
- Microconversions are important!
39. Look At The Right Metrics - Microconversions
- Microconversions (e.g. visit to product page, add to
cart, etc.)
- Help you define ‘behavior’
- Macroconversions (e.g. purchase complete)
- Drive $$$
- Microconversions help tell you why something
happened
- Define microconversions before test launch!
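One way to make "microconversions tell you why" concrete is to compute step-by-step rates alongside the macro rate. The event names and counts below are hypothetical:

```python
# Hypothetical funnel counts for one variant; the event names and
# numbers are illustrative, not real data.
funnel = ["visit_product_page", "add_to_cart", "purchase_complete"]
counts = {"visit_product_page": 1000, "add_to_cart": 300, "purchase_complete": 90}

# Microconversion rates (each step relative to the previous one) show
# *where* behavior changed; the macro rate alone only shows *that* it did.
for prev, step in zip(funnel, funnel[1:]):
    rate = counts[step] / counts[prev]
    print(f"{prev} -> {step}: {rate:.0%}")

macro_rate = counts[funnel[-1]] / counts[funnel[0]]
print(f"macro conversion: {macro_rate:.0%}")  # 9%
```

If a test moves the macro rate, these step rates tell you which stage of the journey drove the change - which is exactly why they need to be defined before launch.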
40. Look At The Right Audiences
- Make sure you’re segmenting your audiences - not everyone needs the same
exact experience
- Example segments:
- Desktop vs. Mobile
- Source (Bing vs. Google, Paid Search vs. Direct traffic)
- Current vs. New customers
- Specific Categories
- Experiment-specific segments
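Segment-level readouts like the ones above can be computed directly from raw test data. A minimal sketch with made-up visitor records (in practice these would come from your analytics export):

```python
from collections import defaultdict

# Made-up per-visitor records: (segment, converted?). Illustrative only.
visitors = [
    ("desktop", True), ("desktop", False), ("desktop", True), ("desktop", False),
    ("mobile", False), ("mobile", False), ("mobile", True), ("mobile", False),
]

totals = defaultdict(int)
wins = defaultdict(int)
for segment, converted in visitors:
    totals[segment] += 1
    wins[segment] += converted

# A single overall rate can hide opposite behavior across segments.
for segment in totals:
    print(f"{segment}: {wins[segment] / totals[segment]:.0%} conversion")
```

A flat overall result can mask a win on desktop and a loss on mobile; segmenting first is how you catch that.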
41. Supplement Your Experiment Data
- Qualitative data is helpful to supplement quantitative data
- Heatmaps to confirm a newly added element is actually getting attention
- Running a survey during the experiment
- This is why collaboration with UX is so critical!
43. Playing ‘Nicely’ As The Experimenter
- Hearing all test ideas!
- Understand the ‘why’
- No test idea is a bad test idea
- Transparency into CRO program to foster
participation
- Bringing excitement/gamification to CRO
44. How to Defeat A HiPPO
- HiPPO: Highest Paid Person’s Opinion
- Strategies to Vanquish the HiPPO
- Loop HiPPOs early into the test build/design
process
- Is it the results? Or the design?
- Understand their POV
- Identify what they’re looking to solve (e.g. a test
which lifts AOV when they want to improve CVR)
46. Wrap Up
Value of Experimentation: Make sure you’re communicating the value of
experimentation regularly!

Check Your Data: Your data can be your friend, or your worst enemy. Make
sure it’s accurate, and make sure you know exactly what you’re tracking.
Track microconversions!

Test to Learn: Position your tests to put you in the best position to
learn. Learnings are your path to winning. Plus, learnings are valuable
outside of the experimentation function.
47. Does anyone have any questions?
shivamanjunath91@gmail.com
Find me on LinkedIn - Shiva Manjunath!
Thanks!
48. What You're Doing Wrong
In Your CRO Program
Shiva Manjunath, CRO Guru