Webinar: Common Mistakes in A/B Testing
This webinar aims to show you how to avoid the common pitfalls that can affect your conversion rate optimization strategy. At the end of this webinar you will be able to:

1) Use A/B Testing best practices to get the most out of it
2) Identify the common mistakes preemptively
3) Appreciate how A/B Testing fits into your whole conversion rate optimization strategy (based on real cases, such as Swedoffice and Spotify)

Hosted by Simon Dahla from Conversionista Sweden and Martijn Janssen, Partner Manager Benelux & Nordics at Optimizely


  1. A/B-testing Mistakes & Quick Fixes
  2. Your hosts: Simon Dahla, CRO Expert & A/B-testing Ninja (Conversionista), and Martijn Janssen, Partner Manager Benelux & Nordics (Optimizely)
  3. The #1 website optimization platform, delivering the best customer experiences at every touchpoint on the web and in mobile apps
  4. #1 in Scandinavia in online conversion rate optimization, and the only 2-Star Solution Partner in Scandinavia!
  5. Agenda ● Brief overview of A/B-testing ● Common A/B-testing mistakes ● Some customer cases ● Summary ● Q&A
  6. Brief overview of A/B-testing
  7. What is A/B-testing? ➔ An optimization method that involves testing different versions of a web page (or app) ➔ The variations are identical except for a few elements that might affect a user's behavior ➔ Statistical calculations are made to check that the observed effect is not a coincidence
  8. Here's how it works
  9. An A/B-testing tool in a nutshell does three primary things: ● randomly assigns visitors to the different variations (cookies are stored so each visitor keeps seeing the same one) ● keeps track of your KPIs ● delivers a variation by downloading content from the cloud or redirecting the visitor to a different URL
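To make the first point concrete, here is a minimal TypeScript sketch of sticky, cookie-based bucketing. The cookie name, variation list, and 90-day lifetime are assumptions for illustration, not how Optimizely or any particular tool implements it.

```typescript
// Minimal sketch of cookie-based bucketing (illustrative, not a real tool's API).
// Each visitor is assigned a variation once; the cookie keeps the assignment stable.

const VARIATIONS = ["original", "variation_1"] as const;
type Variation = (typeof VARIATIONS)[number];

function readCookie(name: string): string | null {
  const match = document.cookie.match(new RegExp(`(?:^|; )${name}=([^;]*)`));
  return match ? decodeURIComponent(match[1]) : null;
}

function assignVariation(experimentId: string): Variation {
  const cookieName = `ab_${experimentId}`;
  const existing = readCookie(cookieName) as Variation | null;
  if (existing && VARIATIONS.includes(existing)) return existing; // returning visitor

  // New visitor: pick a variation uniformly at random and persist it for 90 days.
  const chosen = VARIATIONS[Math.floor(Math.random() * VARIATIONS.length)];
  document.cookie = `${cookieName}=${encodeURIComponent(chosen)}; max-age=${90 * 24 * 3600}; path=/`;
  return chosen;
}
```

Because the assignment is persisted, a returning visitor always sees the same variation, which keeps the measured KPIs consistent.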
  10. Why should you test? ➔ To learn more about visitor behaviour in order to formulate new hypotheses ➔ To achieve your online goals, e.g. increased sales or more leads
  11. 10 common mistakes
  12. You test everything. Put your findings into buckets instead (red = not suitable for testing): Testing area, where there is an obvious opportunity to shift behaviour, expose insight, or increase the number of conversions, so you test; Just Do It (JFDI), issues where a fix is easy to identify or the change is a no-brainer; Explore, where you need more information to triangulate the problem, so do further digging and gather more data points
  13. No (analytics) integration, which leaves you guessing at: ● troubleshooting tests ● (segmenting results) ● tests that "flip" ● tests that don't make any sense ● broken tests ● what drives the difference
  14. Best-in-class integrations
  15. Your test will finish in 100 years! ★ Use a test duration calculator: ★ https://www.optimizely.com/resources/sample-size-calculator ★ http://apps.conversionista.se/visual-test-duration-calculator/
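The calculators above do this properly; purely as a back-of-the-envelope illustration, this TypeScript sketch implements the classical fixed-horizon sample-size formula for comparing two conversion rates. The significance level (5%, two-sided), power (80%), and example rates are assumptions for the demo.

```typescript
// Rough sample-size estimate per variation for comparing two conversion rates
// (classical fixed-horizon test, alpha = 0.05 two-sided, power = 80%).

function sampleSizePerVariation(baselineRate: number, relativeLift: number): number {
  const zAlpha = 1.96; // two-sided 5% significance
  const zBeta = 0.84;  // 80% power
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + relativeLift);
  const pBar = (p1 + p2) / 2;
  const numerator =
    zAlpha * Math.sqrt(2 * pBar * (1 - pBar)) +
    zBeta * Math.sqrt(p1 * (1 - p1) + p2 * (1 - p2));
  return Math.ceil((numerator / (p2 - p1)) ** 2);
}

// Example: 3% baseline conversion rate, hoping to detect a 10% relative lift.
console.log(sampleSizePerVariation(0.03, 0.10)); // roughly 53,000 visitors per variation
```

At a 3% baseline, detecting a 10% relative lift needs on the order of 50,000 visitors per variation, which is exactly why low-traffic pages produce tests that "finish in 100 years".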
  16. You draw conclusions based on an ongoing test
  17. Optimizely's Stats Engine ● A new way of measuring significance in a dynamic environment. Results: ● make a decision as soon as you see significant results ● test many goals and variations accurately at the same time ● no extra work for experimenters. Traditional statistics vs. Stats Engine: percent of tests with winners or losers declared, 36% vs. 22%; percent of tests with a change in the significance declaration, 37% vs. 4%
  18. Your hypothesis is crap. Use input from: ● segmenting ● customer service ● session replay ● eye tracking ● user testing ● form analytics ● search analysis ● A/B-testing ● web analysis ● competitors ● customer contacts ● surveys
  19. Solution: question your ideas. http://dah.la/hypothesis-creator Read the blog post about how to use the formula: https://conversionista.se/ab-test-hypoteser/
  20. Magine TV: Internet TV Streaming Service
  21. The challenge: more leads without changing the sign-up
  22. The landing page
  23. Scroll map analysis: generates a map based on where the visitors of your website click or scroll
  24. The analysis with Google Analytics: in the funnel visualization reports we found a bigger drop-off between the signup and thank-you pages than between the landing page and the signup page
  25. The Hypothesis (using the hypothesis formula, http://dah.la/hypothesis-creator): since we have observed [a big drop-off between the Signup and the Thank You page] by [analyzing the data in Google Analytics and Crazy Egg], we want to [move up the "Instructions"], which should lead to [more people signing up]. The effect will be measured by [the number of people signing up]
  26. The test (Original vs. Variation). Key changes: move the instructions to the top of the page
  27. Variation 1 outperforms the Original, on both micro conversion goals as well as the macro conversion goal
  28. Your tests are not prioritized. Plot ideas on an opportunity-vs-effort matrix (opportunity high/low on one axis, effort high/low on the other). Opportunity factors to take into consideration: ➔ potential ➔ scale ➔ goal. Effort factors to take into consideration: ➔ complexity ➔ resources ➔ decisions
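One lightweight way to act on this matrix, sketched here in TypeScript purely as an illustration (the 1-10 scoring scale and example backlog are assumptions, not a framework from the talk), is to score each idea and sort by the opportunity-to-effort ratio:

```typescript
// Illustrative prioritization sketch: score each test idea on opportunity
// and effort, then run high-opportunity, low-effort ideas first.

interface TestIdea {
  name: string;
  opportunity: number; // 1-10: potential, scale, goal alignment
  effort: number;      // 1-10: complexity, resources, decisions needed
}

const backlog: TestIdea[] = [
  { name: "Move instructions above the fold", opportunity: 8, effort: 2 },
  { name: "Rebuild the whole checkout", opportunity: 9, effort: 9 },
  { name: "Reword the CTA button", opportunity: 4, effort: 1 },
];

const prioritized = [...backlog].sort(
  (a, b) => b.opportunity / b.effort - a.opportunity / a.effort
);
prioritized.forEach((t) => console.log(t.name));
// Ideas in the matrix's high-opportunity / low-effort corner come out on top.
```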
  29. Spotify: Spotify's Premium Trial Flow
  30. Original: ● Premium Trial page (US) ● high drop-off ● users are asked to provide credit card details to start the premium trial
  31. Input: ● user testing ● a short survey → the data shows that the primary reason not to start the premium trial is that users do not want to give away their credit card details
  32. Test Hypothesis (using the hypothesis formula, http://dah.la/hypothesis-creator): "Since we have observed through DATA ANALYSIS that a large share of those who leave the premium flow do so because they DO NOT WANT TO GIVE AWAY their payment details, we will explain WHY they must provide them, which will lead to more people doing so. We will measure this in the number of purchases."
  33. Hypothesis: "Give the user a reason..." B: "We only use this to verify your account; you won't be charged anything for your trial." C: "We need this because our music deals only allow free trials for users that are credit card or PayPal holders." D: "We need this just in case you decide to stay Premium after your free month."
  34. Test Results, measured from the credit card page to the thank-you page, for variations C ("Because of our music..."), B ("Verify your account…"), D ("If you want to continue...") and A (Original)
  35. You run a "bad" test
  36. Swedoffice: B2B e-commerce site
  37. Original
  38. Solution
  39. Test Results, A/B-test (1): no difference between the Original and the Variation
  40. Why?!
  41. Retake
  42. A/B-test (2): conversions +6%, revenue per visitor +10% (Variation vs. Original)
  43. You don't isolate the variations and end up with no change
  44. Different traffic sources not taken into consideration: maximize ROI on your PPC investment
  45. Optimizely: how Optimizely maximized ROI on their PPC investment
  46. Google Keyword Insertion
  47. Creating Symmetry
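On the ad side, symmetry starts with Google's keyword-insertion syntax, e.g. an ad headline like "Try {KeyWord:A/B Testing} Software". The landing-page side below is a hypothetical TypeScript sketch of echoing that keyword back in the page headline; the utm_term parameter and the "headline" element id are assumptions for the example, not details from the case.

```typescript
// Create symmetry between the search ad and the landing page by mirroring
// the matched keyword in the page headline. Hypothetical: assumes the ad's
// destination URL carries the keyword in a utm_term query parameter.

function mirrorKeywordInHeadline(): void {
  const params = new URLSearchParams(window.location.search);
  const keyword = params.get("utm_term");
  const headline = document.getElementById("headline");
  if (keyword && headline) {
    headline.textContent = `Try ${keyword} Software`;
  }
  // If no keyword is present, the default headline stays in place.
}

document.addEventListener("DOMContentLoaded", mirrorKeywordInHeadline);
```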
  48. Original
  49. Variation
  50. Results: 39% increase in the number of sales leads; bounce rate decreased; Google Quality Score went up; cost per lead went down
  51. Don't get risky: be aware of bugs. - Make sure not to direct all traffic to a "broken" or badly performing variation - Preview your variations in cross-browser tests - Use phased rollouts to avoid dissatisfaction
  52. Phased Rollouts
  53. Phased Rollouts: The Sad Story...
  54. Using code blocks to be flexible
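A minimal sketch of what such a code-block gate can look like, in TypeScript; the cookie name, bucket scheme, and ramp percentage are illustrative assumptions, not Optimizely's rollout API.

```typescript
// Minimal phased-rollout sketch: expose the new experience to a small,
// sticky percentage of visitors, then ramp the number up as confidence grows.

const ROLLOUT_PERCENT = 10; // start at 10%, ramp to 25, 50, 100 over time

function inRollout(): boolean {
  const existing = document.cookie.match(/(?:^|; )ab_rollout=(\d+)/);
  // Assign each new visitor a stable number from 0-99 and persist it,
  // so raising the percentage only ever adds visitors, never reshuffles them.
  const bucket = existing
    ? Number(existing[1])
    : Math.floor(Math.random() * 100);
  if (!existing) {
    document.cookie = `ab_rollout=${bucket}; max-age=${30 * 24 * 3600}; path=/`;
  }
  return bucket < ROLLOUT_PERCENT;
}

if (inRollout()) {
  // Code block for the new experience goes here; everyone else keeps the
  // old one, which limits the blast radius of any bug.
}
```

Keeping the bucket sticky means that raising ROLLOUT_PERCENT only ever adds visitors to the new experience; nobody is shuffled back and forth between variations.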
  55. Phased Rollouts: The Happy Story...
  56. Inkcards' challenge
  57. Phased Rollouts
  58. Summary: Common Testing Mistakes ➔ You test everything on your site ➔ No integrations ➔ Your test will finish in 100 years ➔ You draw conclusions based on an ongoing test ➔ You put too little effort into your hypothesis ➔ Your tests aren't prioritized ➔ You don't learn anything ➔ You change everything at once ➔ You don't account for different traffic sources ➔ You aren't aware of bugs
  59. Key takeaways: 1. The only bad test is the one where you don't learn anything 2. Expect the unexpected 3. Only test where you can trigger a behaviour change, where decisions are made 4. Formulate your test hypothesis WELL !important REMEMBER & DON'T FORGET
  60. Q&A
  61. Can you afford to miss it? Why you can't miss it > conversionjam.se
  62. Thanks! CRO Expert & A/B-testing Ninja (conversionista.se); Partner Manager Benelux & Nordics (optimizely.com)
