A central concern for conversion optimization professionals and their clients is the test's defined KPI (key performance indicator). Watching a single data point exclusively, however, can easily muddle your insight, add to frustration, and cause the team to lose confidence in the process.
There's value in a failed experiment.
This talk by Christopher Nolan of ShipBob will give you an understanding of separating good from "bad" test results through deeper analysis. He shares practical examples drawn from his years of experience at Conversion Sciences, BigCommerce, and now ShipBob.
Beyond the Primary KPI: Leveraging Bad Test Results | Masters of Conversion by VWO
1. Insights Beyond the Primary KPI:
Leveraging “Bad” Test Results
Christopher Nolan
Head of Growth
2. Presenter - Chris Nolan (not the director)
About me: I’m a data-driven, hyper-focused
growth strategist with expertise in
conversion optimization, web analytics, and
digital acquisition strategy. Growth is my
priority. I love combining seemingly
disparate systems and solutions to achieve
a friction-reduced and
conversion-optimized funnel for B2B and
B2C companies.
Current Role: Head of Growth @ Shipbob
Outside of work you’ll probably find me on
a bicycle or spending too much money on
food served on too-small plates.
3. Presentation Overview
1) Building a foundation for tracking + reporting on A/B test results using a
basic testing “Track-Stack”
2) Analyzing your test results
3) Communicating test results + other uses for web analytics tools
4. What this presentation does NOT include
● Tool set up beyond installation
● Hypothesis creation and testing program strategy
● Recommendations for any testing tool
● Intentional Bad / “Dad” jokes
5. Goals for the listener
● Be inspired to test without fear of failure and be able to communicate your
results with confidence
● Be armed with the support and desire to build your “track-stack” without
needing a developer
● Leave a little bit more curious than you were before we started
This information is valuable for all marketers, not just the technical folks. That
said, it definitely leans a bit technical and data-driven. I’ve included support
documentation throughout and will share a PDF of this presentation after the
webcast concludes.
7. Test Measurement: Tools
● Required: A/B Testing Tool
○ Main Purpose: Change on-page/in-app
experiences for a set allocation of traffic to test
hypotheses about user behavior/site
engagement, and measure impact of new
experiences on user-defined goals.
● Recommended: Website Analytics Tool
○ Main Purpose: Measure user behavior and site
engagement on both user-defined and
predefined/default metrics across your entire
site.
● Recommended: Tag Management Tool
○ Main Purpose: Manage your website “tags”
(scripts) without directly editing code.
8. Test Measurement: Testing Tools Only
● Pros:
○ Easy to define and replicate
primary and secondary goals /
KPIs across tests.
○ Built-in visual reporting
○ Built-in statistical significance
measurement and test
duration recommendations
● Cons:
○ Confined to tracking
user-defined goals exclusively
○ Limitations to how granularly
you can “slice” up reporting
and what can be tracked in
your goals*
○ Difficult to backfill data
beyond pageviews
*Some testing tools allow for custom
segmentation and tracking with enterprise
plans
9. Test Measurement: Testing Tool + Web Analytics
Tool & Tag Management Tool
● Pros:
○ All pros of testing tools plus...
○ Access to all predefined goals
available in the web analytics tool
(e.g. pageviews, landing pages,
time on site, etc.)
○ More flexibility with user-defined
goals using the tag management
tool (e.g. scroll tracking, link
tracking, form field engagement)
● Cons:
○ Requires integration setup with
testing tool that isn’t as easily
replicable (custom dimensions set
up for each test)
○ More work required for clear
visual test reporting
○ No built-in statistical significance
reporting.
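One gap worth closing yourself: with no built-in significance reporting in this stack, you'll want to sanity-check lifts before presenting them. Below is a minimal sketch of a two-proportion z-test; all visitor and conversion counts are hypothetical, and a real program should also plan sample size and test duration up front.

```javascript
// Two-proportion z-test: is the variant's conversion rate
// significantly different from the control's?
function zTest(convA, totalA, convB, totalB) {
  const pA = convA / totalA;          // control conversion rate
  const pB = convB / totalB;          // variant conversion rate
  const pPool = (convA + convB) / (totalA + totalB); // pooled rate
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / totalA + 1 / totalB));
  const z = (pB - pA) / se;
  // Two-tailed p-value via a standard normal CDF approximation
  const p = 2 * (1 - normalCdf(Math.abs(z)));
  return { z, p, significant: p < 0.05 };
}

// Abramowitz & Stegun polynomial approximation of the normal CDF (x >= 0)
function normalCdf(x) {
  const t = 1 / (1 + 0.2316419 * x);
  const d = 0.3989423 * Math.exp(-x * x / 2);
  const tail = d * t * (0.3193815 + t * (-0.3565638 +
    t * (1.781478 + t * (-1.821256 + t * 1.330274))));
  return 1 - tail;
}

// Hypothetical counts: 500/10,000 control vs 600/10,000 variant
const result = zTest(500, 10000, 600, 10000);
console.log(result.z.toFixed(2), result.significant); // → 3.10 true
```

Most enterprise testing tools compute this for you; the value of rolling it by hand is being able to re-check significance on any segment you slice out of Google Analytics.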
10. Test Measurement: Setup and Installation
● Testing Tool (No Recommendation): Recommended installation in the
<head> of your site code.
○ All testing tools will provide installation scripts that you can hand to a developer or
implement through your CMS (content management system)
● Web Analytics (Google Analytics Recommended): Can install directly on the
site, through CMS or through Google Tag Manager
○ Getting Started with Google Analytics
● Tag Management (Google Tag Manager Recommended): Can install directly
or through your CMS
○ Getting Started with Google Tag Manager
11. Integrating testing tools with Google Analytics
Most testing tools will have specific directions on how to integrate with Google
Analytics.
● Top Testing Tool Integrations
○ More Info Here
○ More Info Here
● Creating custom dimensions in Google Analytics
● Creating segments from custom dimensions in Google Analytics
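As a rough illustration of what these integrations do under the hood, here is a sketch of pushing a test ID and variation into the dataLayer so a GTM tag can map them onto a Google Analytics custom dimension. The event and key names (`ab_test_impression`, `abTestId`, `abVariant`) are placeholders, not any specific tool's API; match them to whatever your GTM variables and GA dimension slots are configured to read.

```javascript
// In the browser, GTM reads window.dataLayer; a plain array stands in
// here so the sketch is runnable anywhere.
const dataLayer = [];

function reportExperiment(testId, variantName) {
  dataLayer.push({
    event: "ab_test_impression", // hypothetical event a GTM trigger listens for
    abTestId: testId,            // mapped to a GA custom dimension in GTM
    abVariant: variantName       // lets you segment reports by variation
  });
}

reportExperiment("pricing-page-redesign", "variant-1");
console.log(dataLayer[0].abVariant); // → "variant-1"
```

Once the dimension is populated, creating a segment per variation (next bullet) lets you view any GA report through the lens of control vs. variant.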
13. Things to look for outside of main KPIs in Analytics
B2B SaaS/Lead Gen:
● Navigation summaries
○ Valuable for tests on core-site pages and information architecture
● Element/Link Engagement
● Scroll depth
● Time on site (default in GA)
● Pages/session (default in GA)
● Form Field Engagement
○ Valuable for direct-response landing pages
Ecommerce:
● All of the above plus...
● Cart Abandonment Rate
● Checkout Abandonment Rate
● Form Field Engagement on Checkout
● Engagement with elements on product pages
(reviews, etc.)
● Engagement with “terms of service” or other
links in your checkout (i.e. “edit cart”)
Since the examples moving forward are B2B Lead
Generation examples, I’ve included this
ecommerce testing checklist I helped craft at
Conversion Sciences so you don’t leave without
any applicable hypotheses.
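For the two abandonment metrics above, the usual definitions can be computed directly from session counts. A quick sketch with hypothetical numbers:

```javascript
// Standard abandonment-rate definitions (all counts are hypothetical):
//   cart abandonment     = 1 - (purchases / sessions that added to cart)
//   checkout abandonment = 1 - (purchases / sessions that began checkout)
function abandonmentRate(completed, started) {
  return 1 - completed / started;
}

const sessions = { addedToCart: 800, beganCheckout: 500, purchased: 300 };

const cartAbandonment = abandonmentRate(sessions.purchased, sessions.addedToCart);
const checkoutAbandonment = abandonmentRate(sessions.purchased, sessions.beganCheckout);

console.log((cartAbandonment * 100).toFixed(1) + "%");     // → "62.5%"
console.log((checkoutAbandonment * 100).toFixed(1) + "%"); // → "40.0%"
```

In GA these counts typically come from a goal funnel or Enhanced Ecommerce checkout steps; the formulas are the same either way.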
14. Example 1: BigCommerce Pricing Page
Hypotheses: Based on B2B SaaS
best practices, we should:
● Bring our pricing plans closer to the top of the
page and align the features with the plan
prices.
● Reduce the number of features in each plan
and call out differentiators for more expensive
plans
● Add in more robust FAQs
● Add additional CTAs in-between the top and
bottom of page
● Add in a toggle for monthly / annual pricing
● And, why not? Make the background at the top
of the page black
Control
Variant
15. BigCommerce Pricing Page - Main KPIs and Results
KPIs:
● Main KPI: Trial Conversion Rate
● Secondary KPI: Demo Conversion
Rate
Results:
● 4% decrease in trial conversion rate
● 25% decrease in demo conversion
rate
● An unhappy design and executive team
who’d worked on / waited on this
test for months…
16. BigCommerce Pricing Page - “Beyond the KPIs”
Within Google Analytics, three pieces of data allowed us to turn this
“failed” test into our next opportunity:
● Scroll depth tracking (native to Google Tag Manager)
○ Instructions here
● Element engagement tracking (native to Google Tag Manager)
○ Instructions here
● Navigation Summaries (native to Google Analytics)
○ Overview here
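To make the scroll-depth mechanism concrete, here is a rough sketch of the fire-once-per-threshold logic that GTM's native scroll trigger handles for you. The scroll positions are simulated, not real data, and the dataLayer push shown in the comment is only indicative.

```javascript
// Sketch of scroll-depth tracking: fire each configured threshold
// (25/50/75/100%) at most once as the user scrolls down the page.
const THRESHOLDS = [25, 50, 75, 100];

function newThresholdsCrossed(scrollPercent, alreadyFired) {
  return THRESHOLDS.filter(t => scrollPercent >= t && !alreadyFired.has(t));
}

// Simulated scroll positions (hypothetical session)
const fired = new Set();
for (const pos of [10, 30, 30, 80]) {
  for (const t of newThresholdsCrossed(pos, fired)) {
    fired.add(t);
    // in the browser, GTM would push a scroll-depth event here,
    // e.g. dataLayer.push({ event: "gtm.scrollDepth", threshold: t })
    console.log("scroll depth event:", t + "%");
  }
}
// logs 25%, then 50% and 75% — each threshold fires exactly once
```

The fire-once behavior matters for analysis: it turns scroll depth into a clean per-session funnel (what % of users reached 50%, 75%, etc.) that you can segment by test variation.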
17. BigCommerce Pricing Page - Insight
● Insights (Scroll Depth and Element Engagement):
○ No change in how many users scrolled down to the bottom of
the “features chart”, despite a massive difference in scroll
length.
○ High engagement (15-20% of clicks) with our new
annual/monthly toggle.
○ Both scroll depth and toggle engagement correlated with higher
trial conversion rate.
■ Heatmaps can show engagement but can’t correlate it with conversion
Money maker
18. BigCommerce Pricing Page - Insight
● Insights (Navigation Summaries):
○ Saw that payments link (bottom right corner) was getting tons of
clicks in control and few in variant
○ Created a sequence segment (more info here) for users in our
control who had visited the /payments/ URL.
○ Turns out those users were 2x more likely to convert to trial than
users who did not visit that page.
Money maker
19. BigCommerce Pricing Page - Insight
● Insights (Device Category):
○ Always check results across
devices.
○ Aggregate results: -4% trial CR,
-25% demo CR
■ Mobile Results: +10% trial
CR, no change to Demo CR
● Insights (Element Engagement):
○ Saw similar interaction across
both devices with “try it free
CTA” above fold, huge uptick in
CTA below the fold for mobile
with less scroll needed
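The device-category lesson generalizes: always recompute lift per segment, because a flat or negative aggregate can hide a winning segment. A sketch of that arithmetic with hypothetical numbers (not the BigCommerce figures):

```javascript
// Relative lift of variant conversion rate over control.
function lift(control, variant) {
  const crControl = control.conversions / control.visitors;
  const crVariant = variant.conversions / variant.visitors;
  return (crVariant - crControl) / crControl;
}

// Hypothetical test: desktop loses, mobile wins, aggregate looks bad.
const desktop = { control: { visitors: 8000, conversions: 800 },
                  variant: { visitors: 8000, conversions: 720 } };
const mobile  = { control: { visitors: 2000, conversions: 100 },
                  variant: { visitors: 2000, conversions: 110 } };
const aggregate = { control: { visitors: 10000, conversions: 900 },
                    variant: { visitors: 10000, conversions: 830 } };

console.log("desktop:",   (lift(desktop.control, desktop.variant) * 100).toFixed(1) + "%");   // → -10.0%
console.log("mobile:",    (lift(mobile.control, mobile.variant) * 100).toFixed(1) + "%");     // → 10.0%
console.log("aggregate:", (lift(aggregate.control, aggregate.variant) * 100).toFixed(1) + "%"); // → -7.8%
```

Here the aggregate reads -7.8%, yet mobile is up 10% — exactly the pattern that justifies shipping a mobile-only follow-up rather than scrapping the test.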
20. BigCommerce Pricing Page - Next Steps
● Kept what drove desired behavior, or what showed no measurable negative impact on behavior:
○ Original white layout plus better plan alignment with features
○ Retain new mobile layout with desktop color scheme
○ Added back in the payments link
○ Added in the annual/monthly toggle
Tested against original control
● Result: ~15% lift in trial CR, ~45% lift in demo CR
21. Example 2: ShipBob Pricing Page - Control
● ShipBob offers two main avenues for conversion:
○ Free Account Creation - “Get Started Now”
○ Talk to sales - “Request a Quote”
● On our pricing page, our main CTA above the fold was
“Get Started Today”, and we had a link in “no-users
land” (below the fold, bottom right) that led to
requesting a quote.
● Guess what we found…
22. ShipBob Pricing Page - Navigation
● 11.5% of all users who engaged
with the page clicked on that
link to fill out a quote form
● And users who clicked had a
much higher conversion rate
(~70%) compared to our typical
conversion rate
23. ShipBob Pricing Page - Hypothesis
● Hypothesis: With the knowledge that
users who click on the quote link
convert at such a higher rate than
those who click on the main “Get
Started” CTA, we should test “Request a
Quote” as the main CTA.
● Main KPI:
○ Conversions (Quote requests and
accounts created combined)
● Secondary KPI:
○ Engagement with “fill out the
form” link considering change to
the above the fold CTA (would
users still engage with the form
link even if we changed the top
CTA to a quote)?
24. ShipBob Pricing Page - Results
Results:
● 104% Increase in Quote
Requests
● 33% reduction in Accounts
Created
● 31% aggregate lift in CR
Question unanswered by this report:
● Was it the button or the link that
drove the uptick in quote
requests in the variation?
25. ShipBob Pricing Page - Beyond the KPI
● link tracking (native to Google Tag Manager) set
up across the entire site.
● Pushes in every link click with the following
information into our Google Analytics instance
as an event:
○ Event Category: “Links on Links”
○ Event Action: {{Click Text}}
○ Event Label: {{Destination URL}}
● Found that the “Request a quote” button drove a
substantial amount of conversions, but the “fill out
the form” link (right) still resulted in a high
conversion rate post-click.
● Kept both!
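To show the shape of these events, here is a hand-rolled sketch of the site-wide link-click push described above. In GTM this is a Click trigger plus the built-in {{Click Text}} and {{Click URL}} variables; the `linkClick` event name below is a placeholder.

```javascript
// Stand-in for window.dataLayer so the sketch runs anywhere.
const dataLayer = [];

// Push every link click into GA as an event with the schema from the slide:
//   Event Category: "Links on Links"
//   Event Action:   {{Click Text}}
//   Event Label:    {{Click URL}} (destination)
function trackLinkClick(clickText, destinationUrl) {
  dataLayer.push({
    event: "linkClick",              // hypothetical trigger event name
    eventCategory: "Links on Links",
    eventAction: clickText,
    eventLabel: destinationUrl
  });
}

trackLinkClick("Request a Quote", "/quote/");
console.log(dataLayer[0].eventAction); // → "Request a Quote"
```

Because Action carries the click text and Label the destination, the same GA event report answers "button or link?" questions like the one above without any per-test setup.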
27. Communicating Results: Advice from a seasoned “loser”
● “Build your brand” internally: Sell confidence in the thoroughness of
your data and your analyses. Most executives I’ve worked with are as
interested in learning from the tests as they are in seeing immediate
results.
● Don’t fib and don’t bury the main KPI: Yes, it’s super cool that you
found out how important that tiny link was to the business. Don’t
start with that. Lay out a clear agenda for your presentation that
somewhat follows this path:
○ Hypothesis, Rationale, and Main KPIs
○ Screenshots of the variant(s)
○ Results: Main KPI
○ Insight: Beyond the Main KPI
○ Next Steps
28. Communicating Results: Advice from a seasoned “loser”
● Link to your data: Especially to begin with. Always include a (source) link on
your results slides. This will help sell the confidence in the data and will ease
the minds of the more data-driven folks in the room.
● Don’t be afraid to show “how the sausage is made”: This is especially valuable
for tech-savvy audiences. I often showcase my methods for integrating certain
tools in either the “results” or “insights” section of my test decks.
● Watch for sampling: If you’re segmenting views within Google Analytics, there’s
a chance that the tool will start to sample the data. Don’t report on that! You
can see if that’s the case by looking for the yellow checkmark in the top left
corner. If you have Google Analytics 360, you can pull unsampled reports. If
you have the free version, either remove your segment or reduce the date
range.
Not Reliable
29. Leveraging GTM to its full potential for web analytics
Here are some of the things I currently use GTM for in my role at Shipbob
to push data into Google Analytics:
● Video Engagement (Wistia integration to GA)
● Chatbot Engagement (Built-in through Intercom)
● Scroll Depth and Link Tracking
● Form Field Engagement (massively helpful in understanding drop-off
points within your forms, especially for direct response pages)
● Conversion Pixels
● Pushing the GA Client ID as a custom dimension (allows for a feedback loop
between Google Analytics and Salesforce)
● And many more tracking events that some would call “superfluous”!
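The Client ID push is worth sketching, since it's the glue for that GA-to-CRM feedback loop. The `_ga` cookie stores the Client ID as the last two dot-separated fields of its value; a minimal extraction (browser cookie reading omitted, sample value stands in):

```javascript
// Extract the Google Analytics Client ID from a _ga cookie value so it
// can be pushed as a custom dimension and later joined to CRM records.
// A _ga cookie looks like "GA1.2.1234567890.1609459200"; the Client ID
// is the final two dot-separated fields.
function clientIdFromGaCookie(cookieValue) {
  const parts = cookieValue.split(".");
  if (parts.length < 4) return null; // malformed or missing cookie
  return parts.slice(-2).join(".");
}

// In the browser you'd parse document.cookie for "_ga=" first.
console.log(clientIdFromGaCookie("GA1.2.1234567890.1609459200"));
// → "1234567890.1609459200"
```

Sending that ID into both GA (as a custom dimension) and a hidden form field gives you the join key between web sessions and closed deals.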
30. Basic Recommendations for Google Analytics
If you are the type who loves a good virtual training, Google Analytics has
full self-serve support documentation and videos available. I recommend
focusing on the following for enabling testing success:
● Creating and leveraging segments
● Creating custom dimensions
● Navigation Summaries
● Creating and Tracking Goals
○ Especially for cart abandonment and funnel analysis
● Pre/Post Date Comparisons
● Ecommerce
31. Q/A
(P.S. I love answering questions, regardless of how
“simple” or complex they may be)