Google is a black box, and for almost 20 years SEOs have run experiments and tested ideas trying to understand what makes the search engine tick. Until recently, it's been really hard to run robust tests that isolate the effects of SEO changes. At Distilled, we have been using new tools and statistical approaches to run split-tests. In this session, Tom is going to talk about how you can run your own A/B tests, some of the experiments we've run and the results we've seen, and share some thoughts about the future of SEO testing.
51. 3 Steps to DIY SEO Split-Tests
1. Create two buckets of pages.
2. Make a change to all pages in one bucket.
3. Analyse which bucket performs better.
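Step 1 can be sketched as a deterministic hash split. This is a minimal illustration, not Distilled's actual method — the bucketing key and the use of MD5 are assumptions:

```python
import hashlib

def assign_bucket(url: str) -> str:
    """Deterministically assign a URL to 'control' or 'variant'.

    Hashing the URL (rather than, say, alternating rows) keeps the
    assignment stable across runs and roughly 50/50 overall.
    """
    digest = hashlib.md5(url.encode("utf-8")).hexdigest()
    return "variant" if int(digest, 16) % 2 else "control"

# Example: split a batch of hypothetical category pages into buckets.
pages = [f"/hotels/location-{i}" for i in range(10)]
buckets = {url: assign_bucket(url) for url in pages}
```

In practice you would also want to check that the two buckets have similar pre-test traffic, since hashing only balances page counts, not sessions.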
52. 1. Create two buckets of pages.
A good DIY approach is to use Google Analytics segments, which are easy to create (see link).
Suggestion: Create Segments by category (e.g. blog tag, product category).
53. 2. Make a change to all pages in one bucket.
[Diagram: Control vs. Variant buckets]
Make the change to every page in the variant bucket.
Use your CMS where possible. Ask your dev team for a scalable approach.
54. 3. Analyse which bucket performs better.
A naive approach is to compare absolute traffic directly, provided the buckets were closely matched before the test. But this only detects large uplifts, and closely matched segments are hard to create.
A better approach is to use Google's CausalImpact library — this LunaMetrics post is a great guide.
[Chart: Variant vs. Control organic traffic]
58. ConcertHotels.com: Test Setup
~20,000 location category pages
Pages
Title: <<Location>> Hotels, NY | ConcertHotels.com
H1: <<Location>> Hotels
Before (Control)
Title: Hotels near <<Location>>, NY | ConcertHotels.com
H1: Hotels near <<Location>>
After (Variant)
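The before/after title tags from this test can be expressed as simple templates (a hypothetical reconstruction from the slide, not ConcertHotels.com's actual code):

```python
# Hypothetical templates for the before/after title tags in this test.
def control_title(location: str) -> str:
    # Before: "<<Location>> Hotels, NY | ConcertHotels.com"
    return f"{location} Hotels, NY | ConcertHotels.com"

def variant_title(location: str) -> str:
    # After: "Hotels near <<Location>>, NY | ConcertHotels.com"
    return f"Hotels near {location}, NY | ConcertHotels.com"

print(control_title("Brooklyn"))  # Brooklyn Hotels, NY | ConcertHotels.com
print(variant_title("Brooklyn"))  # Hotels near Brooklyn, NY | ConcertHotels.com
```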
59. ConcertHotels.com: Results
2.5 weeks to reach significance.
Gradual improvement in organic performance, leading to a steady level of higher traffic.
Results
61. SmokyMountains.com: Test Setup
~100 lodging pages
Pages
schema.org markup: @type “WebSite”. Generic on all pages.
Before (Control)
schema.org markup: @type “LodgingBusiness”. Customised to each page.
After (Variant)
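The variant's page-specific JSON-LD could be generated along these lines. This is an illustrative sketch with made-up field choices, not SmokyMountains.com's implementation; `@type: LodgingBusiness` is the only detail taken from the slide:

```python
import json

def lodging_jsonld(name: str, url: str) -> str:
    """Build page-specific LodgingBusiness JSON-LD markup.

    Fields beyond @type are illustrative; real markup would add
    address, geo, priceRange, etc. per schema.org/LodgingBusiness.
    """
    data = {
        "@context": "https://schema.org",
        "@type": "LodgingBusiness",
        "name": name,
        "url": url,
    }
    return f'<script type="application/ld+json">{json.dumps(data)}</script>'

print(lodging_jsonld("Example Lodge", "https://example.com/lodge"))
```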
62. SmokyMountains.com: Results
Fewer test pages than the previous test, so the signal is noisier, but the effect was detected much more quickly.
Traffic uplift here is estimated to be ~5%.
Results
64. iCanvas.com: Test Setup
~3200 artist category pages
Pages
meta description: Shop our selection of canvas prints by Banksy, each hand-stretched over museum-quality bars and printed with brilliant, fade-resistant inks. Free shipping and returns.
internal links: ~50 self referential links
Before (Control)
meta description: Banksy Prints on Canvas, including There Is Always Hope Balloon Girl, Life Is Beautiful and others. Free shipping and returns.
internal links: <removed>
After (Variant)
65. iCanvas.com: Results
2 weeks to reach significance.
The number of pages sits between the two previous tests: the effect took a few days to be noticed, but the signal is quite smooth.
Results
73. You can’t assume traffic equality between “buckets” of pages
This is why we build a counterfactual comparison using control pages.
Use Google’s Causal Impact library to do it yourself.
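CausalImpact fits a Bayesian structural time-series model to do this properly. As a deliberately simplified stdlib-only stand-in to show the idea of a counterfactual — scale the control bucket's traffic by the pre-period variant/control ratio, then compare the variant's actual post-period traffic against that prediction:

```python
from statistics import mean

def estimated_uplift(control, variant, pre_days):
    """Crude counterfactual comparison (toy stand-in for CausalImpact).

    control, variant: daily session counts for each bucket.
    pre_days: number of days before the change went live.
    Returns the relative uplift of variant traffic vs. prediction.
    """
    ratio = mean(variant[:pre_days]) / mean(control[:pre_days])
    predicted = [c * ratio for c in control[pre_days:]]  # counterfactual
    actual = variant[pre_days:]
    return sum(actual) / sum(predicted) - 1.0

# Made-up example: variant traffic steps up on day 5 while control is flat.
control = [100, 102, 98, 101, 99, 100, 103, 97]
variant = [50, 51, 49, 50, 55, 56, 54, 55]
print(f"{estimated_uplift(control, variant, pre_days=4):+.1%}")
```

This toy version has no confidence intervals and no trend/seasonality handling, which is exactly what CausalImpact adds.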
74. Pay attention to: amount of traffic & number of pages
These two factors will determine how quickly you can test and what size uplifts you can detect.
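A rough back-of-envelope (my own heuristic, not the deck's math): if you treat each page-day as an independent observation, the standard error of the bucket mean shrinks with the square root of pages × days, so smaller tests can only detect bigger uplifts:

```python
import math

def min_detectable_uplift(n_pages, days, cv=1.0, z=2.0):
    """Back-of-envelope only: assumes each page-day is an independent
    observation with coefficient of variation `cv`; an uplift is
    plausibly detectable once it exceeds z standard errors of the mean.
    Real traffic is correlated day-to-day, so treat this as a floor.
    """
    return z * cv / math.sqrt(n_pages * days)

# More pages (and more days of data) -> smaller detectable uplifts.
print(f"{min_detectable_uplift(100, 14):.1%}")    # ~100-page test
print(f"{min_detectable_uplift(20000, 14):.1%}")  # ~20,000-page test
```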
75. New pages that appear during tests
The simplest approach is to just ignore all new pages that didn’t exist before the test started.
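That filtering step is straightforward: keep only traffic rows for URLs that were already live in the pre-period (a minimal sketch; the data shapes are assumptions):

```python
def exclude_new_pages(pre_period_urls, post_period_rows):
    """Drop traffic rows for pages that did not exist before the test.

    pre_period_urls: collection of URLs seen before the test started.
    post_period_rows: iterable of (url, sessions) tuples from during the test.
    """
    known = set(pre_period_urls)
    return [(url, sessions) for url, sessions in post_period_rows if url in known]

rows = [("/a", 10), ("/new", 5), ("/b", 7)]
print(exclude_new_pages({"/a", "/b"}, rows))  # [('/a', 10), ('/b', 7)]
```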
76. Different pages can have different seasonality
For example, “roses” pages spike around Valentine’s Day. You need to cut such outliers.
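One blunt way to cut such outliers (my own illustration, not the deck's method) is to drop daily values far from the median in median-absolute-deviation terms:

```python
from statistics import median

def trim_outliers(series, k=3.0):
    """Drop daily values more than k median-absolute-deviations from
    the median -- a blunt way to cut seasonal spikes like Valentine's Day."""
    med = median(series)
    mad = median(abs(x - med) for x in series) or 1.0
    return [x for x in series if abs(x - med) <= k * mad]

daily = [100, 98, 103, 101, 99, 410, 102]  # one Valentine's-Day-style spike
print(trim_outliers(daily))
```

A more careful approach would model seasonality explicitly (as CausalImpact's control series does) rather than just discarding spike days.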
77. You could damage conversions, so pay attention to those metrics too.
Tag visitors in your analytics depending on whether they land on a Control or a Variant page.
84. www.distilledodn.com
DistilledODN allows you to test exactly which changes to your website will result in an uplift in traffic from search engines.
Our new SEO A/B Testing platform is available.
@TomAnthonySEO