I'm presenting at eMetrics here in San Francisco but have also attended a lot of sessions myself. I listened to the questions being asked and included the most common CRO niggles and issues in my deck. It also includes analytics tips (particularly for Google Analytics, but applicable elsewhere) and details of a good CRO toolkit to have in your pocket.
Lastly, I've included a BONUS DECK here. Yay. You'll find some details on the methodologies I support and some of the results that happened. I love lean techniques blended with rapid support from analytics, split testing, rapid UX techniques and many other toolsets.
2. "If you go to the men's washrooms at Schiphol airport in Amsterdam, you may notice there's a fly in the urinals. It's screen printed. So what do you think most men do? That's right, they aim at the fly when they urinate. They don't even think about it, and they don't need to read a user's manual; it's just an instinctive reaction that means 85% less spillage! The interesting feature of these urinals is that they're deliberately designed to take advantage of this inherent human male tendency."
This is my job.
3. Director of Optimization, RUSH Hair
Building a team, conversion rate or methodology?
Shameless Promotion Slide
• 35M visitor split tests
and counting
• Over $400M increases
in revenue for
clients, within 5 yrs
• Lifts from 12% to
200+% in site-wide
conversion rates
• 21 years of my life
slowly being sucked
away in boring
meetings with time
wasting morons.
• UX and Analytics (1999)
• User Centred Design (2001)
• Startups and advisory (2003)
• Funnel optimisation (2004)
• Multivariate & A/B (2005)
• Lean UX (2008)
• Holistic Optimisation (2009)
• I love optimizing
underperforming stuff:
websites, teams, businesses
and multi-country
optimisation programs!
@OptimiseOrDie
5. Agenda
#1 The Optimisers Toolkit
The best tools recommended by CRO practitioners
#2 Analytics Genius Tips
Top tips from 2013 – from around the world
#3 Top CRO questions
Some answers from my Top 30 CRO questions
7. #1 : Session Replay
• 3 kinds of tool :
Client side
• Normally JavaScript based
• Pros : Rich mouse and click data,
errors, forms analytics, UI interactions
• Cons : Dynamic content issues, performance hit
Server side
• Black Box -> Proxy, Sniffer, Port copying device
• Pros : Gets all dynamic content, fast, legally tight
• Cons : No client side interactions, Ajax, HTML5 etc.
Hybrid
• Clientside and Sniffing with central data store
8. #1 : Session Replay
• Vital for optimisers & fills in a ‘missing link’ for insight
• Rich source of data on visitor experiences
• Segment by browser, visitor type, behaviour, errors
• Forms Analytics (when instrumented) are awesome
• Can be used to optimise in real time!
Session replay tools
• Clicktale (Client) www.clicktale.com
• SessionCam (Client) www.sessioncam.com
• Mouseflow (Client) www.mouseflow.com
• Ghostrec (Client) www.ghostrec.com
• Usabilla (Client) www.usabilla.com
• Tealeaf (Hybrid) www.tealeaf.com
• UserReplay (Server) www.userreplay.com
12. #2 : Feedback / VOC tools
• Anything that allows immediate realtime onpage feedback
• Comments on elements, pages and overall site & service
• Can be used for behavioural triggered feedback
• Tip! : Take the Call Centre for beers
• Kampyle
www.kampyle.com
• Qualaroo
www.qualaroo.com
• 4Q
4q.iperceptions.com
• Usabilla
www.usabilla.com
13. #2a : Survey Tools
• Surveymonkey www.surveymonkey.com (1/5)
• Zoomerang www.zoomerang.com (3/5)
• SurveyGizmo www.surveygizmo.com (5/5)
• For surveys, web forms, checkouts, lead gen – anything with
form filling – you have to read these two:
Caroline Jarrett (@cjforms)
Luke Wroblewski (@lukew)
• With their work and copywriting from @stickycontent, I
managed to get a survey with a 35% clickthrough from email
and a whopping 94% form completion rate.
• Their awesome insights are the killer app I have when
optimising forms and funnel processes for clients.
15. #4 : Guerrilla Usability Testing
• All you need is a device, time and people!
• Use one of these tools for session recording:
CamStudio (free)
www.camstudio.org
Mediacam AV (cheap)
www.netu2.co.uk
Silverback (Mac)
www.silverbackapp.com
Screenflow (Mac)
www.telestream.net
UX Recorder (iOS), Reflection, Webcam
www.uxrecorder.com & bit.ly/tesTfm & bit.ly/GZMgxR
20. #5 : Google Docs
• Seriously wasting your time doing manual Excel?
• Fed up doing stuff that takes hours?
• Use the Google API to roll your own reports straight into Big G
• Lots of good articles but ask for advice from:
@danbarker
@timlb
#measurecamp
• Google Analytics + API + Google Docs integration = A BETTER LIFE!
• Hack your way to having more productive weeks
• Learn how to do this and have fun with GA custom reports
• Ask me about the importance of training
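As a hedged sketch of the kind of query you would automate: building a Core Reporting API (v3) request URL that an Apps Script or scheduled job could fetch into a spreadsheet. The profile ID, dates and metrics below are placeholders, and OAuth authorisation is omitted entirely.

```javascript
// Sketch: build a Google Analytics Core Reporting API (v3) query URL.
// Profile ID, dates and metrics are placeholders - swap in your own.
function buildGaQuery(profileId, opts) {
  var params = {
    'ids': 'ga:' + profileId,
    'start-date': opts.startDate,
    'end-date': opts.endDate,
    'metrics': opts.metrics.join(','),
    'dimensions': (opts.dimensions || []).join(',')
  };
  return 'https://www.googleapis.com/analytics/v3/data/ga?' +
    Object.keys(params)
      .filter(function (k) { return params[k]; }) // drop empty params
      .map(function (k) { return k + '=' + encodeURIComponent(params[k]); })
      .join('&');
}

// Example: sessions and transactions by medium for January 2013
var url = buildGaQuery('12345678', {
  startDate: '2013-01-01',
  endDate: '2013-01-31',
  metrics: ['ga:sessions', 'ga:transactions'],
  dimensions: ['ga:medium']
});
```

From there it's one authorised fetch per report, straight into a Google Doc, instead of an afternoon of manual Excel.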
30. #7 – UX and Crowd tools
Remote UX tools (P=Panel, S=Site recruited, B=Both)
Usertesting (B) www.usertesting.com
Userlytics (B) www.userlytics.com
Userzoom (S) www.userzoom.com
Intuition HQ (S) www.intuitionhq.com
Mechanical turk (S) www.mechanicalturk.com
Loop11 (S) www.loop11.com
Open Hallway (S) www.openhallway.com
What Users Do (P) www.whatusersdo.com
Feedback army (P) www.feedbackarmy.com
User feel (P) www.userfeel.com
Ethnio (For Recruiting) www.ethnio.com
Feedback on Prototypes / Mockups
Pidoco www.pidoco.com
Verify from Zurb www.verifyapp.com
Five second test www.fivesecondtest.com
Conceptshare www.conceptshare.com
Usabilla www.usabilla.com
31. #8 : Web Analytics Love
• Properly instrumented analytics
• Investment of 5-10% of developer time
• Add more than you need
• Events insights
• Segmentation
• Call tracking love!
32. #8 : Tap 2 Call tracking
Step 1 : Add a unique phone number on ALL channels
(or insert your own dynamic number)
Step 2 : For phones, add “Tap to Call” or “Click to Call”
• Add Analytics event or tag for phone calls!
• Very reliable data, easy & cheap to do
• What did they do before calling?
• Which page did they call you from?
• What PPC or SEO keyword did they use?
• Incredibly useful – this keyword level call data
• What are you over or underbidding for?
• Will help you shave 10, 20%+ off PPC
• Which online marketing really sucks?
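The instrumentation above can be sketched with the legacy `_gaq` API that appears elsewhere in this deck. The category and action names are just an assumed convention, not from the slides.

```javascript
// Before ga.js loads, _gaq is just an array and push() queues commands.
var _gaq = _gaq || [];

// Record a tap-to-call as a GA event; `href` is the tel: link tapped.
function trackCallTap(href) {
  _gaq.push(['_trackEvent', 'Phone', 'TapToCall', href]);
}

// Browser-only wiring: attach to every tel: link on the page.
// document.querySelectorAll('a[href^="tel:"]').forEach(function (a) {
//   a.addEventListener('click', function () { trackCallTap(a.href); });
// });
```

With the event in place, your analytics can answer the questions on this slide: which page they called from, and which keyword brought them there.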
34. What about desktop?
Step 1 : Add ‘Click to reveal’
• Can be a link, button or a collapsed section
• Add to your analytics software
• This is a great budget option!
Step 2 : Invest in call analytics
• Unique visitor tracking for desktop
• Gives you that detailed marketing data
• Easy to implement
• Integrates with your web analytics
• Let me explain…
35. So what does phone tracking get you?
• You can do it for free on your online channels
• If you’ve got any phone sales or contact operation, this will
change the game for you
• For the first time, your web analytics can claim PHONE outcomes
• Optimise your PPC spend
• Track and Test stuff on phones, using web technology
• The two best phone A/B tests? You’ll laugh!
36. Who?
Company / Website / Coverage
Mongoose Metrics* www.mongoosemetrics.com UK, USA, Canada
Ifbyphone* www.ifbyphone.com USA
TheCallR* www.thecallr.com USA, Canada, UK, IT, FR, BE, ES, NL
Call tracking metrics www.calltrackingmetrics.com USA
Hosted Numbers www.hostednumbers.com USA
Callcap www.callcap.com USA
Freespee* www.freespee.com UK, SE, FI, NO, DK, LT, PL, IE, CZ, SI, AT, NL, DE
Adinsight* www.adinsight.co.uk UK
Infinity tracking* www.infinity-tracking.com UK
Optilead* www.optilead.co.uk UK
Switchboard free www.switchboardfree.co.uk UK
Freshegg www.freshegg.co.uk UK
Avanser www.avanser.com.au AUS
Jet Interactive* www.jetinteractive.com.au AUS
* I read up on these or talked to them. These are my picks.
38. #9 : Web Analytics Love
• People, Process, Human problems
• UX of web analytics tools and reports
• Make the UI force decisions!
• Playability and exploration
• Skunkworks project time (5-10%)
• Give it love, time, money and iteration
• How often do you iterate analytics?
• Lastly, spend to automate, gain MORE time
40. Analytics Genius Tips
#1 Performance tune-ups
#2 Browser money
#3 Keyboard shortcuts
#4 Ranking data
#5 Content engagement
#6 Enhanced In page
#7 Duplicate transactions
#8 Event tracking
#9 Google API + more
41. #1 : Performance Tune-ups
With thanks to @Danbarker
Add “_gaq.push(['_setSiteSpeedSampleRate', 100]);”
• Amps up sampling for small & medium websites
• Use the distribution report - % of pages < 3 seconds
• DOM timings vital – let me explain
• Avg. Document Content Loaded Time (sec)
• This data is very accurate and helps conversion – pretty vital for landing
pages, where I find lots of stuff
• Make yourself a [pageview * content load time] report
• This is called a ‘Suck Index’
• Work your way down from the top
• Mobile speed data doesn’t include Safari – please be careful!
• Read more at:
http://p.barker.dj/sitespeedtips
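The [pageview * content load time] "Suck Index" report above can be sketched like this. The page rows are made-up illustrative numbers, not real data.

```javascript
// Sketch of the 'Suck Index': pageviews * avg content-loaded time,
// sorted worst-first so you can work your way down from the top.
function suckIndex(rows) {
  return rows
    .map(function (r) {
      return { page: r.page, suck: r.pageviews * r.avgLoadSec };
    })
    .sort(function (a, b) { return b.suck - a.suck; });
}

// Illustrative rows - in practice, export these from GA
var report = suckIndex([
  { page: '/home',     pageviews: 50000, avgLoadSec: 2.1 },
  { page: '/checkout', pageviews: 8000,  avgLoadSec: 6.4 },
  { page: '/blog',     pageviews: 1200,  avgLoadSec: 9.8 }
]);
```

The high-traffic pages float to the top even when their load time looks respectable in isolation, which is exactly the point of the report.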
42. #2 : Browser money
• Create a desktop only segment [Exclude “Mobile (including tablets)”]
• Create a mobile only segment [For GA: Include “Mobile (including tablets)” & Exclude “Tablet = yes”]
• That gives 3 segments: Desktop only, Tablet, Mobile only
• Now start segmenting like this:
Browser Segment Conv rate
Safari Mobile Traffic 0.79%
Internet Explorer Desktop only 1.34%
Chrome Desktop only 1.01%
Safari Tablet Traffic 1.00%
Safari Desktop only 1.28%
Firefox Desktop only 1.20%
Android Browser Mobile Traffic 0.31%
Safari (in-app) Mobile Traffic 0.69%
Chrome Mobile Traffic 0.62%
Safari (in-app) Tablet Traffic 0.89%
Chrome Tablet Traffic 0.84%
Opera Desktop only 0.10%
Android Browser Tablet Traffic 0.71%
Mozilla Compatible Agent Mobile Traffic 0.51%
Quote the opportunity!
IE8 = 1.41% of revenue
IE8 converts at 20% of IE9 and IE10
Some nasty bugs cause the problem
Worth fixing? That 1.41% problem is worth nearly 6% more checkouts!
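The opportunity maths quoted above can be sketched as a rough back-of-envelope calculation, assuming the broken segment would convert like its peers once fixed.

```javascript
// If a segment takes `revenueShare` of revenue but converts at
// `relativeConvRate` of its peers, bringing it to parity multiplies its
// contribution by (1 / relativeConvRate). The extra is the uplift.
function upliftAtParity(revenueShare, relativeConvRate) {
  return revenueShare * (1 / relativeConvRate - 1);
}

// IE8: 1.41% of revenue, converting at 20% of IE9/IE10
var uplift = upliftAtParity(0.0141, 0.2); // roughly 0.056, i.e. nearly 6%
```

That is how a 1.41% browser bug becomes a "nearly 6% more checkouts" business case worth putting in front of whoever owns the dev backlog.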
43. #2 : Browser money
• You should see something like this:
44. #3 : GA Keyboard Shortcuts
• Thanks to Farid Alhadi and @fastbloke
d t Set date range to TODAY
d y Set date range to YESTERDAY
d w Set date range to LAST WEEK
d m Set date range to LAST MONTH
d c Toggle date comparison mode (to the previous period of whatever you are looking at. For example, if you’re looking at 6 days, this will compare it to the 6 days before it)
d x Toggle date comparison mode (to the previous year of the period you are looking at)
? Open keyboard shortcut help
h Search help center
a open account panel
shift + a Go to account list
s or / Search reports
shift + d Go to the default dashboard of the current profile
45. #4 : Ranking data
• Someone searches on Google for ‘Term’
• Clicks on a link to your site
• What was the keyword rank for that term?
• Reverse engineer the actual rankings from users machines
• More accurate than some SEO tools (IMHO)
• Two articles to show you how:
http://bit.ly/Vaisno
http://bit.ly/13lmYF2
46. #5 : Content Engagement
• Get a better bounce rate metric
• See more detailed engagement metrics
• Measure scrolling and reading activity
• Came from @fastbloke but originally Justin Cutroni
• Measure your scrolling and exit points from long form
content
• Very nice technique – read more at:
http://bit.ly/13lmYF2
http://goo.gl/1AZZb
47. #6 : In Page Analytics - Enhanced
var _gaq = _gaq || [];
var pluginUrl = '//www.google-analytics.com/plugins/ga/inpage_linkid.js';
_gaq.push(['_require', 'inpage_linkid', pluginUrl]);
_gaq.push(['_setAccount', 'UA-XXXXXX-Y']);
_gaq.push(['_trackPageview']);
• Correct link attribution for in-page Analytics
• Very nice
48. #7 : Duplicate transactions in GA
• Thanks to Matt Clarke and @timlb
• Stop skewing the data with duplicates/reloads
• Custom report to check if you’re affected
http://techpad.co.uk/content.php?sid=247
• I’ve seen this a few places so worth
checking, particularly if figures don’t tally!
• Read more at : http://bit.ly/13lmYF2
49. #8 : Event tracking
• Thanks to #Measurecamp – check the stream
• A beginners guide : http://bit.ly/13RFoJs
• Some great ideas here : http://bit.ly/UCcptx
• Don’t go for ‘Event Blizzard’
• Focus on specific areas where insight is needed
• Choose your naming structure carefully:
http://bit.ly/WJ4R4c
• Read this complete guide : http://bit.ly/VmFSJ4
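One way to avoid an 'Event Blizzard' and keep the naming structure honest is to route every event through a thin wrapper that rejects anything outside your agreed plan. The allowed category list below is an invented example, not from the deck.

```javascript
// Legacy GA command queue, as used elsewhere in this deck
var _gaq = _gaq || [];

// The agreed naming plan - an illustrative example convention
var ALLOWED_CATEGORIES = ['Video', 'Forms', 'Downloads', 'Phone'];

// Wrapper: enforces Category / Action / Label structure and rejects
// categories nobody planned for, so reports stay queryable.
function trackEvent(category, action, label) {
  if (ALLOWED_CATEGORIES.indexOf(category) === -1) {
    throw new Error('Unplanned event category: ' + category);
  }
  _gaq.push(['_trackEvent', category, action, label]);
}

trackEvent('Video', 'Play', 'homepage-hero');
```

The wrapper costs nothing at runtime but forces the naming conversation to happen once, up front, instead of in a cleanup project a year later.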
50. #9 : Google API and more
• Use the Google API to get super custom reports
• You can fetch different data types (on the fly as
well as pre-calculated)
• Automate a HUGE CHUNK of Excel work
• @timlb recorded the #MeasureCamp session:
• Video :
www.youtube.com/watch?v=JWXg1_4quwU
• Roundup :
www.measurecamp.org/aftermath/
52. #10 : Microdata – SERPS UX
• Reviews – huge increases in CTR and Conversion
• People (Authors)
• Products
• Businesses and Organisations
• Recipes
• Events
• Music
• Local
• Video
Helps to:
• Dominate the page
• Push other stuff down
• Make it more persuasive
• The conversion journey starts here!
53. #11 : Measure viewport size
Thanks to @Beantin and others!
• Measure the viewport size, not the resolution
• Why?
• Toolbars, chrome and setup varies
• UK 2011 figure was 2.2 toolbars
• Code example here : http://bit.ly/4xaNYK
• A common conversion issue
• Your desk vs. Users = different
• Turn off the wifi, reduce the viewport
• The budget restroom solution
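A minimal sketch of the measurement: read the real viewport (not the screen resolution) and report it as a GA event. The bucketing step and the 100px step size are my own suggestion to keep the report readable; the commented lines are browser-only.

```javascript
// Round a pixel dimension down to the nearest bucket so the report
// doesn't fragment into thousands of unique sizes.
function bucket(px, step) {
  return Math.floor(px / step) * step; // e.g. 1237 -> 1200 with step 100
}

function viewportLabel(width, height) {
  return bucket(width, 100) + 'x' + bucket(height, 100);
}

// In the browser (innerWidth with a clientWidth fallback for old IE):
// var w = window.innerWidth || document.documentElement.clientWidth;
// var h = window.innerHeight || document.documentElement.clientHeight;
// Non-interaction event, so it doesn't distort bounce rate:
// _gaq.push(['_trackEvent', 'Viewport', 'Size', viewportLabel(w, h), 0, true]);
```

The viewport is what your visitors actually see after toolbars and chrome take their cut, which is why it beats resolution data for conversion work.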
54. Best Practice?
• There is no such thing as a ‘readily repeatable
best practice’ in conversion optimisation
• The button color example
• There are patterns! - but the context varies
• The answer is always, “it depends” ;-)
• It starts with your customers, your site, your
data, your insights – not an article online!
• It starts and ends with customer knowledge –
that’s best practice!
55. Top Conversion Questions
• 32 questions, picked by Practitioners
• Being recorded on ScreenR.com
• What top stuff did I hear this week?
“How long will my test take?”
“When should I check the results?”
“How do I know if it’s ready?”
56. #1 How long will a test take?
• The minimum length
– 2 business cycles
– Always test ‘’whole’ not partial cycles
– Usually a week, 2 weeks, Month
– Be aware of multiple cycles
• How long after that
– IMHO you’ll need a minimum 250 outcomes, ideally 350 for each ‘creative’
– If you test 4 recipes, that’s 1400 outcomes
– Make a note of your minimum ‘length’ for 350 outcomes
– If you segment, you’ll need more data
– It may take longer than that if the response rates are similar*
– Work out how long it might take (or you can afford it to take)
http://visualwebsiteoptimizer.com/ab-split-test-duration/
* Stats geeks know I’m glossing over something here. The test time depends on how the two experiments separate in terms of relative performance, as well as how volatile the test response is. I’ll talk about this when I record this one! This is why testing similar stuff sux.
57. #2 – Are we there yet? Early test stages…
• Ignore the graphs. Don’t draw conclusions. Don’t dance. Calm down.
• Get a feel for the test but don’t do anything yet!
• Remember – in A/B - 50% of returning visitors will see a new shiny website!
• Until your test has had at least 1 business cycle and 250-350 outcomes, don’t
bother drawing conclusions or getting excited!
• You’re looking for anything that looks really odd – your analytics person should be
checking all the figures until you’re satisfied
• All tests move around or show big swings early in the testing cycle. Here is a very
high traffic site – it still takes 10 days to start settling. Lower traffic sites will
stretch this period further.
58. #3 – What happens when a test flips on me?
• Something like this can happen:
• Check your sample size. If it’s still small, then expect this until the test settles.
• If the test does genuinely flip – and quite severely – then something has changed with
the traffic mix, the customer base or your advertising. Maybe the PPC budget ran
out? Seriously!
• To analyse a flipped test, you’ll need to check your segmented data. This is why you
have a split testing package AND an analytics system.
• The segmented data will help you to identify the source of the shift in response to your
test. I rarely get a flipped one and it’s always something changing on me, without
being told. The heartless bastards.
59. #4 – What happens if a test is still moving around?
• There are three reasons it is moving around
– Your sample size (outcomes) is still too small
– The external traffic mix, customers or reaction has
suddenly changed or
– Your inbound marketing driven traffic mix is
completely volatile (very rare)
• Check the sample size
• Check all your marketing activity
• Check the instrumentation
• If no reason, check segmentation
60. #5 – How do I know when it’s ready?
• The hallmarks of a cooked test are:
– It’s done at least 1 or 2 (preferred) cycles
– You have at least 250-350 outcomes for each recipe
– It’s not moving around hugely at creative or segment level
performance
– The test results are clear – even if the precise values are not
– The intervals are not overlapping (much)
– If a test is still moving around, you need to investigate
– Always declare on a business cycle boundary – not the middle of
a period (this introduces bias)
– Don’t declare in the middle of a limited time period advertising
campaign (e.g. TV, print, online)
– Always test before and after large marketing campaigns (one
week on, one week off)
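One way to check the "intervals are not overlapping" hallmark is to compute normal-approximation 95% intervals for each recipe and compare them. This is a sketch for eyeballing, not a substitute for your testing tool's statistics.

```javascript
// 95% confidence interval for a conversion rate, normal approximation.
function confInterval(conversions, visitors) {
  var p = conversions / visitors;
  var se = Math.sqrt(p * (1 - p) / visitors); // standard error
  return { low: p - 1.96 * se, high: p + 1.96 * se };
}

function intervalsOverlap(a, b) {
  return a.low <= b.high && b.low <= a.high;
}

// Illustrative numbers: control at 3.0%, variant at 3.8% on 10k visitors each
var control = confInterval(300, 10000);
var variant = confInterval(380, 10000);
var stillOverlapping = intervalsOverlap(control, variant);
```

If the intervals still overlap heavily after your minimum outcomes and business cycles, the test isn't cooked yet, whatever the dashboard's percentages look like.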
61. #6 – What happens if it’s inconclusive?
• Analyse the segmentation
• One or more segments may be over and under
• They may be cancelling out – the average is a lie
• The segment level performance will help you
(beware of small sample sizes)
• If you genuinely have a test which failed to move any
segments, it’s a crap test
• This usually happens when it isn’t bold or brave
enough in shifting away from the original design,
particularly on lower traffic sites
• Get testing again!
62. #7 – What QA testing should I do?
• Cross Browser Testing
• Testing from several locations (office, home, elsewhere)
• Testing that IP filtering is set up
• Test tags are firing correctly (analytics and the test tool)
• Test as a repeat visitor and check session timeouts
• Cross check figures from 2+ sources
• Monitor closely from launch, recheck
63. #8 – What happens if it fails?
• Learn from the failure
• If you can’t learn from the failure, you’ve designed a crap test.
Next time you design, imagine all your stuff failing. What would
you do? If you don’t know or you’re not sure, get it changed so
that a negative becomes useful.
• So : failure itself at a creative or variable level should tell you
something.
• On a failed test, always analyse the segmentation
• One or more segments will be over and under
• Check for varied performance
• Now add the failure info to your Knowledge Base:
• Look at it carefully – what does the failure tell you? Which
element do you think drove the failure?
• If you know what failed (e.g. making the price bigger) then you
have very useful information
• You turned the handle the wrong way
• Now brainstorm a new test
64. #9 – Should I run an A/A test first?
• No – and this is why:
– It’s a waste of time
– It’s easier to test and monitor instead
– You are eating into test time
– Also applies to A/A/B/B testing
– A/B/A running at 25%/50%/25% is the best
• Read my post here :
http://bit.ly/WcI9EZ
65. #10 – What is a good conversion rate?
Higher than the one you had last month!
69. Ad Hoc
Local Heroes
Chaotic Good
Level 1
Starter Level
Guessing
A/B testing
Basic tools
Analytics
Surveys
Contact Centre
Low budget
usability
Outline process
Small team
Low hanging fruit
+ Multi variate
Session replay
No segments
+Regular usability
testing/research
Prototyping
Session replay
Onsite feedback
Dedicated team
Volume opportunities
Cross silo team
Systematic tests
Ninja Team
Testing in the DNA
Well developed / Streamlined / Company wide
+Funnel
optimisation
Call tracking
Some segments
Micro testing
Bounce rates
Big volume
landing pages
+ Funnel analysis
Low converting
& High loss pages
+ offline
integration
Single channel
picture
+ Funnel fixes
Forms analytics
Channel switches
+Cross channel
testing
Integrated CRO
and analytics
Segmentation
+Spread tool use
Dynamic adaptive
targeting
Machine learning
Realtime
Multichannel
funnels
Cross channel
synergy
Testing focus
Culture
Process
Analytics focus
Insight methods
+User Centered
Design
Layered feedback
Mini product tests
Get buyin
Mission: Prove ROI / Scale the testing / Mine value / Continual improvement
+ Customer sat
scores tied to UX
Rapid iterative
testing and
design
+ All channel view
of customer
Driving offline
using online
All promotion
driven by testing
Level 2
Early maturity
Level 3
Serious testing
Level 4
Core business value
Level 5
You rock, awesomely
70. HOMEWORK 1
• I’d like you to look at how unconscious action is part of your life every week.
• Several times a day, you’ll use a door. There are many different interfaces for
doors like handles, knobs, buttons, push plates, levers and more.
• We all go through every day using these things and don’t consciously think about
what we’re doing. You’ll use them at work, at home, when you travel, shop or use
the loo!
• There are several things you’ll spot if you keep a door diary for a few days. Let’s
give you 3 weeks to finish, to give you time to fit this in. Here is your work:
1. Explore.
See how many different types of door interface you can spot. Take pictures of
them and add notes on your phone or using an app. Take notes if you like with a
notepad and pencil. Photographs are especially useful for showing examples.
2. Patterns and Groups.
Do these door interfaces have a pattern? Do they fit into groups? What would
you call these groups?
71. HOMEWORK 2
3. The Furnishings. Take a look at the door furniture and signs:
• Is there any other stuff apart from the handle that you look at?
• What signs or stuff are plastered on the door?
• Are there any messages telling you stuff?
4. Error with door. What happens when it goes wrong?
This is really hard to catch but if you keep trying for a week, you’ll spot a few. What happens when the door
interface goes wrong? You get it the wrong way, curse to yourself and then do something different. What do
you notice about when this happens?
5. What caused it?
What was it, when it all went wrong, that led you to ‘get the door wrong’ and have to try again. What went
wrong that didn’t happen with all the other doors? If you watch the door, does it happen to other people?
6. Summary
So keep a log if you can (scribbled notes, mobile phone app, photos) and look at the five things I’ve listed. There
might be more stuff than I’m hinting at so observe closely.
• How many different ‘kinds’ of door interface can you spot?
• Are there groups of them – similar kinds? What would you call these?
• Catch yourself when it goes wrong
• Watch other people when it goes wrong
• Why did it go wrong?
72. Collecting the Evidence
Apps
• https://itunes.apple.com/au/app/this-is-note-calendar-+-photoalbums/id403746123?mt=8
• https://itunes.apple.com/us/app/awesome-note-+to-do-calendar/id320203391?mt=8
• http://www.blurb.com/mobile
Inbox or Stream based
• http://www.memonic.com/tour#web-clipper
• https://launch.unifiedinbox.com/
74. BONUS DECK
Hope you find this useful – a small bonus here with some
slides about conversion optimisation methodologies and
how you should try to structure your approach.
76. What’s the problem?
• #1 User Experience and Conversion
Optimisation are not a checkbox or a
step in the process – they need to be
integrated into everything you do.
• #2 This work isn’t a one off exercise
either – it’s an ongoing continuous
improvement process – like Kaizen
• #3 Usability testing isn’t enough –
other UX factors like the visceral and
behavioural emotional responses we
have to products need tuning too.
• #4 It’s not just about the user!
• #5 It’s usually Self Centred Design
driven by Ego, Opinion, Assumption
78. Also, the dial won’t turn anymore
With thanks to @morys
PPC SEO
79. Why is this happening?
• PPC changes
• Advertising models flattening – i.e. mobile Google costs
• Comprehensive SEO changes
• Competition increasing
• Fleetness of foot – Asos in Australia
• Entry costs are lower now
• New entrants compete without the cruft
• Startups are using better ‘build and optimise’ methodologies
than nearly all corporates
• The old way of doing things is going to die
• The new way of doing things is your only survival ticket
80. So what do people do?
• They throw tools at the problem
• They try usability testing and research
• They generate more data to look at
• They make changes without measuring or testing
• They hope to randomly create the optimal system
• They get an expensive agency to help them
• They push their team harder, like galley slaves
• They experiment with riskier advertising models
• They wonder why they’re burning rubber
• They try more random things
• Then they call a CRO person and say
“We’ve tried everything. It isn’t working. Help!”
81. Skinner’s Pigeon Experiment
• Participants invited into a room with objects
• Told to score 100 points within 30 minutes
• Participants moved objects around, made
noise, jumped around, tried anything to make a
counter increase the points score.
• They got horribly confused
• They then created convincing lies for
themselves, to explain what they thought was
working. They became superstitious and made
rituals.
• The points allocation was made randomly by a
goldfish, swimming back and forward in a tank.
• Know any marketing departments like this?
• I’ve seen this a lot – and it paralyses companies
• We need a better way. A methodology?
83. Lean UX
Positive
– Lightweight and very fast methods
– Realtime or rapid improvements
– Documentation light, value high
– Low on wastage and frippery
– Fast time to market, then optimise
– Allows you to pivot into new areas
Negative
– Often needs user test feedback to
steer the development, as data alone
is not enough
– Bosses distrust stuff where the
outcome isn’t known
“The application of UX design methods into product
development, tailored to fit Build-Measure-Learn cycles.”
84. Agile UX / UCD / Collaborative Design
Positive
– User centric
– Goals met substantially
– Rapid time to market (especially when
using Agile iterations)
Negative
– Without quant data, user goals can
drive the show – missing the business
sweet spot
– Some people find it hard to integrate
with siloed teams
– Doesn’t work with waterfall IMHO
Research → Concept → Wireframe → Prototype → Test → Analyse (and round again)
“An integration of User Experience Design and Agile*
Software Development Methodologies”
*Sometimes
86. Lean Conversion Optimisation
Positive
– A blend of several techniques
– Multiple sources of Qual and Quant data aids triangulation
– CRO analytics focus drives unearned value inside all
products
Negative
– Needs a one team approach with a strong PM who is a
Polymath (Commercial, Analytics, UX, Technical)
– Only works if your teams can take the pace – you might be
surprised though!
“A blend of User Experience Design, Agile PM, Rapid Lean
UX Build-Measure-Learn cycles, triangulated data
sources, triage and prioritisation.”
88. Triage and Triangulation
• Starts with the analytics data
• Then UX and user journey walkthrough from SERPS -> key paths
• Then back to analytics data for a whole range of reports:
• Segmented reporting, Traffic sources, Device viewport and
browser, Platform (tablet, mobile, desktop) and many more
• We use other tools or insight sources to help form hypotheses
• We triangulate with other data where possible
• We estimate the potential uplift of fixing/improving something
as well as the difficulty (time/resource/complexity/risk)
• A simple quadrant shows the value clusters
• We then WORK the highest and easiest scores by…
• Turning every opportunity spotted into an OUTCOME
“This is where the smarts of CRO are – in identifying the
easiest stuff to test or fix that will drive the largest uplift.”
89. The Bucket Methodology
“Helps you to stream actions from the insights and prioritisation work.
Forces an action for every issue, a counter for every opportunity being lost.”
Test
If there is an obvious opportunity to shift behaviour, expose insight or
increase conversion – this bucket is where you place stuff for testing. If
you have traffic and leakage, this is the bucket for that issue.
Instrument
If an issue is placed in this bucket, it means we need to beef up the
analytics reporting. This can involve fixing, adding or improving tag or
event handling on the analytics configuration. We instrument both
structurally and for insight in the pain points we’ve found.
Hypothesise
This is where we’ve found a page, widget or process that’s just not working
well but we don’t see a clear single solution. Since we need to really shift
the behaviour at this crux point, we’ll brainstorm hypotheses. Driven by
evidence and data, we’ll create test plans to find the answers to the
questions and change the conversion or KPI figure in the desired direction.
Just Do It
JFDI (Just Do It) – a bucket for issues where a fix is easy to identify or the
change is a no-brainer. Items marked with this flag can either be deployed
in a batch or as part of a controlled test. Stuff in here requires low effort
or is a micro-opportunity to increase conversion, and should just be fixed.
Investigate
You need to do some testing with particular devices, or need more
information to triangulate a problem you spotted. If an item is in this
bucket, you need to ask questions or do further digging.
90. How is it working out?
• Methodologies are not Real Life ™
• It’s mainly about the mindset of the team and
managers, not the tools or methodologies they
play with
• Not all my clients have all the working parts
• You should not be a methodology slave
• Feel free to make your own or flexibly adapt
• Use some, any techniques instead of ‘guessing’
• Blending lean and agile with conversion
optimisation outcomes is my critical learning of
the last 5 years
• Doing rapid cycles of this outcome driven work
for Belron:
• World Conversion Rate Increase:
2009 +5%, 2010 +10%, 2011 +15%, 2012 +25%
• If you’d like to develop a good one for your
company, talk to me first!
• Don’t over complicate it.
Editor's Notes
“A piece of paper with your design mockup. A customer in a shop or bookstore. Their finger is their mouse, the paper their screen. Where would they click? Do they know what these labels mean? Do they see the major routes out of the page? Any barriers? Congratulations, you just got feedback on your design, before writing a single freaking line of code or asking your developers to keep changing stuff.”
Create a suck index = pageviews * load time.
Here I show you some examples of well known brands, some of whom should know better. The larger the size of the page, the longer it will take to download and render on the device, especially when you don’t have perfect data conditions. The number of requests also makes a difference, as it’s inefficient on mobile to open lots of connections like this. In short, the smaller the page size and number of requests you can aim for, the better. I’m patient with bad data connections, but do people have the tolerance for 10–15 seconds on mobile? No – it has to happen much faster.
These are the results of a live test on a site, where an artificial delay is introduced in the performance testing. I’ve done some testing like this myself on desktop and mobile sites and can confirm this is true – you’re increasing bounce rate, decreasing conversion, site engagement… It doesn’t matter what metric you use, performance equals MONEY – or if not measured, a HUGE LOSS.
Performance also harms the lifeblood of e-commerce and revenue generating websites – repeat visitors! The gap here in one second of delay is enormous over time. You’re basically sucking a huge portion of potential business out of your site, with every additional bit of waiting time you add.
Add unique phone numbers to all your mobile sites and apps. That’s for starters. Then configure your analytics to collect data when people Click or Tap to make a phone call. Make sure you add other events like ringbacks, email, chat – any web forms or lead gen activity too.
So what does this graph say? That I have a long tail thing I want to talk to you about? No – this shows how much the ratio of phone to online conversion varies by keyword. Some keywords generate nearly 25 times the call volume of others, which is a huge differential. This means that if you thought you got ‘roughly’ the same proportion of phone calls for different marketing activity, you are wrong. What this graph tells me is that the last 2 years of my stats are basically a big dog poo.
Phone tracking costs you nothing – you can add it in a few minutes to your app or mobile website, by changing your analytics tracking. Now you can see exactly which bits of inbound marketing are driving telephone and other contact channels. If you have any sort of phone component in your service or support, the insight could be vital. You can take traffic by keyword, source, campaign or advert creative and work out the TRUE mix of conversion activity. And all this is also available on desktop too – by using dynamic numbers, we can track exactly the same stuff. Talk to this company: www.infinity-tracking.com