Confessions of an Uber-Optimiser - Conversion Summit - Craig Sullivan - v1.9


The top reasons your CRO isn't working - includes a large resource pack you can take away and use.

Transcript

Confessions of an Uber-Optimiser

5th Sep 2013 @OptimiseOrDie


Timeline

–1998 | 1999-2004 | 2004-2008 | 2008-2012 (Belron Brands)

SEO, PPC, UX, Analytics, A/B and multivariate testing, Customer Satisfaction, Design, QA, Development, Performance

40+ websites, 34 countries, 19 languages, €1bn+ revenue, 8 people

Ahh, how it hurt

"If you're not a part of the solution, there's good money to be made in prolonging the problem."

Out of my comfort zone…


Behind enemy lines…


Nice day at the office, dear?


Competition…

Traffic is harder!

SEO/PPC

Panguin tool…

Casino Psychology

If it isn’t working, you’re not doing it right


#1 : Your analytics is cattle trucked



#1 : Common problems (GA)

• Dual purpose goal page – one page used for two outcomes, and not split
• Cross domain tracking – where you jump between sites, this borks the data
• Filters not correctly set up – your office, agencies and developers are skewing the data
• Code missing or double code – causes visit splitting, double pageviews and skewed bounce rates
• Campaign, social, email tracking etc. – external links you generate are not set up to record properly
• Errors not tracked (404, 5xx, other) – you are unaware of error volumes, locations and impact
• Dual flow funnels – flows join in the middle of a funnel or loop internally
• Event tracking skews bounce rate – if an event is set as 'interactive', it can skew bounce rate (see the snippet below and the example that follows)
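A quick illustration of that last point – a minimal analytics.js sketch (the category, action and label names are placeholders; classic ga.js has an equivalent opt_noninteraction argument):

```typescript
// analytics.js exposes a global `ga` command queue; declared here so the sketch is self-contained.
declare function ga(command: string, ...fields: unknown[]): void;

// An event that fires without the user doing anything (e.g. a carousel auto-rotating)
// should be flagged as non-interaction, otherwise every landing visit looks
// "engaged" and your bounce rate collapses.
ga('send', 'event', 'Carousel', 'auto-rotate', 'Homepage hero', {
  nonInteraction: true,
});

// A genuine user action can stay interactive and legitimately affect bounce rate.
ga('send', 'event', 'Carousel', 'click', 'Homepage hero');
```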


#1 : Common problems (GA) – EXAMPLE

Landing | 1st interaction | Loss  | 2nd interaction | Loss  | 3rd interaction | Loss  | 4th interaction | Loss
55,900  | 527             | 99.1% | 66              | 87.5% | 55              | 16.7% | 33              | 40.0%
30,900  | 4,120           | 86.7% | 2,470           | 40.0% | 1,680           | 32.0% | 1,240           | 26.2%
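To be clear about how to read the table: each Loss figure is simply the share of the previous step's visitors who dropped out before the next interaction. A tiny sketch of the arithmetic, using the first row above:

```typescript
// Visitors remaining at each step (first row of the example above).
const steps = [55900, 527, 66, 55, 33];

// Loss at each step = share of the previous step's visitors who dropped out.
const losses = steps.slice(1).map((count, i) => 1 - count / steps[i]);

console.log(losses.map(l => `${(l * 100).toFixed(1)}%`));
// -> [ '99.1%', '87.5%', '16.7%', '40.0%' ]
```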

#1 : Solutions

• Get a health check for your analytics – try @prwd, @danbarker, @peter_oneill or ask me!
• Invest continually in instrumentation – aim for at least 5% of dev time to fix and improve
• Stop shrugging: plug your insight gaps – change 'I don't know' to 'I'll find out'
• Look at event tracking (Google Analytics) – if set up correctly, you get wonderful insights
• Would you use paper instead of a till? – you wouldn't do it in retail, so stop doing it online!
• How do you win F1 races? – with the wrong performance data, you won't


#2 : Your inputs are all wrong

Insight inputs #FAIL: competitor copying, guessing, dice rolling, an article the CEO read, competitor change, panic, ego, opinion, cherished notions, marketing whims, cosmic rays, not 'on brand' enough, IT inflexibility, internal company needs, some dumbass consultant, shiny feature blindness, knee-jerk reactions.

Insight inputs that work: segmentation, surveys, sales and call centre, session replay, social analytics, customer contact, eye tracking, usability testing, forms analytics, search analytics, voice of customer, market research, A/B and MVT testing, big & unstructured data, web analytics, competitor evaluations, customer services.

#2 : Solutions

• Usability testing and user centred design – if you're not doing this properly, you're hosed
• Champion UX+ – with added numbers; (re)designing without inputs and numbers is guessing
• You need one team on this, not silos – stop handing round the baby (I'll come back to this)
• Ego, opinion and cherished notions fill gaps – fill these vacuums with insights and data
• Champion the users – someone needs to take their side!
• You need multiple tool inputs – let me show you my core list

#2 : Core tools

• Properly set up analytics – without this foundation, you're toast
• Session replay tools – Clicktale, Tealeaf, SessionCam and more…
• Cheap / crowdsourced usability testing – see the resource pack for more details
• Voice of Customer / feedback tools – 4Q, Kampyle, Qualaroo, Usabilla and more…
• A/B and multivariate testing – Optimizely, Google Content Experiments, VWO
• Email, browser and mobile testing – you don't know if it works unless you check

#3 : You’re not testing (enough)


#3 : Common problems

• Let's take a quick poll – how many tests do you complete a month?
• Not enough resource – you MUST hire, invest and ringfence time and staff for CRO
• Testing has gone to sleep – some vendors have a 'rescue' team for these accounts
• Vanity testing takes hold – getting one test done a quarter? Still showing it a year later?
• You keep testing without buy-in at C-level – if nobody sees the flower, was it there?
• You haven't got a process, just a plugin – Insight, Brainstorm, Wireframe, Design, Build, QA test, Monitor, Analyse. Tools, process, people, time -> INVEST
• IT or release barriers slow down work – circumvent with tagging tools; develop ways around the innovation barrier

#4 : Not executing fast enough


• Silo mentality means 'pass the product' – no 'one team' approach means no 'one product'
• The process is badly designed – see the resource pack or ask me later!
• People mistake hypotheses for finals – endless argument and tweaking means NO TESTING; let the test decide, please!
• No clarity of authority or decision making – you need a strong leader to get things decided
• Signoff takes far too long – signoff by committee is a velocity killer; the CUSTOMER and the NUMBERS are the signoff
• You set your target too low – aim for a high target and keep increasing it


#4 : Execution solutions

• Agile, one-team approach – everyone works on the lifecycle, together
• Hire polymaths – T-shaped or just multi-skilled; I hire them a lot
• Use collaborative tools, not meetings – see the resource pack
• Market the results – market this stuff internally like a PR agency; encourage betting in the office
• Smash down silos – a special mission – involve the worst offenders in the hypothesis team; "hold your friends close, and your enemies closer"; work WITH the developers to find solutions; ask developers and IT for solutions, not apologies

#5 : Product cycles are too long

(Chart: conversion rate over 0-18 months between product releases.)

#5 : Solutions

• Give priority boarding for opportunities – the best seats reserved for metric shifters
• Release more often to close the gap – more testing resource helps, plus an analytics 'hawk eye'
• Kaizen – continuous improvement – others call it JFDI (just f***ing do it)
• Make changes AS WELL as tests, basically! – these small things add up
• RUSH Hair booking – over 100 changes, no functional changes at all, 37% improvement
• In between product lifecycles? – the added lift for 10 days' work, worth 360k

#5 : Make your own cycles

"Rather than try and improve one thing by 10% – which would be very, very difficult to do – we go and find 1,000 things and improve them all by a fraction of a per cent, which is totally do-able." – Chris Boardman
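The arithmetic behind that idea is compounding: many tiny gains multiply rather than add. A quick illustration with invented numbers:

```typescript
// 200 small improvements of 0.1% each, applied multiplicatively.
const smallGain = 1.001;
const changes = 200;
const overallLift = Math.pow(smallGain, changes) - 1;

console.log(`${(overallLift * 100).toFixed(1)}% overall`); // ≈ 22.1%
```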


#6 : No Photo UX

• Persuasion / influence / direction / explanation
• Helps people process information and stories
• Vital to sell an 'experience'
• Helps people recognise and discriminate between things
• Supports scanning visitors
• Drives emotional response
• short.cx/YrBczl
• Very powerful and under-estimated area
• I've done over 20M visitor tests with people images for a service industry – some tips:
• The person, pose, eye gaze, facial expressions and body language cause visceral emotional reactions and big changes in behaviour
• Eye gaze is crucial – to engage you or to 'point'

Photo UX

• Negative body language is a turn-off
• Uniforms and branding are a positive (ball cap)
• Hands are hard to handle – use a prop to help
• For ecommerce – tip! Test bigger images!
• Autoglass and Belron always use real people
• In most countries (out of 33) with strong female and male images in test, the female image wins
• Smile and authenticity in these examples are absolutely vital
• So, I have a question for you

+13.9%

+5.9%

Terrible Stock Photos : headsethotties.com & awkwardstockphotos.com
Laughing at Salads : womenlaughingwithsalad.tumblr.com
BBC Fake Smile Test : bbc.in/5rtnv

SPAIN : +22% over control, 99% confidence

"It's not about what you think when you look at the design – it's about the reaction it causes in the mind of the viewer. Always design for that first."

#7 : Your tests are cattle trucked

• Many tests fail due to QA or browser bugs – always do cross-browser QA testing (see resources)
• Don't rely on developers saying 'yes' – use your analytics to define the list to test
• Cross-instrument your analytics – you need this to check the test software works
• Store the variant(s) seen in analytics – compare people who saw A/B/A vs. A/B/B (see the sketch below)
• Segment your data to find variances – failed tests usually show differences for segments
• Watch the test and analytics CLOSELY – after you go live, religiously check both; read this article: stanford.io/15UYov0
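One way to store the variant seen is a Universal Analytics custom dimension set before the pageview, so every hit carries it – a minimal sketch; the dimension slot and variant name are placeholders you would map to your own configuration:

```typescript
// analytics.js exposes a global `ga` command queue; declared here so the sketch is self-contained.
declare function ga(command: string, ...fields: unknown[]): void;

// Whatever your testing tool reports as the bucket this visitor landed in (placeholder value).
const variant = 'checkout-test-B';

// Set the custom dimension before sending the pageview so the hit carries it,
// then segment any GA report by that dimension to cross-check the test tool's numbers.
ga('set', 'dimension1', variant);
ga('send', 'pageview');
```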

#8 : Stats are confusing

• Many testers and marketing people struggle:
  – How long will it take to run the test?
  – Is the test ready?
  – How long should I keep it running for?
  – It says it's ready after 3 days – is it?
  – Can we close it now? The numbers look great!
  – (a rough sample-size sketch follows below)
• A/B testing maths for dummies: http://bit.ly/15UXLS4
• For more advanced testers, read this: http://bit.ly/1a4iJ1H
• I'm going to build a stats course – to explain all the common questions, and to save me having to explain this crap all the time
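Most of those questions boil down to sample size. A back-of-the-envelope sketch using the standard normal-approximation formula for comparing two conversion rates (95% confidence, 80% power; the baseline rate and uplift are invented):

```typescript
// Approximate visitors needed per variant to detect a relative uplift,
// using the two-proportion normal approximation.
function sampleSizePerVariant(baseline: number, relativeUplift: number): number {
  const p1 = baseline;
  const p2 = baseline * (1 + relativeUplift);
  const zAlpha = 1.96; // 95% confidence (two-sided)
  const zBeta = 0.84;  // 80% power
  const variance = p1 * (1 - p1) + p2 * (1 - p2);
  return Math.ceil((Math.pow(zAlpha + zBeta, 2) * variance) / Math.pow(p2 - p1, 2));
}

// e.g. a 3% baseline conversion rate and a hoped-for 10% relative lift:
console.log(sampleSizePerVariant(0.03, 0.10)); // ≈ 53,000 visitors per variant
// Divide by your daily traffic per variant to estimate how long the test must run.
```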


#9 : You're not segmenting

• Averages lie – what about new vs. returning visitors? Different keyword groups? Landing pages? Routes? Attributes?
• Failed tests are just 'averaged out' – you must look at segment level data and integrate the analytics with the A/B test software (see the sketch below)
• The downside? You'll need more test data in order to segment
• The upside? It helps figure out why a test didn't perform, finds value in failed or 'no difference' tests, and drives further testing focus
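A sketch of what 'segment level data' means in practice – the same test, split by segment. All counts are invented, and chosen so the blended result looks flat while the segments pull in opposite directions:

```typescript
interface SegmentResult {
  segment: string;
  visitorsA: number; conversionsA: number;
  visitorsB: number; conversionsB: number;
}

// Invented numbers: overall the test looks flat, but the segments disagree.
const results: SegmentResult[] = [
  { segment: 'new visitors',       visitorsA: 12000, conversionsA: 360, visitorsB: 12100, conversionsB: 430 },
  { segment: 'returning visitors', visitorsA: 8000,  conversionsA: 400, visitorsB: 7900,  conversionsB: 330 },
];

for (const r of results) {
  const rateA = r.conversionsA / r.visitorsA;
  const rateB = r.conversionsB / r.visitorsB;
  const lift = (rateB / rateA - 1) * 100;
  console.log(`${r.segment}: ${lift.toFixed(1)}% lift`);
}
// -> roughly +18% for new visitors and -16% for returning visitors;
//    the blended "average" of ~0% hides both effects.
```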


#10 : You're unichannel optimising

• Not using call tracking – look at Infinity Tracking (UK); get Google keyword level call volumes!
• You don't measure channel switchers – people who bail out of a funnel and call, or who use chat or other contact/sales routes (a small sketch follows below)
• You 'forget' mobile and tablet journeys – walk the path from search -> PPC/SEO -> site; optimise for all your device mix and journeys
• You're responsive – testing may now bleed across device platforms; changing in one place may impact many others; QA, device and browser testing are even more vital
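One cheap way to at least see funnel-to-phone switchers is to fire an analytics event when someone taps a tel: link – a minimal sketch assuming Universal Analytics; the category and action names are placeholders:

```typescript
// analytics.js exposes a global `ga` command queue; declared here so the sketch is self-contained.
declare function ga(command: string, ...fields: unknown[]): void;

// Record a GA event whenever a visitor taps a phone number link,
// so "bailed out of the funnel and called us" at least shows up in the data.
document.querySelectorAll('a[href^="tel:"]').forEach((link) => {
  link.addEventListener('click', () => {
    ga('send', 'event', 'Contact', 'call-click', window.location.pathname);
  });
});
```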


SUMMARY : The best companies…

• Invest continually in analytics instrumentation, tools and people
• Use an Agile, iterative, cross-silo, one-team project culture
• Prefer collaborative tools to having lots of meetings
• Prioritise development based on numbers and insight
• Practice real continuous product improvement, not SLED
• Source photos and copy that support persuasion and utility
• Have cross-channel, cross-device design, testing and QA
• Segment their data for valuable insights, every test or change
• Continually try to reduce cycle (iteration) time in their process
• Blend 'long' design, continuous improvement AND split tests
• Make optimisation the engine of change, not the slave of ego
• See the Maturity Model in the resource pack


So you want examples?

• Belron – Ed Colley
• Dell – Nazli Yuzak
• Shop Direct – Paul Postance (now with EE)
• Expedia – Oliver Paton
• Schuh – Stuart McMillan
• Soundcloud – Eleftherios Diakomichalis & Ole Bahlmann
• Gov.uk – Adam Bailin (now with the BBC)

Read the gov.uk principles : www.gov.uk/designprinciples

And my personal favourite of 2013 – Airbnb!


Is there a way to fix this then? Conversion Heroes!

• You're in the right place today!
• This work is rarely easy – it always involves doing LOTS of things, not just one test a quarter
• Invest in people, tools, analytics and techniques, but most of all a process and strategy – cool tools are not enough
• Stop putting things in the next release (JFDI)
• This is not a bolt-on – it IS the process
• Don't be afraid to fail – you're learning
• Be brave, be bold and most importantly…
• Never ever ever EVER give up
• Enjoy the wonderful lineup…

Don't panic!


Email : sullivac@gmail.com
Twitter : @OptimiseOrDie
LinkedIn : linkd.in/pvrg14

More reading. Download the slides! Questions…

RESOURCE PACK

• Maturity model
• Crowdsourced UX
• Collaborative tools
• Testing tools for CRO & QA
• Belron methodology example
• CRO and testing resources

1 - Maturity Model

Levels: Level 1 – Starter level | Level 2 – Early maturity | Level 3 – Serious testing | Level 4 – Core business value | Level 5 – You rock, awesomely

• Mission: get buy-in -> prove ROI -> scale the testing -> mine value -> continual improvement
• Culture: ad hoc, local heroes, chaotic good -> dedicated team -> cross-silo team, systematic tests -> ninja team, testing in the DNA
• Process: small team, low hanging fruit -> outline process -> well developed -> streamlined -> company wide
• Testing focus: guessing, A/B testing, basic tools -> + multivariate, session replay, no segments -> + funnel optimisation, call tracking, some segments, micro testing, volume opportunities -> + cross-channel testing, integrated CRO and analytics, segmentation -> + spread tool use, dynamic adaptive targeting, machine learning, realtime
• Analytics focus: bounce rates, big volume landing pages -> + funnel analysis, low converting and high loss pages -> + offline integration, single channel picture -> + funnel fixes, forms analytics, channel switches -> multichannel funnels, cross-channel synergy
• Insight methods: analytics, surveys, contact centre, low budget usability -> + regular usability testing/research, prototyping, session replay, onsite feedback -> + user centred design, layered feedback, mini product tests -> + customer satisfaction scores tied to UX, rapid iterative testing and design -> + all-channel view of customer, driving offline using online, all promotion driven by testing

2 - UX Crowd tools

Remote UX tools (P = panel, S = site recruited, B = both)
• Usertesting (B) – www.usertesting.com
• Userlytics (B) – www.userlytics.com
• Userzoom (S) – www.userzoom.com
• IntuitionHQ (S) – www.intuitionhq.com
• Mechanical Turk (S) – www.mechanicalturk.com
• Loop11 (S) – www.loop11.com
• Open Hallway (S) – www.openhallway.com
• What Users Do (P) – www.whatusersdo.com
• Feedback Army (P) – www.feedbackarmy.com
• Userfeel (P) – www.userfeel.com
• Ethnio (for recruiting) – www.ethnio.com

Feedback on prototypes / mockups
• Pidoco – www.pidoco.com
• Verify from Zurb – www.verifyapp.com
• Five Second Test – www.fivesecondtest.com
• ConceptShare – www.conceptshare.com
• Usabilla – www.usabilla.com

3 - Collaborative Tools

Oh sh*t


3.1 - Join.me


3.2 - Pivotal Tracker


3.3 – Trello


3.4 - Basecamp


3.5 - Google Docs and Automation

• Lots of people don't know this
• Serious time is getting wasted on pulling and preparing data
• Use the Google API to roll your own reports straight into Big G
• Google Analytics + API + Google Docs integration = A BETTER LIFE!
• Hack your way to having more productive weeks
• Learn how to do this to make completely custom reports – see the sketch below
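A minimal Google Apps Script style sketch of that idea, assuming the Analytics advanced service is enabled for the script; the view ID, dates, metrics and sheet layout are placeholders:

```typescript
// These globals are provided by the Google Apps Script runtime (with the
// "Google Analytics" advanced service switched on); declared here for a standalone sketch.
declare const Analytics: any;
declare const SpreadsheetApp: any;

// Pulls a daily sessions/transactions report straight into the active sheet.
function pullDailyReport(): void {
  const report = Analytics.Data.Ga.get(
    'ga:12345678',              // placeholder view ID
    '2013-08-01', '2013-08-31', // placeholder date range
    'ga:sessions,ga:transactions',
    { dimensions: 'ga:date' }
  );

  const rows: string[][] = report.rows || [];
  if (rows.length > 0) {
    const sheet = SpreadsheetApp.getActiveSheet();
    // Write the raw rows starting at row 2, leaving row 1 for headers.
    sheet.getRange(2, 1, rows.length, rows[0].length).setValues(rows);
  }
}
```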


3.6 - Cloud Collaboration

• LucidChart

3.7 - Cloud Collaboration

• Webnotes

3.8 - Cloud Collaboration

• Protonotes

3.9 - Cloud Collaboration

• Conceptshare

4 – QA and Testing tools

Email testing : www.litmus.com, www.returnpath.com, www.lyris.com

Browser testing : www.crossbrowsertesting.com, www.cloudtesting.com, www.multibrowserviewer.com, www.saucelabs.com

Mobile devices : www.perfectomobile.com, www.deviceanywhere.com, www.mobilexweb.com/emulators, www.opendevicelab.com

5 - Methodologies - Lean UX

Positive
– Lightweight and very fast methods
– Realtime or rapid improvements
– Documentation light, value high
– Low on wastage and frippery
– Fast time to market, then optimise
– Allows you to pivot into new areas

Negative
– Often needs user test feedback to steer the development, as data is not enough
– Bosses distrust stuff where the outcome isn't known

"The application of UX design methods into product development, tailored to fit Build-Measure-Learn cycles."

5 - Agile UX / UCD / Collaborative Design

Positive
– User centric
– Goals met substantially
– Rapid time to market (especially when using Agile iterations)

Negative
– Without quant data, user goals can drive the show – missing the business sweet spot
– Some people find it hard to integrate with siloed teams
– Doesn't work with waterfall IMHO

Cycle stages: Research, Concept, Wireframe, Prototype, Test, Analyse

"An integration of User Experience Design and Agile* Software Development Methodologies"
*Sometimes


5 - Lean Conversion Optimisation

Positive
– A blend of several techniques
– Multiple sources of qual and quant data aid triangulation
– CRO analytics focus drives unearned value inside all products

Negative
– Needs a one-team approach with a strong PM who is a polymath (commercial, analytics, UX, technical)
– Only works if your teams can take the pace – you might be surprised though!

"A blend of User Experience Design, Agile PM, Rapid Lean UX Build-Measure-Learn cycles, triangulated data sources, triage and prioritisation."

5 - Lean CRO

Stages: Inspection, Immersion, Identify, Triage & Triangulate, Outcome Streams, Instrument, Measure, Learn.

5 - Triage and Triangulation

• Starts with the analytics data
• Then UX and user journey walkthrough from SERPs -> key paths
• Then back to analytics data for a whole range of reports: segmented reporting, traffic sources, device viewport and browser, platform (tablet, mobile, desktop) and many more
• We use other tools or insight sources to help form hypotheses
• We triangulate with other data where possible
• We estimate the potential uplift of fixing/improving something, as well as the difficulty (time/resource/complexity/risk)
• A simple quadrant shows the value clusters (sketched below)
• We then WORK the highest and easiest scores by…
• Turning every opportunity spotted into an OUTCOME

"This is where the smarts of CRO are – in identifying the easiest stuff to test or fix that will drive the largest uplift."


5 - The Bucket Methodology

"Helps you to stream actions from the insights and prioritisation work. Forces an action for every issue, a counter for every opportunity being lost."

Test – If there is an obvious opportunity to shift behaviour, expose insight or increase conversion, this bucket is where you place stuff for testing. If you have traffic and leakage, this is the bucket for that issue.

Instrument – If an issue is placed in this bucket, it means we need to beef up the analytics reporting. This can involve fixing, adding or improving tag or event handling in the analytics configuration. We instrument both structurally and for insight in the pain points we've found.

Hypothesise – This is where we've found a page, widget or process that's just not working well, but we don't see a clear single solution. Since we need to really shift the behaviour at this crux point, we'll brainstorm hypotheses. Driven by evidence and data, we'll create test plans to find the answers to the questions and change the conversion or KPI figure in the desired direction.

Just Do It – JFDI is a bucket for issues where a fix is easy to identify or the change is a no-brainer. Items marked with this flag can either be deployed in a batch or as part of a controlled test. Stuff in here requires low effort or is a micro-opportunity to increase conversion, and should be fixed.

Investigate – You need to do some testing with particular devices or need more information to triangulate a problem you spotted. If an item is in this bucket, you need to ask questions or do further digging.
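A small sketch of the "every finding gets exactly one bucket and one action" discipline – the type and the example issues are invented for illustration:

```typescript
type Bucket = 'Test' | 'Instrument' | 'Hypothesise' | 'Just Do It' | 'Investigate';

interface Finding {
  issue: string;
  bucket: Bucket;
  nextAction: string;
}

// Every issue coming out of triage gets exactly one bucket and one next action.
const findings: Finding[] = [
  { issue: 'Huge drop-off on delivery options step', bucket: 'Hypothesise',
    nextAction: 'Brainstorm and write a test plan' },
  { issue: 'No event tracking on form errors',       bucket: 'Instrument',
    nextAction: 'Add error events to analytics' },
  { issue: 'Typo in the basket summary',             bucket: 'Just Do It',
    nextAction: 'Fix in the next release batch' },
];

findings.forEach(f => console.log(`[${f.bucket}] ${f.issue} -> ${f.nextAction}`));
```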

5 - Belron example – Funnel replacement

Final prototype -> Usability issues left -> Final changes -> Release build
Legal review kickoff -> Customer services review kickoff -> Marketing review -> Test plan -> Signoff (Legal, Mktng, CCC)
Instrument analytics -> Instrument contact centre -> Offline tagging -> QA testing -> End-to-end testing
Launch 90/10% -> Monitor -> Launch 80/20% -> Monitor < 1 week -> Launch 50/50% -> Go live 100%
Analytics review -> Washup and actions -> New hypotheses -> New test design -> Rinse and repeat!

6 - CRO and Testing resources

• 101 Landing page tips : slidesha.re/8OnBRh
• 544 Optimisation tips : bit.ly/8mkWOB
• 108 Optimisation tips : bit.ly/3Z6GrP
• 32 CRO tips : bit.ly/4BZjcW
• 57 CRO books : bit.ly/dDjDRJ
• CRO article list : bit.ly/nEUgui
• Smashing Mag article : bit.ly/8X2fLk

END SLIDES

Feel free to steal, re-use, appropriate or otherwise lift stuff from this deck.

If it was useful to you – email me or tweet me and tell me why – I’d be DELIGHTED to hear!

Regards,

Craig.