E Metrics Summit May 2006

Page 1: E Metrics Summit May 2006

[email protected] 05-May-06

Form improvements

Patricia Gildea, e-Delivery Manager, npower.com

E-metrics Summit, London, May 5, 2006

Web analytics in every decision:

from micro to macro

Page 2

• npower - one of the top UK utility companies

• Serving residential & business customers

• Website content & functionality includes:
  – Brand engagement, sponsorship, etc.
  – Corporate info
  – Marketing & sales
  – Customer service
  – Social action programmes, education, etc.

Overview

Page 3

• Over the 5-year life of this brand & site, we’ve moved from:
  – Log files, to
  – Basic web trends package, to
  – [unnamed] analytics package, to
  – Red Eye managed service

npower & web analytics

Page 4

• Why a managed service?
  – Very small team at the time
  – Little expertise in e-metrics
  – Business required extensive support in learning curve and ongoing reporting

• New vendor with managed service selected one year ago
  – Red Eye has been instrumental in moving us forward

npower & web analytics

Page 5

npower & web analytics

• So how is it used now?
  – On average 3–10x a week
  – Daily micro-decisions by web delivery team
  – Meso-design considerations on site journeys, sections, commercials
  – Macro-decisions on strategy & site structure
  – All examples here will be residential

Page 6

At the micro level

Examples - micro-decisions using analytics:

- Prioritising bug-fixing

- Prioritising browser support

- Retiring v. updating pages

- Navigation exposure

Page 7

At the micro level

• Circular journeys, frustrated feedback
  – Case study: Contact Us

• Minor text changes for an increase in conversion rates
  – Case study: “just skip it”

Page 8

Case study: Contact Us

• Noticed increase in website feedback asking for information that was already in Contact Us section of site.

• Analysed most popular paths - found circularities

• Then identified key area of user confusion

Page 9

Case study: Contact Us

• Before
  – “Electricity and Gas contacts” link not highly used, but should be
  – Details unintentionally buried one level down

Page 10

Case study: Contact Us

• Redesign:
  – Move these contact details up a level
  – Reorder link lists and hierarchy of customer service section to reflect most common areas of usage
  – New wireframe prepared, pages rebuilt
  – Section streamlined

Page 11

Case study: “just skip it”

• Focus on attrition rates through application form ahead of planned significant increase in e-marketing spend

• Plug holes in “leaky bucket”

• Application form: 7 steps

• Largest page-to-page attrition: step 4 to 5
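The attrition analysis above can be sketched in a few lines. This is an illustrative reconstruction, not npower's actual tooling; the step counts are invented, chosen only so that the largest drop falls between steps 4 and 5 as on the slide.

```python
# Hypothetical sketch: locating the largest page-to-page attrition in a
# multi-step application form. Step counts below are invented for
# illustration; only the 7-step structure comes from the talk.
def attrition_rates(step_counts):
    """Return (from_step, to_step, attrition) for each consecutive pair."""
    rates = []
    for i in range(len(step_counts) - 1):
        entered, continued = step_counts[i], step_counts[i + 1]
        rates.append((i + 1, i + 2, 1 - continued / entered))
    return rates

counts = [1000, 820, 700, 650, 390, 360, 340]  # visitors reaching steps 1..7
rates = attrition_rates(counts)
worst = max(rates, key=lambda r: r[2])
print(f"Largest drop: step {worst[0]} to {worst[1]} ({worst[2]:.0%} of users lost)")
# → Largest drop: step 4 to 5 (40% of users lost)
```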

Page 12

Case study: “just skip it”

[Screenshot of form fields, annotated:]
– Supply Number is not mandatory.
– Meter point reference number is not mandatory.

Page 13

Case study: “just skip it”

Added (to each field): “Don’t know this? Just skip it.”

Page 14

Case study: “just skip it”

Results

• 3% improvement on this step alone due to just this tiny change

• Cautionary note! Use judiciously: similar changes may increase deletions, churn, and back-office costs.

Page 15

At the meso-design level

• Case study: connecting journeys

– Core acquisition journey for residential supply signups

– Savings calculator compares npower prices against existing supplier

– Application form (electronic contract)

Page 16

Case study: connecting journeys

• Part of attrition rates study ahead of increase in e-marketing spend

• Proposal: Connect application form to savings calculator

• Purpose: reduce attrition and increase conversion through reduced user inputs and reduced opportunity to exit the journey

Page 17

Case study: connecting journeys

After connection: increase in % of users moving between steps

Step 1 to 2   28.84%
Step 2 to 3    3.72%
Step 3 to 4    2.13%
Step 4 to 5   20.98%
Step 5 to 6    2.54%
Step 6 to 7    3.09%

Page 18

Case study: connecting journeys

• Looks pretty good, right?

• Sales crashed by over 50%!

• Why??

• Two reasons: price and required data

Page 19

Case study: connecting journeys

Price issues:
• All users then forced through calculator
• At that time, we were not aggressively competitive on price in this channel
• Therefore all users (= prospects) were exposed to pricing strategy
• Only small numbers of areas/payment methods/consumption journeys completed

Page 20

Case study: connecting journeys

Required data issues:
• All users forced through calculator
• Therefore, to sign up, users (= prospects) now had to know current supplier, current tariff, current spend/consumption
• Also, users/prospects who wanted to sign up not on savings but on values, brand, or sponsorship were forced through an irrelevant savings journey

Page 21

Case study: connecting journeys

• Since then, the Sign Online tariff has been introduced (very competitive)

• Journey options further developed: the benefit of connection (reduced burden on the user) is maintained, but calculator usage is not forced

Page 22

Case study: connecting journeys

Increase in % of users moving between steps:

              Connect   Disconnect
Step 1 to 2    28.84%        3.17%
Step 2 to 3     3.72%       -1.56%
Step 3 to 4     2.13%       -1.85%
Step 4 to 5    20.98%       -2.15%
Step 5 to 6     2.54%       -0.26%
Step 6 to 7     3.09%        2.76%

Page 23

Case study: connecting journeys

• Total volume of sales increased

• Decrease in user complaints about wanting to sign up on brand/values (e.g. green) but being forced to calculate savings

• Also decrease in complaints about not having arcane details to hand (e.g. tariff)

Page 24

At the macro-analysis level

• Largest ‘leaky bucket’ holes plugged (ongoing development project)

• Time to start pumping volumes into site

• E-marketing campaigns: banners & skyscrapers, email and PPC

• First significant campaigns launched

Page 25

Case study: early campaigns

• Typical banner campaign set-up
• Commercial success measured by Cost per Contract (CPC) plus volume
• Results: low cost per arrival, high click-through to first step of calculator or application form (disconnected journeys), BUT poor CPC and volumes
• Anecdotal evidence of post-impression issues
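The commercial measure named on this slide can be sketched as a formula. Note that CPC here means cost per *contract*, as the slide defines it, not cost per click; all figures below are invented for illustration.

```python
# Hypothetical sketch of the Cost per Contract (CPC) measure.
# Numbers are invented to show how a campaign can be cheap per
# arrival yet poor per contract.
def cost_per_contract(media_spend, contracts):
    if contracts == 0:
        return float("inf")  # no sales: cost per contract is unbounded
    return media_spend / contracts

spend = 50_000.0    # assumed media spend
arrivals = 100_000  # cheap per arrival: 0.50 per visit
contracts = 250     # but few arrivals convert to contracts
print(f"cost per arrival:  {spend / arrivals:.2f}")   # 0.50
print(f"cost per contract: {cost_per_contract(spend, contracts):.2f}")  # 200.00
```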

Page 26

Case study: later campaigns

• Subsequent campaign trials included post-impression, post-session behaviour measurement

• How do consumers actually buy electricity & gas online?

• The “considered purchase” debate

Page 27

Case study: later campaigns

• Banners
  – <1% of ads served resulted in a click; <1% of arrivals converted.
  – However, ‘post-impression’ customer acquisition increases by 500%

• PPC
  – 3.7% of arrivals result in a contract
  – However, over the lifetime of the visit this rises to a 4.5% conversion rate and a 22% increase in sales
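The PPC figures above come down to how wide the attribution window is. The counts below are invented; only the 3.7% / 4.5% pattern (same-session measurement undercounting total acquisition) comes from the talk.

```python
# Hypothetical sketch: widening the attribution window from the arrival
# session to the lifetime of the visit changes the measured conversion rate.
arrivals = 10_000
contracts_same_session = 370    # convert during the arrival session (3.7%)
contracts_later_sessions = 80   # further sales over the visit lifetime

click_rate = contracts_same_session / arrivals
lifetime_rate = (contracts_same_session + contracts_later_sessions) / arrivals
print(f"same-session: {click_rate:.1%}, visit lifetime: {lifetime_rate:.1%}")
# → same-session: 3.7%, visit lifetime: 4.5%
```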

Page 28

Case study: later campaigns

• The challenge now is to further understand and measure post-session behaviour

• Issue of integrating RedEye metrics with multiple other campaign vendor tagging & measurements (e.g. MSN)

Page 29

Case study: homepage strategy

• npower.com serves B2C, B2B and corporate audiences

• Homepage had become a free-for-all; no clear strategy, no clear priorities in use of real estate

• “Squeaky-wheel” design

Page 31

Case study: homepage strategy

• Strategy project affirmed:
  – npower.com is a retail-level asset
  – Homepage needs balance & simplification in structure
  – ‘Challenger’ brand campaign required re-branding
  – Significantly reduced real-estate allocations required a set of decision-rules to manage

Page 33

Case study: homepage strategy

• Decision-rules
  – Every campaign, new product, promotion or initiative should have a predicted NPV and predicted web usage (from its business case).
  – Replacement rules compare the incumbent’s predicted NPV and usage with its adjusted NPV and usage, and then against the predicted NPV and usage of the new initiative.

Page 34

Case study: homepage strategy

For example: Campaign A has NPV = 10 & usage = 1000/wk.

A launches on day 1. Ten days later, Campaign B is briefed in for launch on day 30. Campaign A’s NPV and usage are then adjusted based on actuals from days 1 to 15, projected to day 30, and compared against the predicted NPV and usage for B. To replace A, B must be predicted to outperform A.

If yes, then B replaces A. However, B is monitored, and if B doesn’t outperform the predicted A metrics, B could be pulled and replaced with A.
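The replacement rule in this worked example can be sketched as code. This is an illustrative reconstruction, not npower's implementation; the class, function name, and adjusted figures are all assumptions for the sketch.

```python
# Hypothetical sketch of the homepage real-estate replacement rule:
# a challenger initiative replaces the incumbent only if it is predicted
# to outperform the incumbent's adjusted (actuals-based) projections.
from dataclasses import dataclass

@dataclass
class Initiative:
    name: str
    npv: float    # predicted or adjusted NPV
    usage: float  # predicted or adjusted web usage per week

def should_replace(incumbent: Initiative, challenger: Initiative) -> bool:
    """Challenger must beat the incumbent on both NPV and usage."""
    return challenger.npv > incumbent.npv and challenger.usage > incumbent.usage

# Campaign A launched with predicted NPV 10, usage 1000/wk; by day 15 its
# projection, adjusted on actuals, has slipped. Campaign B is briefed in:
a_adjusted = Initiative("A", npv=8.0, usage=900.0)     # assumed adjusted figures
b_predicted = Initiative("B", npv=12.0, usage=1100.0)  # assumed predictions
print(should_replace(a_adjusted, b_predicted))  # → True: B replaces A
```

Once live, B would itself be monitored against A's predicted metrics, so the same comparison runs in reverse and B can be pulled.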

Page 35

Case study: homepage strategy

• Decision-rules
  – Web analytics critical to analysis of predicted and adjusted usage
  – Web analytics become the lens on reality, combating “audience of one” decision-making

• Measuring success of new homepage
  – Benchmarks and comparison reports of before and after

Page 36

Case study: homepage strategy

• We will be measuring:
  – Immediate exits from homepage, including duration
  – Top 10 journeys’ completion rate
  – Customer frustration level
  – “Wandering journeys”
  – Split click-throughs of B2C v. B2B

Page 37

The future

• Challenges will include:
  – Measuring segmented journeys with targeted content: “Conversion Enhancement”
  – Converting to a dynamic content management system and measuring dynamic pages
  – Measuring multi-channel experiences & journeys

