Seeing the Elephant: Defragmenting User Research

Posted 17-Aug-2014



Description

Presented at UXPA Boston May 2014, Interaction S.A. (Recife, Brazil) November 2013, and Intuit (Mountain View, CA, USA) October 2013; an earlier version was given in 2013 in NYC at Designers + Geeks. Given in 2012 at UX Russia (http://uxrussia.com/), UX Hong Kong (http://www.uxhongkong.com/), and WebVisions NYC (http://www.webvisionsevent.com/new-york/). Given in 2011 at the IA Summit (http://2011.iasummit.org/), UX-Lisbon (http://ux-lx.com), and Love at First Website (http://www.isitedesign.com/love/). This is something of a successor to my talk "Marrying Web Analytics and User Experience" (http://is.gd/vK34zS).

Transcript

Seeing the Elephant: Defragmenting User Research

Lou Rosenfeld • lou@rosenfeldmedia.com • UXPA Boston • May 15, 2014

November 14, 2013: User Researcher-in-Chief Barack Obama

What does victory look like?

User research in today’s organization

Reports from the user research group

Query data gleaned from the site search team

XXX.XXX.X.104 - - [10/Jul/2006:10:25:46 -0800] "GET /search?access=p&entqr=0&output=xml_no_dtd&sort=date%3AD%3AL%3Ad1&ud=1&site=AllSites&ie=UTF-8&client=www&oe=UTF-8&proxystylesheet=www&q=lincense+plate&ip=XXX.XXX.X.104 HTTP/1.1" 200 971 0 0.02

XXX.XXX.X.104 - - [10/Jul/2006:10:25:48 -0800] "GET /search?access=p&entqr=0&output=xml_no_dtd&sort=date%3AD%3AL%3Ad1&ie=UTF-8&client=www&q=license+plate&ud=1&site=AllSites&spell=1&oe=UTF-8&proxystylesheet=www&ip=XXX.XXX.X.104 HTTP/1.1" 200 8283 146 0.16

XXX.XXX.XX.130 - - [10/Jul/2006:10:24:38 -0800] "GET /search?access=p&entqr=0&output=xml_no_dtd&sort=date%3AD%3AL%3Ad1&ud=1&site=AllSites&ie=UTF-8&client=www&oe=UTF-8&proxystylesheet=www&q=regional+transportation+governance+commission&ip=XXX.XXX.X.130 HTTP/1.1" 200 9718 62 0.17
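(An aside for readers of this transcript: query data like the above can be pulled straight from the logs. A minimal sketch in Python, assuming logs in the Google Search Appliance format shown here; the file name search.log is a placeholder.)

import re
from urllib.parse import unquote_plus

# Matches the q= parameter in a logged GET /search?... request line.
QUERY_RE = re.compile(r'[?&]q=([^&\s"]+)')

def extract_queries(path):
    """Yield the decoded q= parameter from each search log line."""
    with open(path, encoding="utf-8") as log:
        for line in log:
            match = QUERY_RE.search(line)
            if match:
                # unquote_plus turns "lincense+plate" into "lincense plate"
                yield unquote_plus(match.group(1)).lower()

if __name__ == "__main__":
    for query in extract_queries("search.log"):  # placeholder path
        print(query)  # e.g. "lincense plate", "license plate"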

Logs from the call center

Reports from analytics applications

Insights from Voice of the Customer research

Reports from CRM applications

Papers from the research center

One agency’s user mental model

Another agency’s brand architecture research

Surveys behind Net Promoter Score

So why does so much design still SUCK?

The blind men and the elephant

What vs. Why

Methods employed: quantitative versus qualitative

Goals: help org or users

Organizational goals

Users’ goals

How they use data: measuring world we know versus world we don’t

Measuring the world we know

Exploring the world we don’t

Kind of data they use: statistical vs. descriptive

Statistical data vs. descriptive data

Lou’s Table of Overgeneralized Dichotomies: Web Analytics vs. User Experience

What they analyze: users' behaviors (what's happening) vs. users' intentions and motives (why those things happen)

What methods they employ: quantitative methods to determine what's happening vs. qualitative methods for explaining why things happen

What they're trying to achieve: helping the organization meet its goals (expressed as KPIs) vs. helping users achieve their goals (expressed as tasks or topics of interest)

How they use data: measuring performance (goal-driven analysis) vs. uncovering patterns and surprises (emergent analysis)

What kind of data they use: statistical data ("real" data in large volumes, full of errors) vs. descriptive data (in small volumes, generated in lab environments, full of errors)

Four themes for getting to synthesis

1. Balance

2. Cadence

3. Conversation

4. Perspective

1. Balance

Rohrer/Mulder/Yaar’s map: the Landscape of User Research Methods

Axes: qualitative (direct) vs. quantitative (indirect), and attitudinal vs. behavioral. Each method is also keyed by the context of product use during data collection: scripted (often lab-based) use of product, natural use of product, de-contextualized / not using product, or a combination/hybrid.

Methods plotted include: focus groups, phone interviews, ethnographic field studies, card sorting, diary/camera studies, intercept surveys, usability lab studies, eyetracking, usability benchmarking (in lab), A/B (live) testing, online user experience assessments (“Vividence-like” studies), desirability studies, data mining/analysis, email surveys, message board mining, participatory design, and customer feedback via email.

© 2008 Christian Rohrer: http://bit.ly/eAlbe2 / Steve Mulder & Ziv Yaar, The User Is Always Right

XXX.XXX.X.104 - - [10/Jul/2006:10:25:46 -0800] "GET /search?access=p&entqr=0&output=xml_no_dtd&sort=date%3AD%3AL%3Ad1&ud=1&site=AllSites&ie=UTF-8&client=www&oe=UTF-8&proxystylesheet=www&q=lincense+plate&ip=XXX.XXX.X.104 HTTP/1.1" 200 971 0 0.02

XXX.XXX.X.104 - - [10/Jul/2006:10:25:48 -0800] "GET /search?access=p&entqr=0&output=xml_no_dtd&sort=date%3AD%3AL%3Ad1&ie=UTF-8&client=www&q=license+plate&ud=1&site=AllSites&spell=1&oe=UTF-8&proxystylesheet=www&ip=XXX.XXX.X.104 HTTP/1.1" 200 8283 146 0.16

Web analytics’ question: “Are we converting license plate renewals?”

UX practitioner’s question: “What are people searching the most?”
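(Both questions can be asked of the same log. A minimal sketch, my construction rather than the talk's, reusing extract_queries() from the earlier snippet; the renewal-intent terms are hypothetical examples.)

from collections import Counter

queries = list(extract_queries("search.log"))  # placeholder path

# UX practitioner's lens: what are people searching the most?
print(Counter(queries).most_common(10))  # e.g. [('license plate', 2), ...]

# Web analyst's lens: how often does renewal-related intent show up?
# (A real conversion rate would join these sessions against goal-page hits.)
RENEWAL_TERMS = ("license plate", "lincense plate", "plate renewal")
renewal = sum(1 for q in queries if any(t in q for t in RENEWAL_TERMS))
print(f"{renewal} of {len(queries)} searches suggest renewal intent")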

Balanced analysis

Balance within a method

2. Cadence

A research cadence from Whitney Quesenbery

Cadence

Weekly: call center data trend analysis (2–4 hours; behavioral/quantitative); task analysis (4–6 hours; behavioral/quantitative)

Quarterly: exploratory analysis of site analytics data (8–10 hours; behavioral/qualitative); user survey (16–24 hours; attitudinal/quantitative)

Annually: Net Promoter Score study (3–4 days; attitudinal/quantitative); field study (4–5 days; behavioral/qualitative)


Cadence + Balance
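(One way to make “Cadence + Balance” concrete: represent the plan as data and tally it by quadrant. A minimal sketch, my construction rather than Quesenbery's.)

from collections import Counter

CADENCE_PLAN = [
    # (cadence, activity, attitudinal-or-behavioral / qual-or-quant)
    ("weekly",    "Call center data trend analysis", "behavioral/quantitative"),
    ("weekly",    "Task analysis",                   "behavioral/quantitative"),
    ("quarterly", "Exploratory site analytics",      "behavioral/qualitative"),
    ("quarterly", "User survey",                     "attitudinal/quantitative"),
    ("annually",  "Net Promoter Score study",        "attitudinal/quantitative"),
    ("annually",  "Field study",                     "behavioral/qualitative"),
]

balance = Counter(kind for _, _, kind in CADENCE_PLAN)
for quadrant, count in balance.items():
    print(f"{quadrant}: {count} activities")
# A thin or empty quadrant (here: attitudinal/qualitative) is a gap to fill.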

3. Conversation

Develop a pidgin

Dave Gray’s boundary matrix: http://bit.ly/gWoZQm

KPIs • goals • segments • personas

Ban words that impede conversations

• Product names: Omniture, SharePoint...

• Methods: focus group, usability test...

• Departments: market research, analytics...

• Disciplines: business analysis, information architecture...

• Outcomes: portal, social media layer...

Tell Stories

SKU: #39072-2AH1

Buy Candy for Strangers

4. Perspective

Maps help us make sense by seeing things in new ways

Rohrer/Mulder/Yaar’s map, revisited: the Landscape of User Research Methods (shown earlier)

Avinash Kaushik’s visualization (from Web Analytics 2.0)

“...while I have a bucket for ‘Voice of Customer,’ in hindsight I should have worked harder still to paint the full qual and quant picture....”

Containers help us make sense by doing things in new ways

MailChimp’s UX team: drowning in data

• Analytics

• Account closing surveys

• Blog comments

• Competitor news

• Delivery stats

• Industry research

• Release notes

• Support data

MailChimp + Evernote

• Shared bucket of buckets (60 notebooks)

• Email is the API (see the sketch after this list)

• OCR’d (nice for SurveyMonkey reports)

• Searchable!

• Led to “regular data nerd lunches”
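(What “email is the API” can look like in practice: a minimal sketch of mailing a finding into a shared Evernote notebook. Every address, host, notebook name, and tag below is a placeholder; Evernote’s email-in feature routes notes by “@notebook” and “#tag” tokens in the subject line.)

import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "researcher@example.com"        # placeholder sender
msg["To"] = "team.abc123@m.evernote.com"      # placeholder Evernote email-in address
msg["Subject"] = "Q3 churn survey verbatims @Surveys #account-closing"
msg.set_content("Top reason cited for closing accounts this quarter: pricing.")

# Send through the team's mail relay (placeholder host); the note lands
# in the shared "Surveys" notebook, tagged and full-text searchable.
with smtplib.SMTP("smtp.example.com") as smtp:
    smtp.send_message(msg)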

MailChimp is on the threshold of synthesis

Map + Container = Dashboard

Map + Container = Dashboard?

A very helpful book

Helps decision-makers understand that silos are your problem, and theirs too

A parting question

If you were going to build your organization’s brain—its decision-making capability—from scratch...

What would it look like?

Thanks!

slides: http://rfld.me/11FrI3o

article: http://rfld.me/145ZccP

Lou Rosenfeld

Rosenfeld Media 

www.louisrosenfeld.com • @louisrosenfeld

www.rosenfeldmedia.com • @rosenfeldmedia