An Engaging Click
Mounia Lalmas, Yahoo Labs London
Search Solutions 2013
This talk
What is user engagement?
What are the characteristics of user engagement?
How to measure user engagement?
What is user engagement in web search?
What is an engaging click?
Work on user engagement across web applications:
1. inter-session
2. online multi-tasking
3. downstream engagement
4. serendipity
Implications to web search
What is user engagement?
User engagement is a quality of the user experience that emphasizes the positive aspects of interaction – in particular the fact of being captivated by the technology (Attfield et al, 2011).
emotional, cognitive and behavioural connection that exists, at any point in time and over time, between a user and a technological resource
user feelings: happy, sad, excited, …
user mental states: flow, presence, immersion, …
user interactions: click, read, comment, buy, …
Why is it important to engage users?
§ In today’s wired world, users have enhanced expectations about their interactions with technology … resulting in increased competition amongst the purveyors and designers of interactive systems.
§ In addition to utilitarian factors, such as usability, we must consider the hedonic and experiential factors of interacting with technology, such as fun, fulfillment, play, and user engagement.
(O’Brien, Lalmas & Yom-Tov, 2013)
Characteristics of user engagement
Novelty (Webster & Ho, 1997; O’Brien, 2008)
Richness and control (Jacques et al., 1995; Webster & Ho, 1997)
Aesthetics (Jacques et al., 1995; O’Brien, 2008)
Endurability (Read, MacFarlane & Casey, 2002; O’Brien, 2008)
Focused attention (Webster & Ho, 1997; O’Brien, 2008)
Reputation, trust and expectation (Attfield et al., 2011)
Positive affect (O’Brien & Toms, 2008)
Motivation, interests, incentives, and benefits (Jacques et al., 1995; O’Brien & Toms, 2008)
(O’Brien, Lalmas & Yom-Tov, 2013)
Measuring user engagement
Self-reported engagement
• Measures: questionnaire, interview, report, product reaction cards, think-aloud
• Characteristics: subjective; short- and long-term; lab and field; small-scale
Cognitive engagement
• Measures: task-based methods (time spent, follow-on task); physiological measures (e.g. EEG, SCL, fMRI, eye tracking, mouse tracking)
• Characteristics: objective; short-term; lab and field; small-scale and large-scale
Interaction engagement
• Measures: web analytics metrics + models
• Characteristics: objective; short- and long-term; field; large-scale
User engagement in web search revolves around “returning relevant results to the users”, i.e. satisfying user information needs
§ Clickthrough rate (CTR)
§ Dwell time (on landing page)
§ Time to first click
§ Skipping
§ Abandonment rate
§ Number of query reformulations
§ Search engine switching
§ Cumulative gain family of metrics, precision at rank k, …
§ Multimedia search activities are often driven by entertainment needs, not by information needs.
§ Displaying rich information on result pages (e.g. a restaurant phone number) means that users do not need to click.
(Slaney, 2011)
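To make the interaction metrics listed above concrete, here is a minimal Python sketch (not from the talk) that computes CTR, abandonment rate and average dwell time from a toy query log; the log structure and field names are hypothetical.

```python
# A minimal sketch (not from the talk): computing a few interaction metrics
# from a toy query log. Field names and log structure are hypothetical.
from statistics import mean

# each record: one query impression with its clicks (rank, dwell time in seconds)
log = [
    {"query": "cheap flights", "clicks": [{"rank": 1, "dwell": 45.0}]},
    {"query": "python csv",    "clicks": []},  # abandoned query, no click
    {"query": "yahoo answers", "clicks": [{"rank": 3, "dwell": 8.0},
                                          {"rank": 1, "dwell": 120.0}]},
]

impressions = len(log)
clicked = [q for q in log if q["clicks"]]

ctr = len(clicked) / impressions                      # clickthrough rate
abandonment_rate = 1 - ctr                            # share of queries with no click
avg_dwell = mean(c["dwell"] for q in clicked for c in q["clicks"])  # dwell on landing pages

print(f"CTR = {ctr:.2f}, abandonment = {abandonment_rate:.2f}, avg dwell = {avg_dwell:.1f}s")
```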
This talk
What is user engagement?
What are the characteristics of user engagement?
How to measure user engagement?
What is user engagement in web search?
What is an engaging click?
Work on user engagement across web applications:
1. inter-session
2. online multi-tasking
3. downstream engagement
4. serendipity
Implications to web search
• Domain: Yahoo Answers Japan • Study: Inter-session engagement metric
(Dupret & Lalmas, 2013)
If users find a web application interesting, engaging or useful, they will return to it sooner.
Absence time and survival analysis
Survival analysis: a high hazard rate corresponds to a short absence
A short absence is a sign of loyalty, an important indication of user engagement
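As a rough illustration of the absence-time idea (not the study’s code), a minimal sketch assuming the third-party lifelines package: fit Kaplan-Meier survival curves of absence time per ranking bucket and compare median absences. The bucket names and absence times below are made up.

```python
# A minimal sketch, assuming the third-party `lifelines` package: Kaplan-Meier
# survival curves of absence time (days until a user returns). Bucket labels
# and the absence times are made up for illustration.
from lifelines import KaplanMeierFitter

# absence time in days for users of two ranking buckets; `returned` marks
# whether the user came back within the observation window (else censored)
buckets = {
    "bucket_A": {"absence": [1, 2, 2, 3, 5, 8, 30],  "returned": [1, 1, 1, 1, 1, 1, 0]},
    "bucket_B": {"absence": [2, 4, 6, 9, 12, 20, 30], "returned": [1, 1, 1, 1, 1, 0, 0]},
}

for name, data in buckets.items():
    kmf = KaplanMeierFitter()
    kmf.fit(data["absence"], event_observed=data["returned"], label=name)
    # shorter median absence (users return sooner) is read as higher engagement
    print(name, "median absence:", kmf.median_survival_time_)
```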
Using absence time to compare 6 ranking functions (buckets) on Yahoo Answers Japan
1. Returning relevant results is important, but is not enough to keep users returning to the search application
2. Clicks after the 5th result reflect a poorer user experience; users cannot find what they are looking for
3. No click means a bad user experience
4. Clicking lower in the ranking suggests a more careful choice by the user
5. Clicking at the bottom is a sign of a low-quality overall ranking
6. Users who find their answers quickly (click sooner) return sooner to the search application
7. Returning to the same search result page is a worse user experience than reformulating the query
Endurability
• Domain: 700+ web applications • Study: Online multi-tasking
(Lehmann et al, 2013)
Online multi-tasking affects the way users interact (or engage) with sites.
Online multi-tasking – and search
181K users, 2 months of browser data, 600 sites, 4.8M sessions
• only 40% of the sessions have no site revisitation
• commonly accessed sites between visits → search 22%, navigation 12%, social 8%
• for some sites (e-commerce) the same sites are accessed between visits → one task?
• no patterns for sites such as mail, social → anchor, habit?
• longer time between visits → a different task (new search)
• more vs less time spent at each revisit → increased vs shift of attention
Revisitation patterns
[Figure: for the kth visit to a site (k = 1…9), the proportion of users still revisiting and the per-visit attention (% of total page views on the site), split by navigation type (hyperlinking, teleporting, backpaging).
• auction sites [complex attention]: m = 0.142, p-value = 0.24; users remaining at visits 1–9: 100%, 67%, 54%, 46%, 41%, 35%, 31%, 29%, 26%
• search sites [increasing attention]: m = 0.063, p-value < 0.05; users remaining: 100%, 69%, 54%, 44%, 38%, 33%, 29%, 26%, 23%
• mail sites [decreasing attention]: m = -0.288, p-value < 0.05; users remaining: 100%, 62%, 41%, 29%, 21%, 16%, 13%, 10%, 8%]
§ 48% of sites were visited at least 9 times
§ The revisitation “level” depends on the site
§ 10% of users accessed a site 9+ times (23% for search sites); 28% at least four times (44% for search sites)
§ Activity on a site decreases with each revisit, but activity on many search sites increases
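A minimal sketch (not the paper’s code) of how such revisitation and attention curves could be derived from a session log; the log records and field names are hypothetical.

```python
# A minimal sketch (not the paper's code): for each site, the proportion of
# users who reach a kth visit and the average attention (page views) at that
# visit. The session log below and its field names are hypothetical.
from collections import defaultdict

# (user, site, visit_index, page_views_during_that_visit)
visits = [
    ("u1", "search.example", 1, 3), ("u1", "search.example", 2, 5),
    ("u2", "search.example", 1, 2), ("u2", "search.example", 2, 4),
    ("u2", "search.example", 3, 6), ("u1", "mail.example",   1, 9),
    ("u1", "mail.example",   2, 4),
]

per_site = defaultdict(lambda: defaultdict(list))   # site -> k -> [page views]
for user, site, k, views in visits:
    per_site[site][k].append(views)

for site, by_k in per_site.items():
    n_first = len(by_k[1])                          # users who visited at least once
    for k in sorted(by_k):
        share = len(by_k[k]) / n_first              # % of users still revisiting at visit k
        attention = sum(by_k[k]) / len(by_k[k])     # average page views at visit k
        print(f"{site}: visit {k}: {share:.0%} of users, {attention:.1f} page views")
```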
Motivation, interests, incentives, and benefits
• Domain: Network of sites • Study: Downstream engagement
(Yom-Tov et al, 2013)
The success of a site depends not only on the site itself, but also on how it is reached from other sites.
Downstream user engagement: engagement across a network of sites
Large online providers (AOL, Google, Yahoo!, MSN, etc.) offer not one service (site), but a network of sites.
The downstream engagement of site A is measured as the % of the session time remaining after the visit to site A.
[Figure: a user session moving across provider sites, with the downstream engagement of site A marked]
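A minimal sketch (not the paper’s code) of the % remaining session time measure: for each session that reaches site A, take the share of the session time that comes after the first visit to A, and average over sessions. The session data and site names below are made up.

```python
# A minimal sketch (not the paper's code) of downstream engagement for a site:
# the share of the session time that remains after the first visit to that site.
# The session structure and site names are hypothetical.
sessions = [
    # each session: ordered (site, seconds spent) pairs
    [("news.example", 120), ("siteA.example", 60), ("sports.example", 300)],
    [("siteA.example", 30), ("mail.example", 240)],
    [("mail.example", 90), ("news.example", 60)],   # never reaches site A
]

def downstream_engagement(sessions, site):
    scores = []
    for session in sessions:
        total = sum(t for _, t in session)
        for i, (s, _) in enumerate(session):
            if s == site:
                remaining = sum(t for _, t in session[i + 1:])
                scores.append(remaining / total)    # % of session time still to come
                break
    return sum(scores) / len(scores) if scores else 0.0

print("downstream engagement of siteA.example:",
      round(downstream_engagement(sessions, "siteA.example"), 2))
```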
Influential features (50 Yahoo sites, 250K+ users, 1.9M sessions)
o Time of day
o Number of (non-image/non-video) links to Yahoo! sites in the HTML body
o Average rank of Yahoo! links on the page
o Number of (non-image/non-video) links to non-Yahoo! sites in the HTML body
o Number of span tags (tags that allow adding style to content or manipulating content, e.g. via JavaScript)
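A minimal sketch of extracting some of these page features, assuming the third-party beautifulsoup4 package; the HTML snippet and the provider-domain check are made up, and the image/video link filtering is omitted.

```python
# A minimal sketch, assuming the third-party `beautifulsoup4` package: counting
# links to provider vs. external sites and span tags on a page. The HTML and
# the provider domain are made up; filtering of image/video links is omitted.
from urllib.parse import urlparse
from bs4 import BeautifulSoup

html = """
<body>
  <a href="https://news.yahoo.com/x">news</a>
  <a href="https://mail.yahoo.com/">mail</a>
  <a href="https://example.org/out">external</a>
  <span class="hl">highlighted</span>
</body>
"""

soup = BeautifulSoup(html, "html.parser")
links = [a.get("href", "") for a in soup.find_all("a")]

yahoo_links = [h for h in links if urlparse(h).netloc.endswith("yahoo.com")]
other_links = [h for h in links if h not in yahoo_links]
span_count = len(soup.find_all("span"))

print(len(yahoo_links), "provider links,", len(other_links), "external links,",
      span_count, "span tags")
```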
o Link placement and the number of Yahoo links can influence downstream engagement
o Not new, but here shown to hold also across sites
o Links to non-Yahoo sites have a positive effect on downstream engagement
o Possibly because, when faced with an abundance of outside links, users decide to focus their attention on a central content provider rather than visiting a multitude of external sites
Richness and control
• Domain: social media (Yahoo! Answers and Wikipedia) • Study: serendipity (in entity search)
(Bordino, Mejova & Lalmas, 2013)
Interesting search results may promote serendipitous browsing.
Yahoo! Answers vs Wikipedia
Yahoo! Answers
§ community-driven question & answer portal
§ 67,336,144 questions & 261,770,047 answers
§ January 1, 2010 – December 31, 2011
§ English-language
§ minimally curated: opinions, gossip, personal info, variety of points of view
Wikipedia
• community-driven encyclopedia
• 3,795,865 articles as of end of December 2011
• English Wikipedia
• curated, high-quality knowledge; variety of niche topics
Entity search
We build an entity-driven serendipitous search system based on entity networks extracted from Wikipedia and Yahoo! Answers.
Serendipity: finding something good or useful while not specifically looking for it; serendipitous search systems provide relevant and interesting results.
Given a query entity, we retrieve the entities most related to it using a random walk over the entity network (one network built from Wikipedia, one from Yahoo! Answers).
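A minimal sketch of the random-walk retrieval step, assuming the networkx package and a random walk with restart (personalized PageRank); the toy entity graph and edge weights below are made up and are not the networks built in the study.

```python
# A minimal sketch, assuming the `networkx` package: retrieving the entities
# most related to a query entity with a random walk with restart (personalized
# PageRank) over a small toy entity graph. The graph is made up.
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    ("Egypt", "Cairo", 5.0), ("Egypt", "Ptolemaic Kingdom", 3.0),
    ("Egypt", "Nile", 4.0), ("Cairo", "Nile", 2.0),
    ("Ptolemaic Kingdom", "Cleopatra", 3.0),
])

query = "Egypt"
# the walk restarts at the query entity with probability 1 - alpha
scores = nx.pagerank(G, alpha=0.85, personalization={query: 1.0}, weight="weight")
scores.pop(query)                       # do not return the query entity itself

related = sorted(scores, key=scores.get, reverse=True)[:3]
print("entities most related to", query, "->", related)
```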
• Annotator agreement (overlap): 0.85
• Average overlap in top 5 results: < 1
| relevant & unexpected | / | unexpected | — the number of serendipitous results out of all the unexpected results retrieved
| relevant & unexpected | / | retrieved | — the number of serendipitous results out of all the retrieved results
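A minimal sketch (not the paper’s code) computing these two ratios from per-query judgements of relevance and unexpectedness; the example judgements below are made up.

```python
# A minimal sketch (not the paper's code) of the two serendipity ratios, given
# judgements of relevance and unexpectedness for one query's retrieved entities.
# The example judgements are made up.
retrieved = ["Cairo", "Nile", "Ptolemaic Kingdom", "Pyramids", "Suez Canal"]
relevant = {"Cairo", "Nile", "Ptolemaic Kingdom", "Pyramids"}
unexpected = {"Ptolemaic Kingdom", "Suez Canal"}   # not returned by the web-search baselines

serendipitous = relevant & unexpected

ratio_unexpected = len(serendipitous) / len(unexpected)   # |relevant & unexpected| / |unexpected|
ratio_retrieved = len(serendipitous) / len(retrieved)     # |relevant & unexpected| / |retrieved|

print(f"serendipitous / unexpected = {ratio_unexpected:.2f}")
print(f"serendipitous / retrieved  = {ratio_retrieved:.2f}")
```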
Baseline (description) — serendipity on the WP and YA data:
Top: the 5 entities that occur most frequently in the top-5 search results from Bing and Google — WP 0.63 (0.58), YA 0.69 (0.63)
Top–WP: same as above, but excluding the Wikipedia page from the results — WP 0.63 (0.58), YA 0.70 (0.64)
Rel: the top 5 entities in the related query suggestions provided by Bing and Google — WP 0.64 (0.61), YA 0.70 (0.65)
Rel + Top: the union of Top and Rel — WP 0.61 (0.54), YA 0.68 (0.57)
Serendipity: “making fortunate discoveries by accident”
Serendipity = unexpectedness + relevance; the “expected” result baselines come from web search
Interestingness ≠ Relevance (for some results, Interesting > Relevant; for others, Relevant > Interesting)
Example query → result pairs:
• Oil Spill → Penguins in Sweaters (WP)
• Robert Pattinson → Water for Elephants (WP)
• Lady Gaga → Britney Spears (WP)
• Egypt → Cairo Conference (WP)
• Netflix → Blu-ray Disc (YA)
• Egypt → Ptolemaic Kingdom (WP & YA)
Novelty
Take-away message
§ Search is not just about specific information needs
§ People search for many other reasons
› Navigation
› Transaction
› Fun (ECIR 2012 workshop)
› Etc.
§ Engagement in search means viewing search activities as part of the user’s current overall task
§ We never know what we will get if we are ready to explore
› Users do things that no one expects, not even themselves! (like staying inside Yahoo! in spite of having many links to go elsewhere)
› So a link is not everything, in search too!
§ Summarizing, in search we need to look at user engagement in a broader way
Thank you Acknowledgements: Ricardo Baeza-Yates, Ilaria Bordino, George Dupret, Janette Lehmann, Yelena Mejova and Elad Yom-Tov.
Blog: labtomarket.wordpress.com
A first version of this talk was given by Ricardo Baeza-Yates, SIGIR 2013 Industry Day