Implicit User Modeling for Personalized Search


Implicit User Modeling for

Personalized Search

Xuehua Shen, Bin Tan, ChengXiang Zhai

Department of Computer Science

University of Illinois, Urbana-Champaign


Current Search Engines are Mostly Document-Centered…

(Diagram: many users issue queries to a single search engine over a shared document collection; every user gets the same results.)

Search is generally non-personalized…


Example of Non-Personalized Search

As of Oct. 17, 2005

Query = Jaguar

(Top results: Car, Car, Car, Car, Software, Animal)

Without knowing more about the user, it’s hard to optimize…

Therefore, personalization is necessary to improve the existing search engines.

However, many questions need to be answered…


Research Questions

• Client-side or server-side personalization?

• Implicit or explicit user modeling?

• What’s a good retrieval framework for personalized search?

• How to evaluate personalized search?

• …


Client-Side vs. Server-Side Personalization

• So far, personalization has mostly been done on the server side

• We emphasize client-side personalization, which has 3 advantages:

– More information about the user, thus more accurate user modeling (complete interaction history + other user activities)

– More scalable (“distributed personalization”)

– Less privacy risk (user data can stay on the client)


Implicit vs. Explicit User Modeling

• Explicit user modeling

– More accurate, but users generally don’t want to provide additional information

– E.g., relevance feedback

• Implicit user modeling

– Less accurate, but no extra effort for users

– E.g., implicit feedback

We emphasize implicit user modeling


“Jaguar” Example Revisited

Suppose we know:

1. Previous query = “racing cars”

2. “car” occurs far more frequently than “Apple” in pages browsed by the user in the last 20 days

3. User just viewed an “Apple OS” document

All the information is naturally available to an IR system


Remaining Research Questions

• Client-side or server-side personalization?

• Implicit or explicit user modeling?

• What’s a good retrieval framework for personalized search?

• How to evaluate personalized search?

• …


Outline

• A decision-theoretic framework

• UCAIR personalized search agent

• Evaluation of UCAIR

Implicit user information exists in the user’s interaction history.

We thus need to develop a retrieval framework for interactive retrieval…


Modeling Interactive IR

• Model interactive IR as “action dialog”: cycles of user action (Ai ) and system response (Ri )

User action (Ai)       System response (Ri)
Submit a new query     Retrieve new documents
View a document        Present the selected document; rerank the unseen documents
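One cycle of this action dialog can be sketched as a dispatcher from user actions to system responses. The retrieval and reranking bodies below are naive placeholders for illustration, not UCAIR's actual implementation:

```python
def respond(action, collection, history):
    """Map a user action A_i to a system response R_i (one cycle of the
    action dialog).  The retrieval logic here is a naive placeholder."""
    history.append(action)
    kind, payload = action
    if kind == "query":
        # Submit a new query -> retrieve new documents (substring match).
        return [d for d in collection if payload in d]
    if kind == "view":
        # View a document -> present it, then rerank the unseen documents.
        seen = {p for k, p in history if k == "view"}
        return [d for d in collection if d not in seen]
    raise ValueError(f"unknown action: {kind}")

# A two-cycle dialog: A1 = submit a query, A2 = view a result.
docs = ["jaguar car review", "jaguar animal habitat", "mac os x jaguar"]
history = []
r1 = respond(("query", "jaguar"), docs, history)
r2 = respond(("view", "jaguar car review"), docs, history)
```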

Retrieval Decisions

User U:  A1  A2  …  At−1  At
System:  R1  R2  …  Rt−1

History: H = {(Ai, Ri)}, i = 1, …, t−1
Document collection: C

Given U, C, At, and H, choose the best Rt from r(At), the set of all possible responses to At.

Examples:

– At = query “Jaguar”: r(At) = all possible rankings of C; the best Rt is the best ranking for the query.

– At = click on the “Next” button: r(At) = all possible rankings of the unseen docs; the best Rt is the best ranking of the unseen docs.

Rt = ?

Decision-Theoretic Framework

Observed: user U, interaction history H, current user action At, document collection C

All possible responses: r(At) = {r1, …, rn}

Inferred user model: M = (S, θU, …), with S = seen docs and θU = the information need

Loss function: L(ri, At, M)

The optimal response Rt minimizes the expected risk:

Rt = argmin_{r ∈ r(At)} ∫ L(r, At, M) P(M | U, H, At, C) dM

A Simplified Two-Step Decision-Making Procedure

• Approximate the expected risk by the loss at the mode of the posterior distribution

• Two-step procedure

– Step 1: Compute an updated user model M* based on the currently available information

– Step 2: Given M*, choose a response to minimize the loss function

Rt = argmin_{r ∈ r(At)} ∫ L(r, At, M) P(M | U, H, At, C) dM ≈ argmin_{r ∈ r(At)} L(r, At, M*)

where M* = argmax_M P(M | U, H, At, C)
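The two steps can be sketched generically; the toy models, posterior, and loss below are illustrative, not from the paper:

```python
def optimal_response(responses, models, posterior, loss):
    """Two-step approximation of expected-risk minimization:
    Step 1: M* = argmax_M P(M | U, H, A_t, C)   (mode of the posterior)
    Step 2: R_t = argmin_{r in r(A_t)} L(r, A_t, M*)"""
    m_star = max(models, key=posterior)                    # step 1
    return min(responses, key=lambda r: loss(r, m_star))   # step 2

# Toy instantiation: two candidate user models for the query "jaguar".
models = ["car", "animal"]
posterior = {"car": 0.8, "animal": 0.2}.get          # P(M | U, H, A_t, C)
loss = lambda ranking, m: 0.0 if ranking.startswith(m) else 1.0
responses = ["car-first ranking", "animal-first ranking"]
best = optimal_response(responses, models, posterior, loss)
```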

Optimal Interactive Retrieval

(Diagram: the user and the IR system alternate over collection C. The user issues A1; the system infers M*1 = argmax_M P(M1 | U, H, A1, C) and responds with the R1 minimizing L(r, A1, M*1). The user issues A2; the system infers M*2 = argmax_M P(M2 | U, H, A2, C) and responds with the R2 minimizing L(r, A2, M*2); and so on for A3, ….)


Refinement of Decision Theoretic Framework

• r(At): decision space (At dependent)

– r(At) = all possible rankings of docs in C

– r(At) = all possible rankings of unseen docs

• M: user model

– Essential component: U = user information need

– S = seen documents

• L(ri,At,M): loss function

– Generally measures the utility of ri for a user modeled as M

• P(M|U, H, At, C): user model inference

– Often involves estimating U


Case 1: Non-Personalized Retrieval

– At=“enter a query Q”

– r(At) = all possible rankings of docs in C

– M = θU, a unigram language model (word distribution)

– p(M|U,H,At,C) = p(θU | Q)

L(ri, At, M) = L((d1, …, dN), θU) = Σi=1..N p(viewed | di) D(θU || θdi)

Since p(viewed | d1) ≥ p(viewed | d2) ≥ …, the optimal ranking Rt is given by ranking documents in ascending order of D(θU || θdi).
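Under this instantiation, retrieval reduces to ranking documents by KL divergence from the query model. A minimal sketch with maximum-likelihood unigram models and a small probability floor (the actual system smooths more carefully):

```python
import math

def unigram(text):
    """Maximum-likelihood unigram language model of a text."""
    words = text.lower().split()
    return {w: words.count(w) / len(words) for w in set(words)}

def kl(p, q, vocab, eps=1e-9):
    """D(p || q), flooring unseen words at eps to avoid log(0)."""
    return sum(p.get(w, eps) * math.log(p.get(w, eps) / q.get(w, eps))
               for w in vocab)

def rank_by_kl(theta_u, docs):
    """Rank documents in ascending order of D(theta_U || theta_d):
    smaller divergence = better match to the information need."""
    doc_models = {d: unigram(d) for d in docs}
    vocab = set(theta_u).union(*doc_models.values())
    return sorted(docs, key=lambda d: kl(theta_u, doc_models[d], vocab))

theta_u = unigram("jaguar car")
ranked = rank_by_kl(theta_u, ["jaguar animal jungle", "jaguar car racing"])
```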


Case 2: Implicit Feedback for Retrieval

– At=“enter a query Q”

– r(At) = all possible rankings of docs in C

– M = θU, a unigram language model (word distribution)

– H = {previous queries} + {viewed snippets}

– p(M|U,H,At,C) = p(θU | Q, H)

L(ri, At, M) = L((d1, …, dN), θU) = Σi=1..N p(viewed | di) D(θU || θdi)

Since p(viewed | d1) ≥ p(viewed | d2) ≥ …, the optimal ranking Rt is given by ranking documents in ascending order of D(θU || θdi).

Implicit user modeling: unlike Case 1, θU is now estimated from both the query and the history, p(θU | Q, H).
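The slides do not spell out how p(θU | Q, H) is estimated; one simple fixed-coefficient instantiation interpolates the current query model with a model built from the history (the function name and weights below are illustrative):

```python
def interpolate(models, weights):
    """theta_U = sum_i w_i * theta_i: a fixed-coefficient mixture of
    unigram models.  The weights are illustrative, not from the paper."""
    mixed = {}
    for model, w in zip(models, weights):
        for word, p in model.items():
            mixed[word] = mixed.get(word, 0.0) + w * p
    return mixed

theta_q = {"jaguar": 1.0}               # model of the current query Q
theta_h = {"car": 0.5, "racing": 0.5}   # model built from the history H
theta_u = interpolate([theta_q, theta_h], [0.8, 0.2])
```

With weights summing to one, the mixture is itself a proper word distribution.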


Case 3: More General Personalized Search with Implicit Feedback

– At=“enter a query Q” or “Back” button, “Next” link

– r(At) = all possible rankings of unseen docs in C

– M = (θU, S), with S = seen documents

– H = {previous queries} + {viewed snippets}

– p(M|U,H,At,C) = p(θU | Q, H)

L(ri, At, M) = L((d1, …, dN), θU) = Σi=1..N p(viewed | di) D(θU || θdi)

Since p(viewed | d1) ≥ p(viewed | d2) ≥ …, the optimal ranking Rt is given by ranking the unseen documents in ascending order of D(θU || θdi).

Eager feedback: the user model is updated and the unseen documents reranked after every action (e.g., “Back”, “Next”), not only after a new query.


Benefit of the Framework

• Traditional view of IR

– Retrieval = matching a query against documents

– Insufficient for modeling personalized search (the user and the interaction history are not part of the retrieval model)

• The new framework provides a map for systematic exploration of

– Methods for implicit user modeling

– Models for eager feedback

• The framework also provides guidance on how to design a personalized search agent (optimizing responses to every user action)

The UCAIR Toolbar


UCAIR Toolbar Architecture (http://sifaka.cs.uiuc.edu/ir/ucair/download.html)

(Architecture: UCAIR sits between the user and the search engine (e.g., Google). The user supplies a query, views results, and generates clickthrough…; UCAIR maintains a search history log (e.g., past queries, clicked results) and a result buffer, and uses user modeling to drive query modification and result re-ranking.)


Decision-Theoretic View of UCAIR

• User actions modeled

– A1 = Submit a keyword query

– A2 = Click the “Back” button

– A3 = Click the “Next” link

• System responses

– r(Ai) = rankings of the unseen documents

• History

– H = {previous queries, clickthroughs}

• User model: M = (X, S)

– X = vector representation of the user’s information need

– S = documents seen by the user


Decision-Theoretic View of UCAIR (cont.)

• Loss functions:

– L(r, A2, M) = L(r, A3, M): reranking, vector space model

– L(r, A1, M) ≈ L(q, A1, M): query expansion, favoring a good expanded query q

• Implicit user model inference

– X* = argmaxx p(x|Q,H), computed using Rocchio feedback

– S* = all seen docs in H

L(r, At, M) = L((d1, …, dN), X) = − Σi=1..N, di ∉ S p(viewed | di) sim(X, di)

so the unseen documents are reranked by sim(X, di), in descending order.

Rocchio feedback: x = α q + (1 − α) (1/k) Σi=1..k si, where si is the vector of a seen snippet.

Newer versions of UCAIR have adopted language models
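The two pieces of the (vector-space) UCAIR model can be sketched as a Rocchio update of X from seen-snippet vectors followed by cosine-similarity reranking of the unseen documents. Vectors are sparse dicts and α is an illustrative weight:

```python
import math

def rocchio(q, snippets, alpha=0.5):
    """x = alpha * q + (1 - alpha) * (1/k) * sum_i s_i, where the s_i are
    vectors of seen snippets (alpha is an illustrative weight)."""
    k = len(snippets)
    x = {w: alpha * v for w, v in q.items()}
    for s in snippets:
        for w, v in s.items():
            x[w] = x.get(w, 0.0) + (1 - alpha) * v / k
    return x

def cosine(a, b):
    """Cosine similarity of two sparse term vectors."""
    dot = sum(v * b.get(w, 0.0) for w, v in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rerank_unseen(x, unseen):
    """Rank the unseen documents by sim(X, d), best first."""
    return sorted(unseen, key=lambda d: cosine(x, d), reverse=True)

q = {"jaguar": 1.0}
snippets = [{"car": 1.0, "racing": 1.0}]          # seen-snippet vectors
x = rocchio(q, snippets)
unseen = [{"jaguar": 1.0, "animal": 1.0}, {"jaguar": 1.0, "car": 1.0}]
reranked = rerank_unseen(x, unseen)
```

After one clicked car-related snippet, the car document moves ahead of the animal document.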


UCAIR in Action

• In responding to a query

– Decide relationship of the current query with the previous query (based on result similarity)

– Possibly do query expansion using the previous query and results

– Return a ranked list of documents using the (expanded) query

• In responding to a click on “Next” or “Back”

– Compute an updated user model based on clickthroughs (using Rocchio)

– Rerank unseen documents (using a vector space model)


Screenshot for Result Reranking


A User Study of Personalized Search

• Six participants used the UCAIR toolbar for web search

• Topics were selected from the TREC web track and terabyte track

• Participants explicitly judged the relevance of the top 30 search results from Google and from UCAIR


UCAIR Outperforms Google: Precision at N Docs

Ranking Method   prec@5   prec@10   prec@20   prec@30
Google           0.538    0.472     0.377     0.308
UCAIR            0.581    0.556     0.453     0.375
Improvement      8.0%     17.8%     20.2%     21.8%

More user interactions → better user models → better retrieval accuracy
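Precision at N, as reported in the table, is simply the fraction of the top N results the participants judged relevant:

```python
def precision_at(n, judgments):
    """prec@N = (# relevant results in the top N) / N.
    `judgments` is one boolean per ranked result, from the participants'
    explicit relevance judgments."""
    return sum(judgments[:n]) / n

# E.g., 3 of the top 5 results judged relevant -> prec@5 = 0.6
judgments = [True, False, True, True, False, False]
p5 = precision_at(5, judgments)
```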


UCAIR Outperforms Google: PR Curve


Summary

• Propose a decision theoretic framework to model interactive IR

• Build a personalized search agent for the web search

• Do a user study of web search and show that UCAIR personalized search agent can improve retrieval accuracy


Thank you !

The End