Politecnico di Milano, Dipartimento di Elettronica e Informazione
Answering Search Queries with CrowdSearcher
A crowdsourcing approach to search
Alessandro Bozzon, Marco Brambilla, Stefano Ceri
WWW 2012, Lyon, France, April 20, 2012
Context
• The Web is a huge, heterogeneous data source:
• Structured, unstructured and semi-structured data
• Known problems of trust, reputation, consistency
• Users need to solve real-life problems, not to find a web site
• Web queries get increasingly complex and specialized
• Exploratory search
• From document search to object search
• Search as a service
•Viability of systems based upon search service orchestration
Background: semantic multi-domain search
“… search for upcoming concerts close to an attractive location (like a beach, lake, mountain, natural park, and so on), considering also availability of good, close-by hotels …”
“… expand the search to get information about available restaurants near the candidate concert locations, news associated to the event and possible options to combine further events …”
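The multi-domain scenario quoted above amounts to a join across independent search services. A minimal sketch, assuming illustrative service results and thresholds (the data, service names, and distance cutoff are invented for this example, not taken from the Search Computing implementation):

```python
import math

# Illustrative result sets from two independent search services.
concerts = [
    {"name": "Lake Festival", "lat": 45.85, "lon": 8.95},
    {"name": "City Arena Show", "lat": 45.46, "lon": 9.19},
]
hotels = [
    {"name": "Hotel Bellavista", "lat": 45.86, "lon": 8.96, "stars": 4},
    {"name": "Hostel Centro", "lat": 45.47, "lon": 9.18, "stars": 2},
]

def distance_km(a, b):
    # Equirectangular approximation, adequate for short distances.
    dlat = math.radians(a["lat"] - b["lat"])
    dlon = math.radians(a["lon"] - b["lon"]) * math.cos(math.radians(a["lat"]))
    return 6371 * math.hypot(dlat, dlon)

def join_close_by(concerts, hotels, max_km=5, min_stars=3):
    # Combine the two result sets, keeping only good, close-by hotels.
    return [
        (c["name"], h["name"])
        for c in concerts
        for h in hotels
        if distance_km(c, h) <= max_km and h["stars"] >= min_stars
    ]

print(join_close_by(concerts, hotels))
```

Extending the query (restaurants, news, further events) would add more such services to the join, which is the point of the Liquid Query approach described next.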
Liquid Query: Query Submission [WWW 2010]
[Screenshots: query submission forms with concert query conditions and hotels query conditions]
Example Scenario 1: Trip planner for events
Liquid Query: Query Execution
Liquid Query: alternative visualizations and domain-independent platform
Example Scenario 2: Scientific Publication search
Problem Statement
•When dealing with real-life problems, people do not trust the web completely
•Want to go back to discussion with people
•Expect insights, opinions, reassurance
• Our proposal: interleaving and integration of exploratory search and social community input
Social Search: increasing quality in search
• From exploratory search to friends' and experts' feedback
[Diagram: an initial query flows through exploration steps in the Exploratory Search System (via a System API over a database / IR index) and is handed to the Human Search System (via a Social API reaching the crowd / community)]
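The interleaving shown in the diagram can be sketched as a loop alternating a system exploration step and a human step. The `system_search` and `crowd_ask` functions below are placeholders standing in for a real search API and a real social API:

```python
# Minimal sketch of interleaving exploratory search with a crowd step.
# Both functions are hypothetical stand-ins, not CrowdSearcher's actual API.

def system_search(query, index):
    # Exploration step against a database / IR index (here: substring match).
    return [doc for doc in index if query.lower() in doc.lower()]

def crowd_ask(results, votes):
    # Human step: the crowd "likes" some results; keep only the liked ones.
    return [r for r in results if votes.get(r, 0) > 0]

index = ["Jazz club Milano", "Rock bar Lyon", "Jazz festival Lyon"]
candidates = system_search("jazz", index)                    # exploration step
refined = crowd_ask(candidates, {"Jazz festival Lyon": 3})   # social step
print(refined)
```

In the real system the refined results would feed the next exploration step, closing the loop between the two systems.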
From crowds to communities – The problems
•Crowds vs. social networks
•Friends or workforce?
• Complex interleaving of factors, including:
• Intensity of social activity of the asker
• Motivation of the responders
• Topic
• Information diffusion
• Timing of the post (hour of the day, day of the week)
• Context and language barrier
Task management problems
Typical crowdsourcing problems:
• Task splitting: the input data collection is too complex relative to the cognitive capabilities of users.
• Task structuring: the query is too complex or too critical to be executed in one shot.
• Task routing: a query can be distributed according to the values of some attribute of the collection.
Plus:
•Platform/community assignment: a task can be assigned to different communities or social platforms based on its focus
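The splitting and routing operations listed above can be sketched as plain list manipulations. The chunk size and routing attribute below are illustrative choices, not parameters fixed by CrowdSearcher:

```python
# Hedged sketch of two task-management operations: splitting an input
# collection into human-sized chunks, and routing items by an attribute.

def split_task(items, chunk_size):
    # Task splitting: break a large input collection into small chunks.
    return [items[i:i + chunk_size] for i in range(0, len(items), chunk_size)]

def route_task(items, attribute):
    # Task routing: distribute items according to an attribute's value.
    routed = {}
    for item in items:
        routed.setdefault(item[attribute], []).append(item)
    return routed

restaurants = [
    {"name": "Da Mario", "city": "Milano"},
    {"name": "Chez Paul", "city": "Lyon"},
    {"name": "Trattoria Emma", "city": "Milano"},
]
print(split_task(restaurants, 2))        # two chunks: sizes 2 and 1
print(route_task(restaurants, "city"))   # items grouped by city
```

Platform/community assignment would then map each routed group to the social platform or community best suited to its focus.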
Social Search – query properties
• Invited community
• Engagement platform
• Execution platform
• Query type: Like, Add, Sort / Rank, Comment, Modify
• Visibility: public or private
• Diffusion: enabled or not
• Timespan
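The query properties listed above can be captured as a plain data structure. Field names mirror the slide; the default values are illustrative assumptions, not CrowdSearcher's actual defaults:

```python
from dataclasses import dataclass, field

@dataclass
class SocialQuery:
    # The slide's query properties as fields; defaults are assumptions.
    query_type: str                         # "like", "add", "sort", "comment", "modify"
    invited_community: list = field(default_factory=list)
    engagement_platform: str = "facebook"   # where users are invited
    execution_platform: str = "facebook"    # where the task is performed
    visibility: str = "private"             # "public" or "private"
    diffusion: bool = False                 # may invitees re-share the query?
    timespan_hours: int = 48                # how long the query stays open

q = SocialQuery(query_type="like", invited_community=["alice", "bob"])
print(q.visibility, q.timespan_hours)
```

Separating the engagement platform from the execution platform is what allows, e.g., inviting responders on Facebook while collecting answers on Doodle, as in the experiments later in the deck.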
Deployment: search on the social network
• Multi-platform deployment
[Diagram: three deployment options: an embedded application using the social/crowd platform's native behaviours, an external application accessing the platform API, and a standalone application; in each case a generated query template reaches the community / crowd]
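The multi-platform deployment can be sketched as a common adapter interface with one implementation per deployment style. The classes below are hypothetical stand-ins, not the actual Facebook/Doodle bindings:

```python
# Sketch of multi-platform deployment behind a common adapter interface.

class PlatformAdapter:
    def post(self, question):
        raise NotImplementedError

class EmbeddedApp(PlatformAdapter):
    # Runs inside the social platform, using its native behaviours.
    def post(self, question):
        return f"[embedded] {question}"

class StandaloneApp(PlatformAdapter):
    # Runs outside the platform, reaching the community through its API.
    def post(self, question):
        return f"[standalone via API] {question}"

def deploy(question, adapters):
    # The same generated query template is deployed on every platform.
    return [adapter.post(question) for adapter in adapters]

print(deploy("Which restaurant do you like?", [EmbeddedApp(), StandaloneApp()]))
```

The point of the adapter layer is that the generated query template stays platform-independent; only the `post` step differs per platform.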
Example: Find your next job (exploration)
Example: Find your job (social invitation)
Selected data items can be transferred to the crowd question
Example: Find your job (response submission)
Experimental setting
• Some 150 users
• Two classes of experiments:
• Random questions on fixed topics: local interests (e.g., restaurants in the vicinity of Politecnico), famous 2011 songs, or top-quality EU soccer teams
• Questions independently submitted by the users
• Different invitation strategies:
• Random invitation
• Explicit selection of responders by the asker
• Outcome
• 175 like and insert queries
• 1536 invitations to friends
• 95 questions (~55%) got at least one answer
• 230 collected answers
Experiments: Manual and random questions
Experiments: Interest and relationship
• Manually written and assigned questions consistently receive more responses over time
Experiments: Query type
• Engagement depends on the difficulty of the task
• Like vs. Add tasks:
Experiments: Distribution of answers/invitation
• In a few limited cases, a question received more answers than invitations
Experiment: Social platform
• The role of the question enactment platform
• Facebook vs. Doodle
Experiment: Posting time
• Response rates vary with the timing of the post (hour of the day, day of the week)
Conclusions and future work
Status
• The chances of getting responses depend strongly on the consistency of the users' community and on the mechanisms exploited for inviting users and collecting their responses
Future work
• More experiments (e.g., vs. sociality of users, vs. crowds, …)
• Not only search: active integration of web structured data and social sensors
Some ads
• Search Computing book series (Springer LNCS)
• Workshop Very Large Data Search at VLDB
• VLDB Journal special issue (deadline Sept 2012)