Search and Hyperlinking Task at MediaEval 2012
Maria Eskevich1, Gareth J.F. Jones1, Shu Chen1
Robin Aly2, Roeland Ordelman2, Martha Larson3
1 Dublin City University, Dublin, Ireland
2 University of Twente, The Netherlands
3 Delft University of Technology, Delft, The Netherlands
Background: Brave New Task and Crowdsourcing
Brave New Task: Search and Hyperlinking
Previous work:
- MediaEval 2011: Rich Speech Retrieval Task
- VideoCLEF 2009: Linking Task

What is Brave New about Search and Hyperlinking?
- Unified scenario for two sub-tasks: Search output results serve as input for Linking
- Use of crowdsourcing for results assessment: workers judge the relevance of the video segments; the HIT can be run for each new submission to enrich the overall results
- Search: ME10WWW dataset → blip10000
Sub-tasks
- Search sub-task:
  - Known-item search:
    - Textual queries only
    - Multimodal queries (text + video clues)
  - Required runs: 1 submission for the 1-best output of each ASR transcript
- Linking sub-task:
  - Ad-hoc search for videos that can potentially be linked to the anchor videos (ground truth of the Search sub-task)
  - Required runs: use of ASR 1-best transcripts as representation
  - *Additional runs: use team output of the Search sub-task as video anchors for the Linking sub-task
Data: blip10000
- ASR transcripts:
  - LIMSI/Vocapia: confusion networks
  - LIUM: 1-best, lattices, confusion networks
- Video clues:
  - Shot boundaries (TU Berlin)
  - Concept-based descriptors based on a list of 589 concepts (University of Oxford)
  - Face detection results (INRIA)
Evaluation: Search sub-task
- Mean Reciprocal Rank (MRR):

  RR = 1 / RANK

- Mean Generalized Average Precision (mGAP):

  GAP = 1 / (RANK + PENALTY)
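As an illustration, the two known-item measures above can be sketched in a few lines of Python (a minimal sketch; the function names and the example ranks and penalties are hypothetical, not taken from the task results):

```python
def reciprocal_rank(rank):
    """RR = 1 / RANK, where rank is the 1-based position of the known item."""
    return 1.0 / rank

def generalized_average_precision(rank, penalty):
    """GAP = 1 / (RANK + PENALTY); the penalty grows the further the
    retrieved jump-in point lies from the true start of the known item."""
    return 1.0 / (rank + penalty)

# Hypothetical results for three queries: (rank of known item, start-point penalty)
results = [(1, 0), (3, 2), (2, 1)]

mrr = sum(reciprocal_rank(r) for r, _ in results) / len(results)
mgap = sum(generalized_average_precision(r, p) for r, p in results) / len(results)
```

With a penalty of 0 for every query, mGAP reduces to MRR, which is why the two measures are reported side by side.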
- Mean Average Segment Precision (MASP): ranking + length of (ir)relevant content
  - Segment precision SP[r] at rank r
  - Average Segment Precision:

    ASP = (1/n) · Σ_{r=1}^{N} SP[r] · rel(s_r)

    where rel(s_r) = 1 if relevant content is present in segment s_r, otherwise rel(s_r) = 0
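A minimal Python sketch of ASP, assuming SP[r] is the amount of relevant time within the top-r segments divided by their total time, and n is the number of retrieved segments containing relevant content (these assumptions, the function name, and the example segment lengths are illustrative, not taken from the task definition):

```python
def average_segment_precision(segments):
    """segments: ranked list of (length_seconds, relevant_seconds) tuples.
    A segment counts as relevant (rel(s_r) = 1) when it contains any
    relevant content; SP[r] is assumed to be the relevant time in the
    top-r segments divided by their total time."""
    n_relevant = sum(1 for _, rel in segments if rel > 0)
    if n_relevant == 0:
        return 0.0
    total_time = 0.0
    relevant_time = 0.0
    score = 0.0
    for length, rel in segments:
        total_time += length
        relevant_time += rel
        if rel > 0:                          # rel(s_r) = 1
            score += relevant_time / total_time  # SP[r]
    return score / n_relevant                # (1/n) * sum

# Hypothetical ranked list: a fully relevant 60 s segment, an irrelevant
# 30 s segment, then a 60 s segment containing 30 s of relevant content
asp = average_segment_precision([(60, 60), (30, 0), (60, 30)])
```

Unlike MRR and mGAP, this score drops when irrelevant time is ranked ahead of relevant time, which is what lets MASP reward segmentations that waste less of the user's viewing time.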
Relevance Evaluation via Crowdsourcing: Amazon MTurk HIT
Evaluation: Linking sub-task
* Brave New Task issue: Late evaluation results :-(
Relevance Evaluation via Crowdsourcing: Details
- Assessment level:
  - Videos
  - Provided segments: length varies across runs and participants
- Workers' explanations of their relevance judgements:
  - Relevant:
    - "Same" videos
    - Videos on the same topic
    - Same program, different topics
  - Irrelevant:
    - Same program, different topics
    - Different topics
    - Different video styles: interview vs. presentation
Participants
Group                                 Search   Linking
Charles University in Prague (CUNI)      5        –
Dublin City University (DCU)             4        6
Ghent University (MMLab)                 4      4 + 4
INRIA/IRISA                              –        5
University of Twente (UTwente)           3        1
Search sub-task results
Linking sub-task results
Thank you for your attention!
Happy birthday, Robin!
Participants' presentations:
- Search sub-task:
  - Maria Eskevich, Dublin City University
  - Petra Galuscakova, Charles University in Prague
- Search and Linking sub-tasks:
  - Danish Nadeem, University of Twente
  - Tom de Nies, Ghent University
- Linking sub-task:
  - Camille Guinaudeau, INRIA/IRISA
  - Shu Chen, Dublin City University