5th FP7 Networked Media Concertation Meeting, Brussels, 3-4 February 2010
A unIfied framework for multimodal content SEARCH
Short Project Overview
What is STREP I-SEARCH?
IST-STREP Project ID 248296, 2010-2012
The mission
• To provide a novel unified framework for multimodal content indexing, sharing, search and retrieval.
• To handle specific types of multimedia and multimodal content (text, 2D image, sketch, video, 3D objects and audio) along with real-world and emotional information.
• To develop a highly user-centric search engine (only the content of interest will be delivered to the end-users).
• To optimally present the retrieved results to the user via novel visualisation schemes.
• To dynamically adapt the search engine to the end-user's device (varying from a simple mobile phone to a high-performance PC).
What is STREP I-SEARCH?
Objectives
• Research and development of an innovative Rich Unified Content Description (RUCoD).
• Development of intelligent content interaction mechanisms so that only the content of interest will be delivered to the users.
• Provision of a novel way to present the multimodal data retrieved by the search engine, utilising visual analytics technologies.
What is STREP I-SEARCH?
Main Objective
The aim of the I-SEARCH project is the development of the first search engine able to handle multimedia and multimodal content (text, 2D image, sketch, video, 3D objects, audio and combinations of the above), any of which can be used as a query to retrieve any available relevant content of any of the aforementioned types.
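As a rough illustration of what a multimodal query could look like in such a framework, the sketch below models a query in which any combination of the supported content types may be supplied. The class and field names are assumptions for illustration only, not part of the I-SEARCH specification:

```python
from dataclasses import dataclass
from typing import List, Optional

# Illustrative sketch: a query may combine any subset of the content
# types I-SEARCH supports (text, 2D image, sketch, video, 3D object, audio).
@dataclass
class MultimodalQuery:
    text: Optional[str] = None
    image_path: Optional[str] = None
    sketch_path: Optional[str] = None
    video_path: Optional[str] = None
    model3d_path: Optional[str] = None
    audio_path: Optional[str] = None

    def modalities(self) -> List[str]:
        """Names of the modalities actually present in this query."""
        return [name for name, value in vars(self).items() if value is not None]

# A text + image query, as in "find chairs that look like this photo".
q = MultimodalQuery(text="baroque chair", image_path="chair.jpg")
```

The point of the sketch is only that the engine must accept any subset of modalities and treat the combination as a single query.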
What is STREP I-SEARCH?
Objective 1
Research and development of an innovative Rich Unified Content Description (RUCoD).
RUCoD: a multi-layered structure (from low-level to high-level descriptors) that integrates the following types of information:
• Geometrical, topological, temporal and multisensory information
• Multimodal interaction information
• Meta-tags connected with the intrinsic properties of the content (static features such as shape, colour, texture, dimension, etc.)
• Dynamic properties (temporal descriptors: how it behaves, in which activities it is normally used, who uses it, etc.)
• Non-verbal expressive and emotional descriptors
• Social descriptors (how content is related to users, social/collaborative use of the content)
• Content descriptors covering both the behaviour of humans and how content is intended to be elaborated and manipulated, individually or socially
• Real-world data descriptors
Development of novel multimodal annotation propagation algorithms.
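Annotation propagation, in its simplest textbook form, transfers tags from annotated items to unannotated ones whose low-level descriptors are similar. The sketch below uses plain k-nearest-neighbour majority voting; the function names, descriptor vectors and the k-NN choice are illustrative assumptions, not the project's actual algorithm:

```python
import math
from collections import Counter

def propagate_annotations(annotated, unannotated, k=2):
    """Give each unannotated item the majority tag among its k nearest
    annotated neighbours in descriptor space (Euclidean distance).

    annotated:   list of (descriptor_vector, tag) pairs
    unannotated: list of descriptor vectors
    Returns one propagated tag per unannotated item.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    tags = []
    for vec in unannotated:
        neighbours = sorted(annotated, key=lambda item: dist(vec, item[0]))[:k]
        majority = Counter(tag for _, tag in neighbours).most_common(1)[0][0]
        tags.append(majority)
    return tags

# Toy 2-D descriptors: two annotated "chair" items, one "lamp" item.
annotated = [([0.0, 0.0], "chair"), ([0.1, 0.1], "chair"), ([5.0, 5.0], "lamp")]
new_tags = propagate_annotations(annotated, [[0.2, 0.0], [4.8, 5.1]], k=1)
```

In a RUCoD setting the descriptor vectors would come from the low-level layer of the description rather than being hand-written as here.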
What is STREP I-SEARCH?
Objective 2
Development of intelligent content interaction mechanisms so that only the content of interest will be delivered to the users.
• Natural, expressive and multimodal interfaces.
• Personal and social-based relevance feedback mechanisms (incl. recommendation-based).
• Social and collaborative behaviour of users will be fully exploited.
• Users can better express what they want to retrieve.
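Personal relevance feedback of the kind mentioned above is classically implemented with the Rocchio method: the query vector is moved toward results the user approved and away from results the user rejected. The sketch below shows that standard technique; it is not necessarily the mechanism I-SEARCH adopted:

```python
def rocchio(query, relevant, non_relevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Classic Rocchio relevance feedback: shift the query vector toward
    the centroid of user-approved results and away from the centroid of
    rejected ones. All vectors share the query's dimensionality."""
    def centroid(vectors):
        if not vectors:
            return [0.0] * len(query)
        return [sum(col) / len(vectors) for col in zip(*vectors)]

    rel_c = centroid(relevant)
    non_c = centroid(non_relevant)
    return [alpha * q + beta * r - gamma * n
            for q, r, n in zip(query, rel_c, non_c)]

# One approved result pulls the query toward the second dimension.
updated = rocchio([1.0, 0.0], relevant=[[0.0, 1.0]], non_relevant=[])
```

The alpha/beta/gamma weights are the conventional textbook defaults; a real system would tune them per collection.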
What is STREP I-SEARCH?
Objective 3
Provide a novel way for presentation of the multimodal data retrieved by the search engine, using Visual Analytics technologies.
• Advanced reasoning algorithms to drive the visualisation of RUCoD-compliant I-SEARCH content.
• Visual Analytics present the search results in the optimal way.
• Users find the results that optimally match their query in a fast and efficient way.
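One simple way retrieved results could be grouped before being handed to a visualisation layer is sketched below: a deliberately naive greedy clustering over descriptor vectors, so that each cluster can be rendered as one visual group. The function name, threshold and distance choice are illustrative assumptions, not the project's Visual Analytics pipeline:

```python
import math

def cluster_results(vectors, threshold=1.0):
    """Greedy single-pass clustering: each result joins the first cluster
    whose seed lies within `threshold` (Euclidean distance), otherwise it
    starts a new cluster. Returns lists of result indices per cluster."""
    clusters = []  # (seed_vector, member_indices) pairs
    for i, vec in enumerate(vectors):
        for seed, members in clusters:
            if math.dist(seed, vec) <= threshold:
                members.append(i)
                break
        else:
            clusters.append((vec, [i]))
    return [members for _, members in clusters]

# Results 0 and 1 are near each other; result 2 is far away.
groups = cluster_results([[0.0, 0.0], [0.5, 0.0], [10.0, 10.0]], threshold=1.0)
```

A visualisation front-end could then lay out each group as a cluster on screen, letting users scan result neighbourhoods instead of a flat ranked list.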
I-SEARCH Innovation
Innovation
Novel Rich Unified Content Description (RUCoD) that integrates the following features:
• Intrinsic properties (low-level features) of the supported content types (text, 2D image, sketch, video, 3D objects, audio and combinations of the above), as well as several properties of each type (e.g. shape, colour, texture, dimension).
• Non-verbal expressive and emotional descriptors.
• Social descriptors.
Further innovations:
• Enrichment of RUCoD with additional contextual information.
• A novel multimodal annotation propagation mechanism based on RUCoD.
• Novel multimodal interaction mechanisms that can capture the emotional, expressive and social information conveyed by both individuals and groups of expert and non-expert users.
• Novel RUCoD data representations and transformations to support conversion of all types of conflicting and dynamic data in ways that support visualisation and analysis.
• Novel techniques for the analysis and presentation of retrieved multimedia content utilising Visual Analytics technologies.
• Novel business cases clearly illustrating the contribution of I-SEARCH.
I-SEARCH Use Cases
Use Case 1: Search for music content
Users search the ANSC ethnomusicological multimedia digital archives utilising the novel multimodal search and retrieval framework proposed by I-SEARCH.
Traditional search of music archives usually relies on descriptive methods (mainly textual: for example author, title, etc.).
In this scenario, several querying modalities will be utilised, such as music, image or video, as well as user-related information including gesture and emotional cues.
The basic idea is to use more "natural" and expressive interfaces for narrowing and adapting the search experience to what the user wants.
To realise multimodal search and retrieval, a RUCoD representation of the ANSC content will be available.
I-SEARCH Use Cases
Use Case 2: Search and retrieval of 3D furniture models
In the area of furniture data modelling, content is available in explicit catalogues comprising 3D (and 2D) data models.
Using I-SEARCH, users will be able to perform content-based search using relevant content (photos, 3D models, videos) as input.
For all the multimedia content stored in furniture catalogues, the RUCoD description will be generated and stored along with the content. This uniform content description will enable efficient search and retrieval of relevant content irrespective of the query input type.
Additional input related to real-world information, such as the location of the user, can also be provided.
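As an illustration of query-type-independent retrieval over a uniform description, the sketch below assumes every catalogue item and every query, whatever modality it came from, has been reduced to a descriptor vector, and ranks items by cosine similarity. The item names and vectors are invented for the example:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def search(query_vec, catalogue, top_k=3):
    """Rank catalogue items (id -> descriptor vector) by similarity to the
    query descriptor; the query's original modality no longer matters."""
    ranked = sorted(catalogue.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [item_id for item_id, _ in ranked[:top_k]]

# Toy catalogue with hand-made 2-D descriptors.
catalogue = {"sofa_01": [1.0, 0.0], "table_07": [0.0, 1.0], "sofa_02": [0.9, 0.1]}
hits = search([1.0, 0.05], catalogue, top_k=2)
```

Whether the query vector came from a photo, a sketch or a 3D model, the ranking step is identical, which is the point of a uniform description.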
I-SEARCH Use Cases
Use Case 3: Search and retrieval of 3D game models
Customization of 3D video games both on home entertainment systems (such as personal computers and video game consoles) and on emerging mobile entertainment platforms (such as mobile phones, handheld video games and PDAs).
Retrieval and exchange of game avatars, objects, levels and, in general, any multimedia object that could be added to a game.
The user can use as queries:
• 3D objects, such as avatars, vehicles, scenes or other artefacts
• 2D photos (in the sense of snapshots) or videos of items or avatars of online players
The retrieved results may include:
• Other 3D objects, avatars, vehicles, scenes
• Advanced non-geometric information such as commercial information and related products
I-SEARCH Architecture
RUCoD Specifications
Multimodal Annotation Propagation
Multimodal Interaction
Visual Analytics Presentation
I-SEARCH Activities
Potential technical interactions between Projects
I-SEARCH fits mainly into the A/V search cluster and could collaborate with the following projects:
• P2P-Next – A common issue for exploitation between P2P-Next and I-SEARCH is the end-user experience on various terminals.
• VirtualLife – VirtualLife can provide 3D material to I-SEARCH.
• PetaMedia – The focus of PetaMedia on multimedia content analysis and social peer-to-peer networks can be useful to I-SEARCH, both for developing the low-level content descriptors and for defining the social descriptors.
• COAST – I-SEARCH could use the CCN approach to be developed in COAST so as to test search efficiency in these networks.
Events of Joint Interest
• FIA Valencia, Search Break Out Session, 14-15 April 2010, Valencia, Spain
• UCMedia 2010, 1-3 September 2010, Palma de Mallorca, Spain
• SHREC 2010, http://www.aimatshape.net/event/SHREC
• 3rd Eurographics Workshop on 3D Object Retrieval, 2 May 2010, Norrköping, Sweden, co-event of EuroGraphics 2010
• FIA Ghent, Search Break Out Session, 16-17 December 2010, Ghent, Belgium
I-SEARCH: Overview
http://www.isearch-project.eu
Thank you for your attention!