2009/25 · www.digiteo.fr

Emmanuel Pietriga & Michel Beaudouin-Lafon

WILD: Wall-sized Interaction with Large Data sets

ABSTRACT

WILD is an experimental high-resolution, interactive platform for conducting research on collaborative human-computer interaction and the visualization of large data sets. The platform is accessible to scientists from other disciplines, including astrophysicists, biologists, and chemists, as well as computer scientists, to visualize, explore and analyze their data. The platform was officially opened on June 19, 2009.

Partners

• In Situ (Situated Interaction), joint team between INRIA and LRI (UMR CNRS - Université Paris-Sud);

• Aviz (Visual Analytics), INRIA;

• AMI (Architectures and Models for Interaction), LIMSI (UPR CNRS associated with Université Paris-Sud).

Collaborating laboratories: Institut d’Astrophysique Spatiale (IAS), Institut de Biochimie et Biophysique Moléculaire et Cellulaire (IBBMC), Institut de Chimie Moléculaire et des Matériaux d’Orsay (ICMMO), Institut de Génétique et Microbiologie (IGM), Laboratoire de l’Accélérateur Linéaire (LAL), Laboratoire d’Informatique pour la Mécanique et les Sciences de l’Ingénieur (LIMSI), Laboratoire de Neuroimagerie Assistée par Ordinateur (LNAO), Laboratoire de Mathématiques Appliquées aux Systèmes (MAS).

Technical Details

• Wall-sized tiled display: 32 LCDs, 20480 x 6400 pixels (131 Mpixels), 5.50 m x 1.80 m;

• Multi-touch tabletop display: tracks up to 25 contact points simultaneously;

• 3D motion tracking: real-time, with sub-millimeter precision;

• Visualization cluster: 16 nodes plus 2 front-ends, 36 graphics cards, 144 CPU cores, 192 GB RAM, 10 TB storage, Gigabit network.
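The wall's aggregate figures follow directly from its tile layout. A minimal sketch of the arithmetic, assuming an 8 x 4 grid of 2560 x 1600 panels (the grid and per-panel resolution are assumptions consistent with the totals quoted above, not taken from the brief):

```python
# Hypothetical tile layout: 8 columns x 4 rows of 2560 x 1600 LCD panels.
# These per-panel figures are assumptions chosen to match the quoted totals.
COLS, ROWS = 8, 4
PANEL_W, PANEL_H = 2560, 1600

total_w = COLS * PANEL_W              # aggregate width in pixels
total_h = ROWS * PANEL_H              # aggregate height in pixels
megapixels = total_w * total_h / 1e6  # total pixel count in Mpixels

print(total_w, total_h, round(megapixels))  # 20480 6400 131
```

The product matches the quoted 20480 x 6400 resolution and 131 Mpixels.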

Interaction and Collaboration

WILD focuses on interaction, providing users with a 3D real-time motion capture system, multi-touch tabletop displays and other devices. Unlike with other wall-sized displays, users will be able to interact and collaborate directly, to both visualize and manipulate heterogeneous data sets.

Multi-scale Interaction lets users navigate through large and complex data sets by visualizing them at different scales. The high-resolution wall affords multi-scale interaction through simple locomotion: approaching the wall reveals details. Motion tracking will enable us to design new visualization and navigation techniques that use full-body motion to control scale.
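The locomotion-based scale control described above can be sketched as a mapping from the user's tracked distance to a zoom level. This is a minimal illustration; the distance range, zoom bounds, and linear mapping are assumptions, not the platform's actual technique:

```python
# Hypothetical sketch: the user's distance from the wall (from the motion
# capture system) drives the zoom level, so walking closer reveals detail.
# The 0.5-4.0 m range and the 8x-1x zoom bounds are illustrative values.
def zoom_for_distance(d_m, near=0.5, far=4.0, z_near=8.0, z_far=1.0):
    """Map distance d_m (metres) to a zoom factor: closer => more detail."""
    d = min(max(d_m, near), far)           # clamp to the tracked range
    t = (d - near) / (far - near)          # 0 at the wall, 1 far away
    return z_near + t * (z_far - z_near)   # linear interpolation

print(zoom_for_distance(0.5))  # 8.0 (at the wall: full detail)
print(zoom_for_distance(4.0))  # 1.0 (far away: overview)
```

A real system would likely smooth the tracked distance and use a non-linear mapping, but the principle is the same.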

Multi-surface Interaction manages data displayed on multiple surfaces such as the wall, tabletop display, and mobile devices (PDAs, iPod Touch, mobile phones, etc.). A key issue is to provide efficient techniques to help users transfer information seamlessly from one surface to another. The motion tracking system will offer a unique opportunity to investigate new multi-surface interaction techniques.
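One classic way to transfer information between surfaces is a pick-and-drop style interaction: an object is "picked" on one surface and "dropped" on another. A minimal sketch, where the class, the surface names, and the event shapes are all illustrative (the brief does not describe the platform's actual transfer protocol):

```python
# Hypothetical pick-and-drop broker shared across display surfaces.
# Each user has one "hand": picking stores the object, dropping releases it
# on the target surface at the given position.
class PickAndDrop:
    def __init__(self):
        self._held = {}                     # user id -> picked object

    def pick(self, user, surface, obj):
        self._held[user] = {"obj": obj, "from": surface}

    def drop(self, user, surface, pos):
        item = self._held.pop(user, None)
        if item is None:
            return None                     # nothing picked: ignore the drop
        return {"obj": item["obj"], "from": item["from"],
                "to": surface, "pos": pos}

pad = PickAndDrop()
pad.pick("alice", "tabletop", "galaxy-scan-42")
event = pad.drop("alice", "wall", (10240, 3200))
print(event["to"], event["obj"])  # wall galaxy-scan-42
```

With motion tracking, the "hand" could literally be the user's tracked hand, so picking and dropping work across any surface the user can reach or point at.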

Multi-user Interaction supports users collaborating to achieve a task, users interacting simultaneously on the same data set, and the exchange of data among users. WILD will focus on collaborative interactions involving multiple display and input surfaces. Typical situations include two users working on the same data set: one sitting at the table with a global view of the wall display, the other standing closer to the wall, getting detailed information about a region of the screen.
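The overview+detail scenario above implies a coordinate mapping between surfaces: the tabletop shows a miniature of the whole wall, and a collaborator's detail region on the wall appears as a rectangle on that overview. A sketch of the mapping, where the wall resolution is the platform's but the tabletop viewport size and the mapping itself are assumptions:

```python
# Hypothetical overview+detail mapping: the tabletop renders a miniature of
# the 20480 x 6400 wall, and wall-pixel regions are mirrored onto it.
WALL = (20480, 6400)   # wall resolution in pixels (from the brief)
TABLE = (1920, 600)    # assumed tabletop overview viewport, same aspect ratio

def wall_to_overview(rect):
    """Map a wall-pixel rectangle (x, y, w, h) onto the tabletop overview."""
    sx, sy = TABLE[0] / WALL[0], TABLE[1] / WALL[1]
    x, y, w, h = rect
    return (round(x * sx), round(y * sy), round(w * sx), round(h * sy))

# A user at the wall inspects a 2560 x 1600 region near the centre; the
# seated user sees it as a small rectangle on the tabletop overview.
print(wall_to_overview((8960, 2400, 2560, 1600)))  # (840, 225, 240, 150)
```

Keeping the two views consistent in real time is exactly the kind of coordination the visualization cluster and shared input surfaces make possible.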

Participatory Design

Our research method is based on involving end users, such as astrophysicists and biologists, throughout the design process. Together, we will design the collaborative interaction and visualization techniques that will support their activities:

• We will analyze their needs and create early prototypes;

• We will observe their use of the prototypes and collect their ideas for improvement;

• We will conduct controlled experiments and longitudinal studies;

• We will refine the prototypes.

In the end, we will have designed and validated techniques that better suit the needs of scientists in various disciplines, based on real usage scenarios.

Contact: Emmanuel Pietriga [email protected] · Michel Beaudouin-Lafon [email protected]
