electronic visualization laboratory, university of illinois at chicago
Towards Lifelike Interfaces That Learn
Jason Leigh, Andrew Johnson, Luc Renambot, Steve Jones,
Maxine Brown
The Electronic Visualization Laboratory
• Established in 1973
• Jason Leigh, Director; Tom DeFanti, Co-Director; Dan Sandin, Director Emeritus
• 10 full-time staff
• Interdisciplinary: Computer Science, Art & Communication
• 30 students (15 funded)
• Research in:
– Advanced display systems
– Visualization and virtual reality
– High-speed networking
– Collaboration & human-computer interaction
• 34 years of collaboration with Science, Industry & Arts to apply new computer science techniques to these disciplines.
• Major support by NSF and ONR.
Goals in 3 Years
• Life-sized Avatar capable of reacting to speech input with naturalistic facial and gestural responses.
• A methodology of how to capture and translate human verbal and non-verbal communication into an interactive digital representation.
• Deeper understanding of how to create believable/credible avatars.
System Components
[System architecture diagram]
• Input: Speech Recognition, Facial Expression Recognition, Eye-tracking
• Understanding: Natural Language Processing, Knowledge Processing (AlexDSS)
• Knowledge Capture: Textual & Contextual Information, Facial & Body Motion / Performance Capture, Phonetic Speech Sampling
• Output (Responsive Avatar Engine): Speech Synthesis, Lip Synch, Gestural Articulation, Facial Articulation, driving the Responsive Avatar
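The flow from a recognized utterance through knowledge processing to a synchronized avatar response can be sketched as a simple event loop. This is an illustrative sketch only; all class and method names below are hypothetical and do not correspond to the actual AlexDSS or Responsive Avatar Engine interfaces.

```python
# Hypothetical sketch of the component pipeline above.
# Speech recognition and NLP are assumed to have already
# produced a text `utterance`.

class KnowledgeBase:
    """Stands in for AlexDSS-style knowledge processing."""
    def __init__(self, facts):
        self.facts = facts

    def answer(self, query):
        return self.facts.get(query.lower(), "I don't know yet.")


class ResponsiveAvatarEngine:
    """Maps a text response to synchronized output channels."""
    def render(self, text):
        # In the real system these fields would drive speech
        # synthesis, lip sync, and facial/gestural articulation
        # in parallel rather than returning a dictionary.
        return {
            "speech": text,
            "lip_sync": text.split(),  # placeholder for phoneme timing
            "gesture": "inquisitive" if "?" in text else "neutral",
        }


def handle_utterance(utterance, kb, engine):
    response = kb.answer(utterance)
    return engine.render(response)


kb = KnowledgeBase({"who founded evl": "Tom DeFanti and Dan Sandin, in 1973."})
out = handle_utterance("who founded EVL", kb, ResponsiveAvatarEngine())
print(out["speech"])
```

The design point the diagram makes, and the sketch mirrors, is that understanding (knowledge processing) and presentation (the avatar engine) are separate components, so either can be replaced independently.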
EVL Year 1
• Digitize facial images and audio of Alex
• Shadow Alex to capture information about his mannerisms
• Create 3D Alex, focusing largely on facial features
• Prototype initial Responsive Avatar Engine (RAE) & merge initial avatar, speech recognition, AlexDSS, pre-recorded voices
• Validate provision of non-verbal avatar cues; evaluate efficacy of cues
EVL Year 2
• Full-scale motion & performance capture to create gestural responses to AlexDSS
• Speech synthesis using Alex’s voice patterns to create verbal responses to AlexDSS
• Use eye-tracking to begin to experiment with aspects of non-verbal communication
• Evaluate merging of verbal and non-verbal information in users’ understandings of:
– avatar believability and credibility (ethos)
– information retrieved
– avatar emotional appeals (pathos)
EVL Year 3
• Utilize camera-based recognition of facial expressions as additional non-verbal input
• Conduct user studies:
– relative to believability and credibility (ethos)
– to correlate attention to non-verbal communication with comprehension and retention
– to assess the value of avatar emotional appeals (pathos)
– to address formation of longer-term relationships between avatar and user
[Setup diagram: camera, microphone, life-sized projection]
Thanks!
• This project was supported by grants from the National Science Foundation
• Awards CNS 0703916 and CNS 0420477