
Building Semi-Immersing Human-Computer Interaction Environment of Virtual Teleconferencing Based on Multi-Screen General Projection

Sun Lifeng (1), Zhong Yuzhuo (1), Yang Xing (2)

(1) Dept. of Computer Science and Technology, Tsinghua University, Beijing, 100084, China
(2) Dept. of System Engineering and Mathematics, National University of Defense Technology, Changsha, 410073, China

Abstract  Building a human-computer interaction environment which supports natural communication and interaction among a participant group is the key issue of a virtual teleconferencing system, which needs to establish the correct space relationship of participants and to synthesize real-size video into the virtual environment. In this paper, we introduce a method that uses multiple projection screens driven by network-connected PCs to construct a spatial and realistic semi-immersing virtual teleconferencing environment based on general projection transformation. The experiment shows that participants can interact and collaborate with a natural interaction model there.

Keywords  human-computer interaction, virtual teleconferencing, projection, virtual reality

I. Introduction

To implement a collaborative working environment for resolving complex problems such as military strategy that involve multiple research fields, by supporting natural interaction and collaboration of people from different research fields, the conception of the Virtual Teleconferencing System [1,2,3] has been put forward in recent years. On one side, a common space coordinate system is established in the virtual conferencing space, which defines the space scope, describes the scene and constructs a realistic virtual meeting place. On the other

+ Supported by NSF No. 69973025
* Corresponding author, Email: sunlf@mail.tsinghua.edu.cn

0-7803-7547-5/02/$17.00 © 2002 IEEE

side, all the participants have their own spatial properties and join the virtual conferencing application with a walkthrough method; they can interact with each other using natural modes such as body language and eye contact. Resorting to VR technology, participants can cooperate on virtual entities or experience a shared virtual environment to carry out their group collaboration.

The main issue of a virtual teleconferencing system is to build the human-computer interaction environment. The characteristics and requirements of natural interaction synthesizing are described as follows:

Spatial

Natural interaction such as body language, eye contact and gaze awareness depends on a given space coordinate reference system. For example, suppose three geographically separated conference participants are A, B and C respectively. If A is gazing at B and C is gazing at A, B can see A's face and C can see A's profile (see Figure 1); C can thus find the fact that A is gazing at B. The precondition of realizing this multi-party interaction is that A, B and C are in the same coordinate space. The object of space synthesizing is to establish the right space relationship among participants in the virtual conference space, helping participants gain the location and direction of others.


Figure 1 The influence of space position

Realism

In such a collaborative seminar work hall, the communication among participants is the key method of understanding collaboration [4]; participants want to get the same effect of communication as they experience in the real world. This requires that life-size video of participants, which presents their expressions and behaviors, must be integrated into the virtual conferencing space.

Real Time

To support normal communication and interaction, the synthesizing of video and audio of different participants must match the real-time needs of the media. For example, people can tolerate a 0.1 s delay for audio; the maximum delay is 0.2 s, otherwise they will feel uncomfortable while talking.

We design and implement a human-computer interaction environment for our virtual teleconferencing prototype, which uses multiple screens driven by network-connected PCs to support the three issues described above.

II. Virtual Conference Space Model

Constructing the virtual environment is the basis of implementing the VST. The Virtual Conference Space is the virtual environment shared by all participants; it defines the space reference coordinates for participants' behavior and interaction, controls the space awareness and interaction, and provides services and resources for supporting collaborative work. The VCS can be represented by a quaternion:

VCS ::= <O, M, S, R>

where,
- O represents the space structure and organization,
- M represents the management of space awareness and interaction,
- S represents the services for supporting collaborative work, such as group communication, role management, concurrency control, etc.,
- R represents the resource set, such as data and application tools.
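The quaternion above can be sketched as a plain data structure. This is a minimal illustration, not the paper's implementation; all class and field names are our own assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualConferenceSpace:
    """Illustrative sketch of the VCS quaternion <O, M, S, R>."""
    organization: dict = field(default_factory=dict)  # O: space structure and organization
    management: dict = field(default_factory=dict)    # M: space awareness and interaction management
    services: list = field(default_factory=list)      # S: services supporting collaborative work
    resources: dict = field(default_factory=dict)     # R: resource set (data, application tools)

# Example instance with the services the text enumerates
vcs = VirtualConferenceSpace(
    services=["group communication", "role management", "concurrency control"],
)
```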

The space model of the VCS is composed of two parts: global conference space and local conference space. The Global Conference Space is the space description of the conference environment, including the distribution of participants and scene objects. The Local Conference Space is the conference space which an individual participant interacts with; it includes two parts: virtual space and real space. The virtual space is the part of the 3D global conference space viewed from the participant's space position and orientation in the scene graph. The real space is the local conference room where the participant joins the conference. Figure 2 gives a demonstration of the virtual conferencing space from the participant's viewpoint.

Figure 2 A demonstration of virtual conferencing space

We design our global conference space model combining graphics and image technology. The whole GCS is a 3D space; we maintain our global conference space in a hierarchical structure called a scene graph. The scene graph is the structure that holds all of the elements of the scene, such as geometry, lights, participant objects, and positional information, in the form of a directed acyclic hierarchical graph. To increase the complexity and realism of our conference space, we add a real-image-based space model into the scene graph, which we call the panorama environment [5]. The panorama environment reflects the 360° view image of the real conference space, as shown in Figure 3. We take the panorama as the background of our virtual conference space, thus participants can feel they are in the real conference room.

Figure 3 The panorama of the meeting room
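The hierarchical scene-graph structure described above can be sketched as follows. The node names (panorama background, lights, participant objects) follow the text; the class design itself is a hypothetical simplification, not the paper's system.

```python
class SceneNode:
    """A node in the scene graph; children form a directed hierarchy."""
    def __init__(self, name, position=None):
        self.name = name
        self.position = position or [0.0, 0.0, 0.0]  # positional information
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def walk(self):
        """Depth-first traversal of the hierarchy."""
        yield self
        for c in self.children:
            yield from c.walk()

# Global conference space: panorama background plus scene elements
root = SceneNode("GCS")
root.add(SceneNode("panorama_background"))  # 360-degree image of the real room
root.add(SceneNode("lights"))
participants = root.add(SceneNode("participants"))
participants.add(SceneNode("participant_A", position=[1.0, 0.0, -2.0]))
participants.add(SceneNode("participant_B", position=[-1.0, 0.0, -2.0]))

names = [n.name for n in root.walk()]
```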

III. VCS Construction and Display

A. VCS Construction

We design a semi-immersing display environment to help participants build the space relations of a real conference, using three projection screens in front of the participants, shown as Figure 4.

Figure 4 HCI environment construction

This interaction environment can not only satisfy the view angle of a human to let him feel the space and realism, but also support multiple users in the same environment.

B. VCS Display

Participants see the projection from global space to local space according to their different positions and directions. To get the correct projection, it must follow three steps:

1. Virtual camera tracking

Each participant has his own virtual camera, which represents his view of the total space according to his position and direction.

2. Space clipping

According to the participant's virtual camera, clip the scene graph to determine the space scope to be projected.

3. Projection transformation

Transform the visible space scope from global coordinates to local space coordinates and render it to display.

To represent the space relationship of the scene correctly, we deploy the perspective projection as the projection transformation for the multi-screen based virtual display environment. Because each screen corresponds to the same viewpoint, the simplest method for projection is just rotating the viewpoint of each screen by a given angle to display the scene of a different direction at the same position. But this method will cause the space aberration shown as Figure 5.

Figure 5 Space aberration caused by viewpoint rotation

The reason for the space aberration shown above is that the viewpoint rotation would cause the rotation of the projection plane, thus resulting in discontinuity at the edges of the different projection planes. To resolve this problem, we introduce a general projection based method for multi-screen environment mosaic.
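The three-step display pipeline can be sketched as below. The clipping test is deliberately simplified to a symmetric horizontal field of view, and all function names and parameter values are illustrative assumptions, not the paper's implementation.

```python
import math

def track_camera(position, direction):
    """Step 1: each participant's virtual camera follows his position and direction."""
    return {"pos": position, "dir": direction}

def clip_scene(scene_points, camera, fov_deg=90.0):
    """Step 2: keep only points inside the camera's (simplified) horizontal FOV."""
    half = math.radians(fov_deg) / 2.0
    visible = []
    for x, y, z in scene_points:
        dz = z - camera["pos"][2]
        # point must be in front of the camera (looking down -z) and within the FOV
        if dz < 0 and abs(math.atan2(x - camera["pos"][0], -dz)) <= half:
            visible.append((x, y, z))
    return visible

def project(point, z_prp, z_vp):
    """Step 3: perspective-project a visible point onto the view plane."""
    x, y, z = point
    d_p = z_prp - z_vp
    s = d_p / (z_prp - z)
    return (x * s, y * s)

camera = track_camera((0.0, 0.0, 0.0), (0.0, 0.0, -1.0))
scene = [(0.0, 1.0, -5.0),   # in front, inside the FOV
         (10.0, 0.0, -1.0),  # in front, but outside the FOV
         (0.0, 0.0, 3.0)]    # behind the camera
visible = clip_scene(scene, camera)
images = [project(p, z_prp=0.0, z_vp=-1.0) for p in visible]
```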

First, we introduce the conception of the virtual viewpoint, which represents the participant's position and direction. The view range of the virtual viewpoint is the sum of each screen's view range, shown as Figure 6.

Figure 6 Multi-screen virtual environment construction

The projection reference point of view M is on the z-axis, while the projection reference points of views L and R are not on the z-axis; this is called general projection transformation. L, M and R together construct the view of the virtual viewpoint and determine the scene of the virtual conferencing space which can be seen by the participant.

We use the pinhole model as the camera projection model. For the middle view plane, hypothesize that the projection reference point is at z_prp on the z-axis and the view plane is at z_vp, shown as Figure 7.

Figure 7 Perspective projection of a point

The projection equation is:

    x_p = x · d_p / (z_prp − z),    y_p = y · d_p / (z_prp − z)    (1)

where d_p = z_prp − z_vp is the distance between the projection reference point and the view plane. Using three-dimensional homogeneous coordinates, the equation above can be represented in matrix form. The projection coordinates on the view plane calculated with homogeneous coordinates are:

    x_p = x_h / h,    y_p = y_h / h    (2)

where the homogeneous factor is:

    h = (z_prp − z) / d_p    (3)
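A quick numerical check shows that the direct form (1) and the homogeneous form (2)-(3) give the same projected coordinates. The sample point and parameter values below are our own, purely for illustration.

```python
def project_direct(x, y, z, z_prp, z_vp):
    """Equation (1): direct perspective projection onto the view plane."""
    d_p = z_prp - z_vp
    return (x * d_p / (z_prp - z), y * d_p / (z_prp - z))

def project_homogeneous(x, y, z, z_prp, z_vp):
    """Equations (2)-(3): the same projection via the homogeneous factor h
    (on-axis case, where the homogeneous coordinates are simply x_h = x, y_h = y)."""
    d_p = z_prp - z_vp
    h = (z_prp - z) / d_p        # equation (3)
    x_h, y_h = x, y
    return (x_h / h, y_h / h)    # equation (2)

p = (2.0, 1.0, -4.0)
a = project_direct(*p, z_prp=1.0, z_vp=0.0)
b = project_homogeneous(*p, z_prp=1.0, z_vp=0.0)
```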

For the left and right view planes, we take the following two steps to achieve the projection matrix:

(1) shearing the view volume to make its center line perpendicular to the view plane;

(2) scaling the view volume using a 1/z scale factor.

The operation on the general projection view volume with the projection window is illustrated as Figure 8, which transforms all the points on the center line, including the center point of the window, to the line which is perpendicular to the view plane.
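The shearing step (1) can be sketched as below: a z-axis shear that maps the off-axis center line (from the projection reference point through the window center) onto the line through the reference point perpendicular to the view plane. The function name and the sample geometry are assumptions for illustration.

```python
def make_shear(x_c, y_c, x_prp, y_prp, z_prp, z_vp):
    """Build the shear for a general (off-axis) view volume, given the
    window center (x_c, y_c) on the view plane z_vp and the projection
    reference point (x_prp, y_prp, z_prp)."""
    d_p = z_prp - z_vp
    sh_x = (x_c - x_prp) / d_p
    sh_y = (y_c - y_prp) / d_p
    def shear(x, y, z):
        # x' = x + sh_x * (z - z_prp), y' = y + sh_y * (z - z_prp); z unchanged
        return (x + sh_x * (z - z_prp), y + sh_y * (z - z_prp), z)
    return shear

# Off-axis window centered at (2, 0) on the view plane z_vp = 0, with the
# projection reference point at (0, 0, 1): a "left/right screen" case.
shear = make_shear(x_c=2.0, y_c=0.0, x_prp=0.0, y_prp=0.0, z_prp=1.0, z_vp=0.0)

center = shear(2.0, 0.0, 0.0)  # window center -> onto the perpendicular line
midway = shear(1.0, 0.0, 0.5)  # halfway along the center line -> same line
```

After the shear, every point on the center line has x = x_prp and y = y_prp, so the sheared volume can be treated like the on-axis (middle-screen) case.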


real meeting room to combine the virtual environment and real meeting room seamlessly.
