Page 1: REAL-TIME MOTION DATA SEQUENCING AND …ufdcimages.uflib.ufl.edu/AA/00/00/49/60/00001/Anton Yudin -thesis... · real-time motion data sequencing and aggregation in virtual environments/live

REAL-TIME MOTION DATA SEQUENCING AND AGGREGATION IN VIRTUAL ENVIRONMENTS/LIVE PERFORMANCE

By

ANTON YUDIN

SUPERVISORY COMMITTEE:

OLIVERIO, JAMES CHARLES, CHAIR

BARMPOUTIS, ANGELOS, MEMBER

DEVANE, BENJAMIN, MEMBER

A PROJECT IN LIEU OF THESIS PRESENTED TO THE COLLEGE OF FINE ARTS OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT

OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF ARTS

UNIVERSITY OF FLORIDA

2011


Summary of Project Option in Lieu of Thesis Presented to the College of Fine Arts of the University of Florida

in Partial Fulfillment of the Requirements for the Degree of Master of Arts

REAL-TIME MOTION DATA SEQUENCING AND AGGREGATION IN VIRTUAL ENVIRONMENTS/LIVE PERFORMANCE

By

Anton Yudin

August 2011

Chair: Oliverio, James Charles
Major: Digital Arts and Science

This paper briefly describes an open framework that can route, sequence, and aggregate data from different motion capture systems. The project uses open standards like J2EE, JSON, and XML to achieve these goals. The framework includes a custom protocol designed to send and receive motion data, a router that manages connections from different sources and consumers of motion data, a 3D environment that is used as a client for the system, and a web-based user interface to configure the router of motion data. The protocol and the router were optimized to be used in a distributed environment with multiple simultaneous clients and sources. The system allows dancers and performing artists to collaborate in distributed performances around the world utilizing computer networks. The working system has been successfully demonstrated during several conferences and shows, including eComm-2010, SIGGRAPH-2010, and iDMAA 2010.


MORPHEUS

Help him, Trinity.

Neo allows himself to be helped into one of the chairs.

MORPHEUS

Do you remember when I asked you about an apparatus that could turn a virtual reality into reality?

Neo nods.

MORPHEUS

It’s right here.

He touches Neo’s head.

MORPHEUS

And it’s accessed here.

Larry and Andy Wachowski
The Matrix

1999


Contents

1 Introduction
  1.1 Motion Data
  1.2 3D/Virtual environments and Virtual Reality
  1.3 Human Interface Devices
    1.3.1 Types of Human Interface Devices
    1.3.2 Different ways to connect to a Human Interface Device
    1.3.3 Available universal communication protocols and interfaces

2 Motivation
  2.1 Open environment for science and arts experiments with motion data
  2.2 Requirements for a 3D environment
  2.3 Requirements for a protocol
    2.3.1 Simplicity
    2.3.2 Extensibility
  2.4 Requirements for User Interface

3 Implementation
  3.1 Details
    3.1.1 3D Environment
    3.1.2 Custom JSON/XML based protocol
    3.1.3 Basic Messages
    3.1.4 Serialization and Deserialization
    3.1.5 Message Routing
    3.1.6 Scripting
    3.1.7 Web Based Technologies
  3.2 Challenges
    3.2.1 Complexity of the project
    3.2.2 Protocol performance issues

4 Applications
  4.1 History
  4.2 International Symposium on Mixed and Augmented Reality (ISMAR), Orlando, USA - 2009
  4.3 Emerging Communications Conference and Awards (eComm), San Francisco, California, USA - 2010
  4.4 SIGGRAPH, Los Angeles, California, USA - 2010
  4.5 The International Digital Media Arts Association Conference, Vancouver, Canada - 2010
  4.6 Digital Arts Festival, Redmond, Washington, USA - 2011

5 Acknowledgments


Chapter 1

Introduction

The emergent interest in geographically distributed performance has created greater demand for accessible software solutions that work in heterogeneous environments without introducing unwarranted additional technical complexity to the overall system.

The goal of this project is to create a distributed, scalable, and open framework that allows researchers and artists to experiment in a 3D environment with motion data obtained from different human interface devices.

This framework is divided into four modules: a custom protocol for sending and receiving motion data, a router that manages connections, a 3D environment that is used to visualize motion data, and a web-based user interface to configure the router. All four modules can be extended or substituted individually without compromising other parts of the system. The protocol is based on the meta-languages XML and JSON. The router is built as a web servlet and is managed by a J2EE web container. The user interface uses a RESTful architecture.

1.1 Motion Data

Motion data can be represented by a sequence of "states" of a 3D object. For example, motion data obtained from a human body moving in a room might be represented by a sequence of 3D transformations for each body part (head, hands, legs, etc.). This data might be used to recreate the movements of the person on the screen using an "avatar," trigger musical sounds or visual effects, or analyze problematic patterns in a walk that might be symptoms of a particular disease. Because there are virtually unlimited ways that motion data may be used, instead of creating a focused end-user application, it would be more valuable to create a framework that can be modified and extended to accomplish different goals in different applications.
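As a sketch of this representation (the type and field names below are illustrative, not the framework's actual classes), one frame of motion data can be modeled as a map from body-part names to transformations, and a capture session as an ordered sequence of such frames:

```java
import java.util.List;
import java.util.Map;

// Illustrative model of motion data as a sequence of "states":
// each frame maps a body part to its transformation at one instant.
public class MotionFrame {
    // A minimal per-part state: translation (x, y, z) and Euler angles.
    public record PartState(double[] translation, double[] angles) {}

    public static void main(String[] args) {
        Map<String, PartState> frame = Map.of(
            "Head", new PartState(new double[]{0.0, 1.7, 0.0}, new double[]{0, 0, 0}),
            "LeftHand", new PartState(new double[]{-0.4, 1.1, 0.2}, new double[]{0, 0, 0})
        );
        // A capture session is just an ordered sequence of such frames.
        List<Map<String, PartState>> capture = List.of(frame);
        System.out.println(capture.size() + " frame(s), " + frame.size() + " part(s) tracked");
    }
}
```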


1.2 3D/Virtual environments and Virtual Reality

Today, 3D graphics, 3D games, Virtual Worlds, and Virtual Reality are popular topics, though the actual terms are sometimes misused.

3D graphics is a technology for rendering 3D scenes (buildings, trees, people, cars, furniture) on a 2D screen [1]. There are several ways to accomplish this; from the software development perspective it means using a 3D application programming interface (API) to a library that implements the basic concepts: render a shape, apply a texture, light the scene. The most popular APIs are OpenGL and DirectX [2].

3D games are programs that use 3D graphics and let a player interact with 3D scenes. 3D games cover a broad range of genres that includes action, flight simulator, sport, puzzle, role-playing, card, and chess games. Different genres put a different emphasis on the quality of the 3D graphics. For example, 3D action, sport, and simulator games are known for high-resolution, highly detailed graphics.

Virtual Worlds are special systems (usually distributed and accessible online) that let users engage in different activities in a shared simulated environment. The main difference between Virtual Worlds and 3D games is that in Virtual Worlds there are usually no goals and users do not "win". Users might communicate or participate in virtual activities, yet they do not score points. Virtual Worlds are often used to train a certain group of people in skills that are difficult or expensive to train in the real world.

One definition of Virtual Reality might be a simulated 3D scene where the user interacts with imaginary objects and the interaction mechanism tends to resemble the real mechanism as closely as possible. Modern input and output devices (sometimes called human interface devices) that are often used in Virtual Reality include gloves, motion capture stages, haptic mice, and video goggles. There are different protocols and APIs that let different parts of the system communicate with these devices.

1.3 Human Interface Devices

1.3.1 Types of Human Interface Devices

Human Interface Devices might be divided into two main categories: input and output devices. Some devices might include both input and output characteristics. A good example of an input device is a standard computer mouse. A mouse detects movements and button presses, encodes these events using a protocol, and sends this data to the computer. Video goggles are a good example of an output device. Video goggles receive a signal sent from the computer and display the image for the user.

A good example of a hybrid (input and output) human interface device is a haptic mouse. This device behaves like a regular mouse that sends movement events to the computer, and it also sends feedback from the computer to the user. This kind of device lets the user "touch" a virtual shape or "feel" different types of forces simulated by a program.

Motion capture systems are another example of a human interface device. This type of input device detects 3D positions of different parts of the user's body in real time and sends this data to the computer. There are different ways to accomplish this. Some motion capture systems use wearable sensors that constantly send motion data to the computer. Other motion capture systems use markers and cameras that reconstruct the positions of the human body. Even more advanced systems do not require any type of special clothing or markers and reconstruct the positions of the human body by analyzing pure 2D images taken from several cameras.

1.3.2 Different ways to connect to a Human Interface Device

There are several ways to connect a computer to a human interface device. The most common ways include RS232, USB, and Ethernet connections. Some more complex systems, like motion capture systems, might use proprietary non-standard connections.

1.3.3 Available universal communication protocols and interfaces

There are several protocols that are used today to connect to human interface devices.

The most common is MIDI. MIDI was created in 1982 as a universal protocol that connects digital musical instruments [3]. This protocol supports up to 16 virtual channels that can be used to send and receive simple messages, like keyboard press events. Even though MIDI supports extensions, it is considered obsolete because of its limitations.
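To make the message format concrete (this example is added for illustration, not taken from the thesis): a MIDI channel message is just a status byte, whose low nibble selects one of the 16 channels, followed by one or two data bytes.

```
90 3C 40   note-on,  channel 1, key 60 (middle C), velocity 64
80 3C 00   note-off, channel 1, key 60
```

Every parameter is identified by a fixed numeric position and a 7-bit value range, which is exactly the rigidity that the protocols below try to escape.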

Another popular protocol that is used to send events from different input devices is Open Sound Control (OSC). Unlike MIDI, OSC is carried over standard network transports (typically UDP or TCP over IP). It supports four types of data that can be used to construct a message: integers, floats, C-style strings, and blobs. The OSC protocol was invented mainly to overcome the limitations of the MIDI protocol. Unlike MIDI, this protocol uses textual names for the parameters instead of numbers. Because of this feature, the OSC protocol can be easily extended [4]. Extending the MIDI protocol requires the use of SysEx messages that are not supported by standard devices.
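For comparison, a hypothetical OSC message carrying a head position could look like the following (the address pattern is invented for illustration; ",fff" is the type tag string declaring three float arguments):

```
/performer/1/head/position ,fff 0.0 1.7 0.0
```

Because the address is a plain string, a new device can introduce new addresses without breaking existing receivers.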


Chapter 2

Motivation

There are different packages that work with different sources of motion data. First, each motion capture system comes with its own software development kit (SDK) that allows a developer to receive data from the system using proprietary and closed protocols. Another way is to use a package like Motion Builder that works with different motion capture systems [5]. Unfortunately, those kinds of systems are relatively expensive for researchers and artists, and difficult to extend. An extension to Motion Builder, for example, requires C++ or Python programming skills.

Additionally, in the context of a distributed performance, different performing artists might wish to use different motion capture systems that are not compatible with each other. In this case they would need a method that works with different types of motion data sources and allows the artists to intuitively collaborate in a common virtual environment.

This project tries to solve those issues by setting certain requirements that are discussed below.

2.1 Open environment for science and arts experiments with motion data

Lack of an open and highly customizable environment to experiment with real-time motion data was the main motivation for building the system. One could argue that there are a lot of 3D engines, environments, and packages, yet there are no open and integrated environments that allow the researcher to experiment with motion data.

For example, there are several proprietary and open source game engines available today. Here is a short list of such engines.

Open source engines:

• Blender

• Id Tech 1,2,3, ioquake


• Open Scene Graph

• Panda 3D

• OGRE

• JMonkeyEngine


Proprietary engines:

• Unity 3D

• Unreal Engine

However, none of these packages can be easily connected to a motion capture system. A developer would need to write a custom plug-in for the engine to be able to use the motion data in real-time. Some of the engines (Unity 3D for example) require a special, often expensive license that allows a developer to create custom plug-ins.

2.2 Requirements for a 3D environment

A system that can be used for experiments adds extra requirements:

• Low level access to 3D graphics APIs like OpenGL

• Scriptability using a simple and well known language

• Support for multiple platforms

2.3 Requirements for a protocol

There are several possible ways to connect from a 3D environment to a motion capture system. One way is to create a plug-in that uses a motion capture system's SDK. Using this approach, the system would generate the minimum overhead associated with retrieving motion data from a motion capture system. The main problem with this approach is that an update to the motion capture system SDK or a switch to another motion capture system would require a plug-in update and potentially rebuilding the code that uses the plug-in. The plug-in system would have to be sophisticated enough to support new plug-ins/APIs without the need to rebuild the code. This challenge adds significantly more complexity to the system.


The other way is to create a networking protocol that would allow a 3D environment to connect to and receive data from a motion capture system through a network connection. A new module might be introduced to the system that would behave as a "bridge" between a motion capture system and the 3D environment. While this approach adds overhead to the communication, it also adds flexibility in switching between different motion capture systems, changing APIs (as adding new functionality does not require rebuilding legacy code), and distributing the motion capture systems and the environments that use them.

This second approach was chosen as the basis of this project because of its superior flexibility.

2.3.1 Simplicity

One of the requirements for the protocol is simplicity. The protocol must be simple enough to be easily adapted to different development environments like C/C++, Java, C#, Python, and JavaScript. For example, a binary protocol would create difficulties for scripting languages like JavaScript. JavaScript, by default, does not have a convenient way to work with binary data. A text-based protocol, on the other hand, can be easily parsed in any language that has basic string functions. Additionally, there are several text-based meta-languages that can be used as a foundation for the protocol. The three most frequently used are CSV, XML, and JSON.
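For illustration (these encodings are hypothetical, not the framework's actual wire format), the same simple button state could be written in any of the three candidate formats:

```
CSV:   button-0,pressed
XML:   <buttonState identity="button-0" pressed="true"/>
JSON:  {"buttonState": {"identity": "button-0", "pressed": true}}
```

All three can be produced and consumed with nothing more than string handling, which is what makes them attractive for scripting environments.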

2.3.2 Extensibility

Another requirement for the protocol is extensibility. Adding new elements to the protocol should not break the existing code. Achieving this extensibility using a custom binary protocol requires a significant amount of coding. Using an existing meta-language like XML or JSON, on the other hand, delegates this to the numerous libraries already available. Additionally, most of the modern languages like Java, C#, Python, and JavaScript have built-in standard functions to work with these languages.

2.4 Requirements for User Interface

A developer has multiple options when it comes to the user interface. The user interface can be a standard conventional interface based on AWT/Swing (Java) or Windows Forms (C#), or it can be a web-based interface. The web interface can be a RESTful-based interface or a standard HTML-based interface.

The standalone application, with a conventional user interface, has all of the benefits of an application that has access to the operating/windowing system of the user's computer. The web interface, on the other hand, does not require that the user install any additional software and allows instantaneous access to the system.

A RESTful-based web user interface was chosen because of the distributed nature of the project itself.


Chapter 3

Implementation

3.1 Details

3.1.1 3D Environment

Before making a final decision about a 3D framework that might be used for this project, the author looked at several existing options including Ogre3D, JMonkeyEngine 2/3, Unity 3D, and Id Tech 1/2/3.

All of these frameworks are available for free and most of them come with open source code, which is very important for avoiding limitations while experimenting with different technologies. Unity 3D was eliminated as a choice because of its dependency on its development environment. Additionally, Unity 3D requires the purchase of a special license to have plug-in support, which would be essential for working with motion capture systems. The Id Tech engines looked overly complex and would have required a lot of time to learn. Ogre3D looked promising, yet, because it is considered just a rendering engine and does not provide audio and physics support, it was not chosen as a platform. JMonkeyEngine, on the other hand, had built-in physics support and 3D audio, and was written in Java, which makes it a truly multi-platform solution. Additionally, because JMonkeyEngine is a Java-based project, adding features like networking and scripting is a trivial task.

3.1.2 Custom JSON/XML based protocol

The protocol for this system is based on the concept of different "states" of different "sources". For example, a chair in a 3D space is considered a source and its 3D position is considered a state. If a chair is moved from one position to another, its state is changed.

Each state is represented as an XML or JSON message [6]. Each message has a reference to its source. This lets the system differentiate messages from different sources in one stream of data. Additionally, messages might be combined in containers. That lets the system have more than one "state" per "source". For example, a container message might include 3D positions for the left and right hands of a person. In this case, the container message has the "person" as the source, while the messages that represent 3D positions of the left and right hands have "left hand" and "right hand" as their sources respectively.
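The person/hands example could be encoded roughly as follows (the concrete message below is illustrative, loosely modeled on the JSON messages shown in the serialization section, and is not the framework's exact format):

```json
{"containerState": {
  "identity": "person",
  "states": [
    {"transformationState": {"identity": "left hand",  "translation": [-0.4, 1.1, 0.2]}},
    {"transformationState": {"identity": "right hand", "translation": [ 0.4, 1.1, 0.2]}}
  ]
}}
```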

3.1.3 Basic Messages

The basic messages include:

• ButtonState

• TransformationState

• TransformationContainerState

• MessageState

• ContainerState

ButtonState is a message that represents the state of a button. The button might be pressed or unpressed. The interface of this state includes only one method:

Listing 3.1: ButtonState methods

public boolean isPressed();

TransformationState contains information about 2D/3D transformations for an object. The transformation might be represented as an array of angles, translations, and scaling, or as a transformation matrix.

Listing 3.2: TransformationState methods

public double[] getAngles();
public double[] getTranslation();
public double[] getScaling();
public double[][] getTransformation();

TransformationContainerState is a compact version of a list of TransformationState messages. For example, if a 3D object contains several "parts", and each part has its own transformation parameters, then all of the data may be represented as one message. TransformationContainerState has an array of names and a sequence of all numbers of all matrices.

Listing 3.3: TransformationContainerState methods

public int getDimension();
public String[] getIdentities();
public double[] getValues();
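Reading this interface, the values array presumably stores the identities' matrices back to back, dimension × dimension numbers each, in the order given by getIdentities(). A sketch of that unpacking (the layout and the helper name are assumptions based on the interface, not code from the thesis):

```java
// Sketch: recovering one part's matrix from a flat
// TransformationContainerState-style payload. Assumes values holds
// dimension*dimension numbers per identity, in identity order.
public class ContainerUnpack {
    public static double[][] matrixFor(int index, int dimension, double[] values) {
        int stride = dimension * dimension;   // e.g. 16 numbers per 4x4 matrix
        double[][] m = new double[dimension][dimension];
        for (int row = 0; row < dimension; row++)
            for (int col = 0; col < dimension; col++)
                m[row][col] = values[index * stride + row * dimension + col];
        return m;
    }

    public static void main(String[] args) {
        String[] identities = {"UpperChest", "MidTorso"};
        double[] values = new double[identities.length * 16];
        values[16] = 1.0;   // first element of MidTorso's matrix
        System.out.println(matrixFor(1, 4, values)[0][0]);   // prints 1.0
    }
}
```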


MessageState contains a textual message associated with a source.

Listing 3.4: MessageState methods

public Message getContent();

ContainerState contains a list of states that are embedded inside this message.

Listing 3.5: ContainerState methods

public List<State> getStates();

3.1.4 Serialization and Deserialization

The process of converting Java objects that represent "states" to a stream of bytes (serialization) and the process of converting a stream of bytes back to Java objects (deserialization) is delegated to well-known libraries: Java Architecture for XML Binding (JAXB) and the Jackson JSON Java library. All state classes are annotated using JAXB and Jackson annotations [7]. For example, to deserialize a stream of bytes to Java objects one would use this code:

Listing 3.6: De-serialization of a stream of bytes

JAXBContext context = JAXBContext.newInstance(
    manifold.states.jaxb.ButtonState.class,
    manifold.states.jaxb.ContainerState.class,
    manifold.states.jaxb.MessageState.class,
    manifold.states.jaxb.TransformationState.class,
    manifold.states.jaxb.TransformationContainerState.class
);

ContainerState container = (ContainerState)
    context.createUnmarshaller().unmarshal(inputStream);

The input message in XML format:

Listing 3.7: XML message

<?xml version="1.0" encoding="UTF-8"?>
<ns1:containerState
    xmlns:ns1="http://toha.org.ua/manifold/states"
    ns1:identity="track-0"
    ns1:originated="2010-12-15T15:05:56.810-05:00"
>
  <ns1:transformationContainerState
      ns1:dimension="4"
      ns1:identity="main"
      ns1:originated="2010-12-15T15:03:22.511-05:00"
  >
    <ns1:identities>
      UpperChest MidTorso LowerTorso LeftUpperLeg
      LeftLowerLeg LeftFoot RightUpperLeg RightLowerLeg
      RightFoot Neck Head RightClavicle RightShoulder
      RightUpperArm RightLowerArm RightHand LeftClavicle
      LeftShoulder LeftUpperArm LeftLowerArm LeftHand
    </ns1:identities>
    <ns1:source ns1:identity="client-1"/>
    <ns1:values>
      393 -918 57 87 916 396 63 -67 -80 28 996 1206 0 0 0
      1000 194 -979 63 78 978 189 -86 -65 72 79 994 1056
      0 0 0 1000 46 -998 51 70 998 44 -31 -56 29 52 998
      ...
      0 0 0 1000 -719 -599 354 2 256 245 935 -554
      -647 762 -23 1154 0 0 0 1000 -719 -599 354
      -63 256 245 935 -725 -647 762 -23 1158 0 0 0 1000
    </ns1:values>
  </ns1:transformationContainerState>
</ns1:containerState>

Example of a JSON message:

Listing 3.8: JSON Message

{"transformationContainer": {
    "source": {"Source": {"identity": "client-0"}},
    "values": [
        -0.306, -0.951, -0.029, 0.0,
        0.0, 0.0, 0.0, 0.0,
        0.0, 0.0, 0.0, 0.0,
        0.0, 0.0, 0.0, 0.0
    ],
    "dimension": 4,
    "originated": 1309836222186,
    "identity": "main",
    "identities": ["UpperChest"]
}}

3.1.5 Message Routing

Message routing is built around the "Router" interface. The Router interface contains a list of Consumers and Providers. Each Provider might call a "broadcast" method to send messages. Each Consumer that is routed with a broadcasting Provider receives messages in its "process" method. The default implementation of the Router interface is provided in listing 5.1.

The class diagram of the routing package is shown in figure 3.1.
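A minimal sketch of this routing contract follows. The interface and method names mirror those mentioned in the text (Router, Consumer, broadcast, process), but the State shape, the registration method, and the broadcast-to-all policy are illustrative, not the project's actual implementation:

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

// Sketch of the Router/Consumer contract described above.
public class RouterDemo {
    interface State { String identity(); }

    // A Consumer receives routed messages in its process method.
    interface Consumer { void process(State state); }

    static class Router {
        private final List<Consumer> consumers = new CopyOnWriteArrayList<>();

        void addConsumer(Consumer consumer) { consumers.add(consumer); }

        // Called by a Provider; every routed Consumer sees the state.
        void broadcast(State state) {
            for (Consumer consumer : consumers) consumer.process(state);
        }
    }

    public static void main(String[] args) {
        Router router = new Router();
        router.addConsumer(s -> System.out.println("received: " + s.identity()));
        router.broadcast(() -> "track-0");   // prints "received: track-0"
    }
}
```

The copy-on-write list is one simple way to let providers broadcast while consumers are being added or removed from another thread, which matters in a distributed setting with multiple simultaneous clients.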

3.1.6 Scripting

Creating a 3D application is usually a process that requires knowledge of 3D APIs like OpenGL or Direct3D. 3D game engines significantly simplify this process. Going further, scripting makes the process of creating a 3D application even simpler. Most of the modern 3D game engines support some sort of scripting. Yet the open source, free game engine that was chosen for this project does not support scripting by default. To solve this problem and make the system easier to use, the Java Scripting API was integrated with the engine. That allows the user to use beanshell, browserjs, ejs, freemarker, groovy, jacl, jaskell, java, javascript, jawk, jelly, jep, jexl, jruby, jst, judo, juel, jython, list, ognl, pnuts, scheme, velocity, xpath, and xslt as a scripting language to customize and extend the system.

Scripting is responsible for all interaction in the system. For example, the matrices that are received from the motion capture system are applied using the JavaScript language. That makes it easy to adjust the logic without the need to recompile the Java code, and it can be done without installing any additional software.

An example of the code that applies transformation matrices received from the motion capture system is provided in listing 3.9.

Listing 3.9: Motion data applied to a skeleton

this.update = function(transformations) {
    if (this.control == null) {
        logger.warning("error: MotionModel.control is null.");
        return;
    }
    var bone = this.control.getSkeleton().getBone("Head");
    if (!bone) {
        logger.warning("error: MotionModel does not have bone Head.");
        return;
    }
    bone = this.control.getSkeleton().getBone("root");
    if (bone != null) {
        bone.setBindTransforms(
            this.zeroVector,
            this.zeroQuaternion,
            this.identityVector
        );
    }
    if (transformations != null) {
        this.ready = true;

        var q = new Quaternion();
        var vector1 = new Vector3f(1, 1, 1);
        var vector = new Vector3f();

        for (var i in transformations) {
            var transformation = transformations[i];
            q.fromRotationMatrix(
                transformation.getMatrix()[0][0],
                transformation.getMatrix()[0][1],
                transformation.getMatrix()[0][2],
                transformation.getMatrix()[1][0],
                transformation.getMatrix()[1][1],
                transformation.getMatrix()[1][2],
                transformation.getMatrix()[2][0],
                transformation.getMatrix()[2][1],
                transformation.getMatrix()[2][2]
            );
            bone = this.control.getSkeleton().getBone(transformation.getName());
            if (!bone) {
                logger.warning("error: cannot find bone: [" + transformation.getName() + "]");
                continue;
            }
            bone.setBindTransforms(
                this.zeroVector,
                this.zeroQuaternion,
                this.identityVector
            );
            bone.setUserControl(true);
            vector.set(
                transformation.getMatrix()[0][3],
                transformation.getMatrix()[1][3],
                transformation.getMatrix()[2][3]
            );
            bone.setUserTransforms(vector, q, vector1);
            var collider = this.colliders[('' + transformation.getName())];
            if (collider) {
                collider.physicsNode.setLocalRotation(q);
                collider.physicsNode.setLocalTranslation(
                    new Vector3f(
                        transformation.getMatrix()[0][3],
                        transformation.getMatrix()[1][3],
                        transformation.getMatrix()[2][3]
                    )
                );
            }
            if (('' + transformation.getName()) == 'MidTorso') {
                var matrix = new Matrix4f();
                matrix.setRotationQuaternion(q);
                matrix.setTranslation(
                    new Vector3f(
                        transformation.getMatrix()[0][3] + this.getLocalTranslation().get(0),
                        transformation.getMatrix()[1][3] + this.getLocalTranslation().get(1),
                        transformation.getMatrix()[2][3] + this.getLocalTranslation().get(2)
                    )
                );

                var rotation = new Quaternion();
                // ROTATION
                rotation.fromAngles(Math.PI, Math.PI, 0);

                q.multLocal(rotation);

                matrix.setRotationQuaternion(q);

                matrix = matrix.mult(this.cameraShift);

                if (this.secondView != null) {
                    var secondViewCamera = this.secondView.getCamera();
                    secondViewCamera.setLocation(matrix.toTranslationVector());
                }

                if (this.cameraControlEnabled) {
                    camera.setRotation(matrix.toRotationQuat());
                    camera.setLocation(matrix.toTranslationVector());
                }
            }
        }
    } else
        logger.warning("transformations for [" + this.name + "] is null");
}

3.1.7 Web Based Technologies

Instead of using a standard graphical user interface, the system uses a web-based user interface. This makes it possible to use the system from different computers connected through a TCP/IP network. Additionally, no extra software needs to be installed in order to use the system.

The Java 2 Platform, Enterprise Edition (J2EE) environment was chosen to make the system scalable and portable. J2EE brings several benefits to the overall architecture of the system, such as a flexible security model, scalability, and “freedom of choice in servers, tools, and components” [8].

The core component of the system is the StateRoutingServlet class, which routes state messages between different parts of the system. For example, several motion capture systems send their data to the StateRoutingServlet, where the data is aggregated and prepared for clients that need not know anything about specific motion capture systems. The clients access motion data using logical names, or aliases.
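The aliasing mechanism itself is not listed in this paper. As a minimal sketch (the class name, the aliases, and the source identities below are invented for illustration), resolving a logical name to a concrete capture source might look like this:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: clients request motion data by logical name
// ("dancer1") while the routing layer maps that alias to whichever
// concrete capture source is currently registered under it.
public class AliasResolver {

    private final Map<String, String> aliases = new HashMap<String, String>();

    public synchronized void bind(String alias, String sourceIdentity) {
        aliases.put(alias, sourceIdentity);
    }

    public synchronized String resolve(String alias) {
        String identity = aliases.get(alias);
        if (identity == null)
            throw new IllegalArgumentException("unknown alias: " + alias);
        return identity;
    }

    public static void main(String[] args) {
        AliasResolver resolver = new AliasResolver();
        // Two capture sources register under stable aliases; the
        // identities here are invented for the example.
        resolver.bind("dancer1", "vicon://stage-a");
        resolver.bind("dancer2", "kinect://lab-3");
        System.out.println(resolver.resolve("dancer1"));
    }
}
```

Because clients only ever see the alias, a capture source can be replaced by re-binding the alias, without any client-side changes.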

The user interface that controls the StateRoutingServlet is implemented using a RESTful architecture. The Web client is written in JavaScript, and the server side is exposed as a JAX-RS endpoint. JBoss and Tomcat are used as the enterprise and web containers. The overall deployment diagram is shown in Figure 3.2.
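The JAX-RS resource classes are not reproduced here. Purely as an illustration of the RESTful interaction (the /routes path and the JSON payload are hypothetical, and the JDK's built-in HttpServer stands in for the actual JAX-RS/JBoss/Tomcat deployment), a self-contained sketch:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URL;

// Illustrative only: a GET on a resource URL returns a JSON
// representation of the routing table, in the spirit of the real
// JAX-RS endpoint.
public class RestSketch {
    public static void main(String[] args) throws IOException {
        // Bind to an ephemeral port so the sketch never collides with a real server.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/routes", exchange -> {
            byte[] body = "{\"routes\":[]}".getBytes("UTF-8");
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
        server.start();

        // Act as our own client: fetch the resource and print the payload.
        URL url = new URL("http://127.0.0.1:" + server.getAddress().getPort() + "/routes");
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(url.openStream(), "UTF-8"))) {
            System.out.println(in.readLine());
        }
        server.stop(0);
    }
}
```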


Figure 3.1: Routing Class Diagram

3.2 Challenges

3.2.1 Complexity of the project

One of the challenges for the system was the lack of specific requirements, owing mainly to the research nature of the project. The first version of the system did not support several sources of motion data working simultaneously, data compression, or message routing. All of those features were added later and required significant refactoring of both the design of the system and the code itself.

3.2.2 Protocol performance issues

Supporting slow connections and limiting the required bandwidth were further challenges for the protocol design. Those issues were addressed by using the standard “deflate” compression supported by most HTTP clients and servers. Table 3.1 contains average packet sizes for a 21-bone skeleton.
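The effect of the “deflate” algorithm can be demonstrated directly with java.util.zip. The frame below is synthetic (real packets carry transformations for a 21-bone skeleton in CSV, JSON, or XML), so the byte counts will differ from Table 3.1; the point is the relative saving:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.Deflater;
import java.util.zip.DeflaterOutputStream;

// Compress a synthetic CSV-like motion frame with the same "deflate"
// algorithm the HTTP layer negotiates, and compare sizes.
public class DeflateDemo {
    public static void main(String[] args) throws IOException {
        StringBuilder frame = new StringBuilder();
        for (int bone = 0; bone < 21; bone++) {
            frame.append("Bone").append(bone);
            for (int i = 0; i < 12; i++)            // a 3x4 transformation matrix
                frame.append(',').append(Math.sin(bone + i));
            frame.append('\n');
        }
        byte[] raw = frame.toString().getBytes("UTF-8");

        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        DeflaterOutputStream deflate =
            new DeflaterOutputStream(buffer, new Deflater(Deflater.BEST_SPEED));
        deflate.write(raw);
        deflate.finish();

        System.out.println("raw bytes:      " + raw.length);
        System.out.println("deflated bytes: " + buffer.size());
    }
}
```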


Figure 3.2: Deployment Diagram


                          CSV             JSON             XML
frame type (bytes)    initial  live   initial  live   initial  live

no compression           3258  3254      2458  5764      1619  1863
deflate compression       399  1064       537  2381       435   834

Table 3.1: Packet Sizes

At a frame rate of 24 frames per second, one stream of motion data requires a connection of approximately 60 KB/s.
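As a back-of-the-envelope check, the live packet sizes from Table 3.1 can be multiplied by the frame rate. The figures below are for the uncompressed streams; which particular stream the approximate 60 KB/s figure refers to is not specified:

```java
// Multiply the uncompressed live packet sizes from Table 3.1 by the
// 24 fps frame rate to estimate per-stream bandwidth.
public class BandwidthEstimate {
    public static void main(String[] args) {
        int fps = 24;
        int[] liveBytes = { 3254, 5764, 1863 };     // CSV, JSON, XML (no compression)
        String[] format = { "CSV", "JSON", "XML" };
        for (int i = 0; i < liveBytes.length; i++) {
            System.out.println(format[i] + ": " + (liveBytes[i] * fps) + " bytes/s");
        }
    }
}
```

The deflate-compressed rows of Table 3.1 reduce these rates by roughly a factor of two to three.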

Additional optimization might be achieved by using a binary encoding of XML, such as the one developed by the Efficient XML Interchange working group [9].


Chapter 4

Applications

4.1 History

The idea of a distributed performance is not new. For example, Kit Galloway and Sherrie Rabinowitz worked on a show in 1975-1977 called “The Satellite Arts Project”. The experimenters used satellite telecommunication to transmit video of several dancers, located in different places around the world, who performed simultaneously [10].

In 2001, the Digital Worlds Institute at the University of Florida produced an impressive and technically complex demonstration called “Dancing Beyond Boundaries” at the Global Supercomputing Conference. It allowed animation artists, dancers, musicians, computer scientists, and engineers located in different cities in North and South America to perform together synchronously over the Internet [11].

Another, more recent example of using telecommunication to let dancers perform in a shared virtual environment is a study conducted in 2007 by a team of researchers and artists from the University of Illinois, Urbana, IL; the University of California, Berkeley, CA; and Mills College, Oakland, CA [12]. The study used a custom-built motion capture system and a communication mechanism that allowed several dancers to collaborate in a shared virtual environment.

Another way to allow artists to perform in a shared virtual environment was presented by the Digital Worlds Institute in 2007 during “INGENUITY 2007” [13]. The artists used several video streams and a network-based, metronome-like technology called “NetroNome” to synchronize the performance.

This project aims to extend the possibilities of a shared virtual environment used for real-time performance by providing an open framework. Examples of applications of the framework developed as a result of this project are provided below.


4.2 International Symposium on Mixed and Augmented Reality (ISMAR), Orlando, USA - 2009

The first public demonstration of the very first version of the system took place during the International Symposium on Mixed and Augmented Reality in 2009 in Orlando, USA. The demo included a 3D scene with a room and a door. A 3D avatar located inside the room was controlled in real time by a person at the University of Florida. The 3D avatar was able to “open” the door by slowly approaching it with the left hand. The system used a primitive collision detection algorithm to trigger the event that opened the door. The demo and the system were still primitive at this point, yet they exposed the first challenges the system was facing: the collision detection system needed to be improved, and the protocol needed to be redesigned so that sources and clients could reconnect without breaking the communication.

4.3 Emerging Communications Conference and Awards (eComm), San Francisco, California, USA - 2010

In 2010, at the Emerging Communications Conference and Awards in San Francisco, USA, the system included a new module that allowed collision events to be delivered to external applications using the Open Sound Control (OSC) protocol.

4.4 SIGGRAPH, Los Angeles, California, USA - 2010

At SIGGRAPH 2010, the system started to support a first-person view. VGA goggles were used to display the 3D environment in real time to the person inside the motion capture system.

4.5 The International Digital Media Arts Association Conference, Vancouver, Canada - 2010

During the International Digital Media Arts Association Conference in Vancouver in 2010, the system worked simultaneously with four motion data sources distributed around the world: Vancouver, Canada; Tokyo, Japan; New York, New York; and Gainesville, Florida.

The system tracked collisions between the avatars and communicated these events to Pure Data (PD), which generated sound effects using


a 5.1-channel audio system. OSC was used as the protocol to communicate with PD.
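The exact OSC address space used in the performance is not documented here. Assuming a hypothetical /collision message that carries a bone name and an impulse value, the OSC 1.0 wire format (null-terminated strings padded to 4-byte boundaries, big-endian numeric arguments) can be packed with the standard library alone:

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

// Hedged sketch: the "/collision" address and its arguments are
// invented for illustration; only the OSC 1.0 encoding rules are real.
public class OscEncoder {

    // OSC strings are ASCII, null-terminated, padded to a 4-byte boundary.
    static void writePaddedString(DataOutputStream out, String s) throws IOException {
        out.writeBytes(s);
        int padded = (s.length() / 4 + 1) * 4;   // at least one null terminator
        for (int i = s.length(); i < padded; i++)
            out.writeByte(0);
    }

    public static byte[] collisionMessage(String boneName, float impulse)
            throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bytes);
        writePaddedString(out, "/collision");    // address pattern
        writePaddedString(out, ",sf");           // type tags: string, float32
        writePaddedString(out, boneName);
        out.writeFloat(impulse);                 // big-endian IEEE 754
        return bytes.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] packet = collisionMessage("LeftHand", 0.75f);
        System.out.println("packet length: " + packet.length);
    }
}
```

Such a packet would normally be sent to PD in a single UDP datagram.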

4.6 Digital Arts Festival, Redmond, Washington, USA - 2011


Figure 4.1: Emerging Communications Conference And Awards, San Francisco, California, USA - 2010, April 21


Figure 4.2: SIGGRAPH, Los Angeles, California, USA - 2010


Figure 4.3: The International Digital Media Arts Association Conference, Rehearsal, Vancouver, Canada - 2010


Figure 4.4: The International Digital Media Arts Association Conference, Rehearsal, Vancouver, Canada - 2010


Figure 4.5: The International Digital Media Arts Association Conference, Show, Vancouver, Canada - 2010


Figure 4.6: Digital Arts Festival, Redmond, Washington, USA - 2011


Chapter 5

Acknowledgments

I would like to thank Dr. James Oliverio for all of his help and support. I would like to thank Dr. James Oliverio, Angelos Barmpoutis, and Ben DeVane for serving on the committee and providing feedback.

Also, I would like to thank Arturo Sinclair for the collaboration on this project, Patrick Pagano for the collaboration and encouragement, Angelos Barmpoutis for advising me to use *TeX for writing this paper, Joseph Murphy for the invaluable advice on choosing the right software packages and for the feedback, Garrett Strobel for reviewing my paper, and the Digital Worlds Institute for the resources and friendly environment.


List of Tables

3.1 Packet Sizes . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20


List of Figures

3.1 Routing Class Diagram . . . . . . . . . . . . . . . . . . . . . 18
3.2 Deployment Diagram . . . . . . . . . . . . . . . . . . . . . . 19
4.1 Emerging Communications Conference And Awards, San Francisco, California, USA - 2010, April 21 . . . . . . . . . . . . . 24
4.2 SIGGRAPH, Los Angeles, California, USA - 2010 . . . . . . . 25
4.3 The International Digital Media Arts Association Conference, Rehearsal, Vancouver, Canada - 2010 . . . . . . . . . . . . . . 26
4.4 The International Digital Media Arts Association Conference, Rehearsal, Vancouver, Canada - 2010 . . . . . . . . . . . . . . 27
4.5 The International Digital Media Arts Association Conference, Show, Vancouver, Canada - 2010 . . . . . . . . . . . . . . . . 28
4.6 Digital Arts Festival, Redmond, Washington, USA - 2011 . . . 29


Listings

3.1 ButtonState methods . . . . . . . . . . . . . . . . . . . . . . 12
3.2 TransformationState methods . . . . . . . . . . . . . . . . . . 12
3.3 TransformationContainerState methods . . . . . . . . . . . . . 12
3.4 MessageState methods . . . . . . . . . . . . . . . . . . . . . . 13
3.5 ContainerState methods . . . . . . . . . . . . . . . . . . . . . 13
3.6 De-serialization of a stream of bytes . . . . . . . . . . . . . . 13
3.7 XML message . . . . . . . . . . . . . . . . . . . . . . . . . . 13
3.8 JSON Message . . . . . . . . . . . . . . . . . . . . . . . . . . 14
3.9 Motion data applied to a skeleton . . . . . . . . . . . . . . . . 15
5.1 Router implementation . . . . . . . . . . . . . . . . . . . . . . 36


Bibliography

[1] T. Mitra and T. Chiueh. Three-dimensional computer graphics architecture. Current Science, 78(7):838, April 2000.

[2] M. Valient. Introduction to 3D game engines. April 2001.

[3] The Complete MIDI 1.0 Detailed Specification. http://www.midi.org/techspecs/midispec.php, 1995-2011.

[4] D. Phillips. An introduction to OSC. http://www.linuxjournal.com/content/introduction-osc, November 2008.

[5] MotionBuilder - real-time 3D character animation software - Autodesk. www.autodesk.com/motionbuilder, 2011.

[6] Introducing JSON. http://www.json.org/.

[7] JSR-000222 Java(TM) Architecture for XML Binding (JAXB). http://jcp.org/aboutJava/communityprocess/mrel/jsr222/index.html, December 2009.

[8] Designing Enterprise Applications with J2EE Platform, page 10.Addison-Wesley, second edition, 2002.

[9] Efficient XML Interchange Working Group. http://www.w3.org/XML/EXI/, April 2011.

[10] Rhizome, Aesthetic Research in Telecommunications. http://rhizome.org/editorial/2007/jun/11/aesthetic-research-in-telecommunications/, 2011.

[11] Dancing Beyond Boundaries. http://www.digitalworlds.ufl.edu/research/DBB/default.asp, November 2001.

[12] New digital options in geographically distributed dance collaborations with TEEVE: Tele-immersive environments for everybody. 2007.


[13] In Common Time: Maximum Impact. http://www.digitalworlds.ufl.edu/research/maximumimpact/default.asp, July 2007.


Listing 5.1: Router implementation

public class DefaultRouter implements Router {

    protected DefaultRouter() {
    }

    public static Router newInstance() {
        return new DefaultRouter();
    }


    private final List<Consumer> consumers = new ArrayList<Consumer>();

    public void registerConsumer(Consumer consumer) {
        synchronized (providers) {
            consumers.add(consumer);
        }
    }

    public void unregisterConsumer(Consumer consumer) {
        synchronized (providers) {
            if (!consumers.contains(consumer))
                throw new IllegalArgumentException(
                    "consumer is not registered"
                );

            List<Provider> registeredRoutes = new ArrayList<Provider>();

            for (ProviderState providerState : providerStates.values()) {
                for (Consumer c : providerState.getProcessors().keySet()) {
                    if (c == consumer)
                        registeredRoutes.add(providerState.getProvider());
                }
            }

            for (Provider provider : registeredRoutes) {
                removeRoute(provider, consumer);
            }

            consumers.remove(consumer);

            logger.info("consumer [" + consumer + "] has been unregistered");
        }
    }


    public List<Consumer> getAllConsumers() {
        List<Consumer> result = new ArrayList<Consumer>();

        synchronized (providers) {
            result.addAll(consumers);
        }

        return result;
    }


    private final Map<String, Provider> providers = new HashMap<String, Provider>();

    public void registerProvider(Provider provider) {

        synchronized (providers) {
            if (providers.containsKey(provider.getIdentity()))
                throw new IllegalArgumentException("provider is already registered");
            providers.put(provider.getIdentity(), provider);
        }

        logger.info("provider [" + provider + "] has been registered");
    }

    public void unregisterProvider(Provider provider) {

        synchronized (providers) {

            if (!providers.containsKey(provider.getIdentity()))
                throw new IllegalArgumentException("provider is not registered");

            ProviderState providerState = providerStates.get(provider);

            if (providerState != null) {
                for (Processor processor : providerState.getProcessors().values())
                    provider.removeProcessor(processor);
            }

            providers.remove(provider.getIdentity());
            providerStates.remove(provider);
        }

        logger.info("provider [" + provider + "] has been unregistered");
    }


    public List<Provider> getAllProviders() {
        List<Provider> result = new ArrayList<Provider>();

        synchronized (providers) {
            for (Provider provider : providers.values())
                result.add(provider);
        }

        return result;
    }

    public Provider getProviderByIdentity(String identity) {

        synchronized (providers) {
            return providers.get(identity);
        }

    }


    private Map<Provider, ProviderState> providerStates =
        new HashMap<Provider, ProviderState>();


    public void addRoute(Provider provider, final Consumer consumer) {

        if (provider == null)
            throw new IllegalArgumentException("cannot create route with null provider");

        if (consumer == null)
            throw new IllegalArgumentException("cannot create route with null consumer");

        synchronized (providers) {

            ProviderState providerState = providerStates.get(provider);

            if (providerState == null) {
                providerState = new ProviderState(provider);
                providerStates.put(provider, providerState);
            }

            if (providerState.getProcessors().containsKey(consumer))
                throw new IllegalArgumentException("route already exists");

            Processor processor = new Processor() {
                public void process(State[] states, Provider provider) {
                    consumer.process(states, provider);
                }
            };

            providerState.getProcessors().put(consumer, processor);

            provider.addProcessor(processor);
        }

        logger.info("route [" + provider + " -> " + consumer + "] has been added.");
    }

    public void removeRoute(Provider provider, Consumer consumer) {

        synchronized (providers) {

            ProviderState providerState = providerStates.get(provider);

            if (providerState == null)
                throw new IllegalArgumentException("route does not exist (1)");

            Processor processor = providerState.getProcessors().get(consumer);

            if (processor == null)
                throw new IllegalArgumentException("route does not exist (2)");

            provider.removeProcessor(processor);

            providerState.getProcessors().remove(consumer);

        }

        logger.info("route [" + provider + " -> " + consumer + "] has been removed.");
    }

    public Map<Provider, List<Consumer>> getRoutes() {
        Map<Provider, List<Consumer>> result = new HashMap<Provider, List<Consumer>>();

        synchronized (providers) {
            for (Map.Entry<Provider, ProviderState> entry : providerStates.entrySet()) {
                List<Consumer> consumers = new ArrayList<Consumer>();
                consumers.addAll(entry.getValue().getProcessors().keySet());
                result.put(entry.getKey(), consumers);
            }
        }

        return result;
    }

    protected class ProviderState {

        private final Provider provider;

        public ProviderState(Provider provider) {
            this.provider = provider;
        }

        public Provider getProvider() {
            return provider;
        }


        private final Map<Consumer, Processor> processors =
            new HashMap<Consumer, Processor>();

        public Map<Consumer, Processor> getProcessors() {
            return processors;
        }
    }

}

