
Sensyble Workshop 2010 manuscript No. (will be inserted by the editor)

TUIO AS3: A Multi-Touch and Tangible User Interface Rapid Prototyping Toolkit for Tabletop Interaction

Johannes Luderschmidt 1, Immanuel Bauer 2, Nadia Haubner 1, Simon Lehmann 1, Ralf Dörner 1, Ulrich Schwanecke 1

1 RheinMain University of Applied Sciences, Wiesbaden Rüsselsheim Geisenheim, Department of Design, Computer Science and Media, Kurt-Schumacher-Ring 18, 65197 Wiesbaden, Germany

2 Vienna University of Technology, Institute for Design & Assessment of Technology, Human Computer Interaction Group, Favoritenstraße 9-11/3, 1040 Wien, Austria

Received: date / Revised version: date

Abstract Multi-touch and tangible input paradigms provide new opportunities for post-WIMP (Windows, Icons, Menus, Pointer) user interfaces. The development of such novel interfaces challenges designers and software developers alike. TUIO AS3 addresses these challenges by providing a toolkit for the rapid development of multi-touch and tangible user interface (TUI) interaction. The TUIO AS3 toolkit comprises two basic functionalities. Firstly, it offers a sophisticated multi-touch and TUI interaction API. Secondly, to foster the development process, TUIO AS3 can simulate complex multi-touch and TUI interaction via mouse and keyboard. In terms of the interaction API, TUIO AS3 provides a wrapper for the network-based open source protocol TUIO in Adobe Flash's programming language ActionScript 3 (AS3). TUIO AS3 allows user interface (UI) elements developed in AS3 to be enhanced with interactivity for gestural and tangible interaction. The interaction APIs support two kinds of interaction: on the one hand, multi-finger/multi-hand controls for dragging, rotation and scaling can be added to any UI element to enhance it with standard multi-touch functionality; on the other hand, complex gestures can be defined with a simple grammar and tracked within TUIO AS3. In terms of simulation, TUIO AS3 offers means to simulate multi-touch and TUI interaction without the need for an additional simulator application. Aspects like multi-finger flicking, multi-finger rotation and complex gestures are supported via keyboard and mouse interaction. Tangibles can be added, manipulated and removed. TUIO AS3 has been used and matured in several projects.

1 Introduction

Tabletop computing UIs that employ multi-touch and tangible user interaction differ from WIMP interfaces in several aspects.

Firstly, the interaction with the interface can be performed with multiple fingers and with real-life objects called tangibles in a TUI instead of using a mouse pointer. Secondly, by using multiple fingers, a user can perform gestures and multiple users can interact with the UI simultaneously. Thirdly, as multiple users can be dispersed around the tabletop, the UI should allow content to be oriented towards users.

These interaction aspects have been considered widely in the literature (for instance in [14] and [11]), and various multi-touch interaction software frameworks support them (for instance [3] and [4]). These frameworks follow a rapid prototyping approach that allows design prototypes to be created quickly. Such design prototypes can then be evaluated in user tests rather than having to interpret designs based on descriptions. This is a promising approach for the creation of tabletop interfaces, where complex aspects like multi-touch gestural interaction or the collaboration of multiple users must be evaluated and seem difficult to test on a purely conceptual basis. Additionally, as tabletop interfaces are not as common as WIMP interfaces, fewer best-practice approaches for interface components and user interaction exist. Thus, for the design of interaction and UIs for tabletop interfaces, the development of prototypes is crucial.

However, the rapid prototyping tabletop frameworks introduced in section 2 suffer from several shortcomings. Firstly, there is no combined framework that enables prototyping of hybrid multi-touch and tangible user interfaces. Secondly, multi-touch gesture interaction often makes use of 'standard' gestures like the scale, pinch or rotate gesture to enlarge, shrink or rotate objects. Although a user should be allowed to use as many fingers or even hands as desired to perform such gestures, most frameworks constrain users to performing these gestures with two fingers only. Hence, a more general approach to these standard gestures is necessary.


Additionally, the design and use of complex gestures for special purposes, like a one-finger-down-one-finger-move gesture, should be made easier for developers. Thirdly, the actual software development process for tabletop interfaces is not considered entirely: for instance, debugging interaction code is difficult on actual tabletop setups, as debuggers are not designed to work with them. Thus, the development of tabletop interfaces is usually carried out with specialized tools on a standard desktop computer [8]. Therefore, it is necessary to provide means for a developer to simulate multi-touch and tangible interaction at a mouse and keyboard workplace. Section 2 introduces simulator applications and in-application simulation for multi-touch and tangible interaction via mouse and keyboard that are integrated into existing frameworks. However, these tools support neither TUI interaction nor the simultaneous manipulation of multiple fingers in different directions, which is necessary to test whole-hand and other complex gestures like a rotation performed with multiple fingers.

Adobe Flash [1] is a platform for animated multimedia applications for the Internet (via the browser-based Flash Player) and for desktop computing (via Adobe AIR). Flash offers cross-platform support, as runtime environments are available for free for the major platforms Windows, Mac OS and Linux. Flash development in ActionScript targets two communities: on the one hand, ActionScript 3 provides software developers with an object-oriented programming model similar to programming languages like Java; on the other hand, Adobe provides designers with a Flash Creative Suite application that offers WYSIWYG authoring functionality as a graphical means of creating Flash content. Tabletop UIs are typically graphically rich interfaces that are usually created by designers and/or software developers. Thus, the facts that Flash is a platform for animated multimedia applications and that its tools are tailored to developers' and designers' needs alike make it seem an ideal fit for the creation of tabletop UIs. However, an interaction toolkit that makes multi-touch and tangible interactivity available in Flash applications is necessary. Such a framework should support the work of designers as well as developers.

In section 2 we introduce related work. Section 3 explains the multi-touch and tangible interaction rapid prototyping toolkit TUIO AS3. Section 4 presents example projects created with TUIO AS3 and discusses the results. Finally, in section 5 a conclusion is given.

2 Related Work

There are several commercial and open source rapid prototyping and application frameworks that support the creation of multi-touch and tangible user interfaces.

reacTIVision provides a tangible tabletop toolkit [6]. It comprises the actual reacTIVision application, which optically tracks (multi-)touch input and different kinds of fiducial markers attached to tangibles and sends this data to applications via the TUIO protocol [7].

However, reacTIVision does not address the creation of actual UIs. Rather, it provides the necessary technical foundations to build a tabletop setup that tracks tangibles and touch input. Additionally, there is a reacTIVision TUIO simulator application that allows touch and fiducial interaction to be simulated via mouse and keyboard. However, simulated tabletop interaction is not carried out directly in the application under test but in the external simulator. Thus, simulation cannot be performed in a spatially aware manner, as a user can only guess where a touch or tangible interaction in the simulator will end up in the actual application.

PyMT is an open source, cross-platform multi-touch UI toolkit [4] written in Python. PyMT enables software developers to quickly create application prototypes based on multi-touch widgets. These widgets provide access to detailed touch input data and are used to recognize several two-finger gestures like drag, scale or rotate. Touch simulation can be carried out in the actual application. However, PyMT does not yet support TUIs, and real multi-touch interaction can only be simulated by using two mice (and then only two fingers can be moved simultaneously).

MT4J (Multi-Touch for Java) resembles PyMT in many ways: it offers an open source, cross-platform framework for the rapid development of multi-touch applications [3]. MT4J also provides UI widgets that offer access to detailed touch input data. The main difference to PyMT is that MT4J is written in Java.

TISCH (Tangible Interactive Surfaces for Collaboration between Humans) is a cross-platform, cross-device multi-touch development framework developed by Florian Echtler [2]. Similar to PyMT and MT4J, it offers multi-touch widgets, which in the case of TISCH are based on OpenGL. Additionally, TISCH provides a reconfigurable, hardware-independent gesture recognition engine and support for widely used (for instance move, scale, rotate) as well as custom gestures. Applications for TISCH are developed in C++; however, there are bindings for C#, Java and Python. TISCH has drawbacks similar to those of PyMT and MT4J.

Although Adobe itself offers no TUI support for Flash so far, there is an Adobe AIR multi-touch API that provides access to the Windows 7 multi-touch capabilities [9]. The AIR multi-touch API provides a low-level touch event model and a gesture event model. This API only works on Windows 7 and only with multi-touch hardware that supports the Windows 7 touch capabilities. The gesture event model supports only gestures provided by Windows 7. Thus, if gestural interaction is to be used with the AIR multi-touch API, either custom gestures have to be developed on top of the low-level API, or the high-level gestures of the multi-touch API can be used, which, however, will not work with all Windows 7 multi-touch hardware.


GestureWorks [5] is a cross-platform, commercial multi-touch framework for Adobe Flash and Flex. It features a gesture library with several built-in gestures. Gestures can be used in combination with other Flash and Flex UI widgets. In addition to code-based development, GestureWorks supports the creation of multi-touch applications by designers in the Flash Creative Suite application. Thus, GestureWorks also targets the creation of multi-touch applications by designers. The simulation tools for GestureWorks are similar to those of PyMT, MT4J and TISCH; therefore, GestureWorks does not support the simulation of whole-hand gestures. Also, GestureWorks does not support the creation of TUIs.

3 TUIO AS3

TUIO AS3 has been developed for several reasons. Most importantly, no toolkit could be found that supports rapid prototyping of multi-touch in combination with tangible user interfaces. TUIO AS3 is based on the Adobe Flash programming language ActionScript 3 (AS3). Accordingly, TUIO AS3 has two target communities: software developers and designers. TUIO AS3 focuses on interaction rather than on providing interaction widgets, as rich media user interface widgets like image and video panels are already available for the Flash platform and should be reusable with TUIO AS3. Hence, Flash and TUIO AS3 can be used in combination to create multimedia applications with multi-touch and TUI interactivity.

As the name states, the communication with the hardware is based on TUIO [7], which is usually supported by open source tracking applications like Touchlib [10]. Additionally, the Windows 7 multi-touch capabilities [9] can be used instead of TUIO.

TUIO AS3 provides a low-level event model and callback system for multi-touch and tangible interaction. Beyond that, TUIO AS3 offers high-level multi-touch and tangible interaction means in order to rapidly develop prototypes. In terms of multi-touch interaction, TUIO AS3 supports standard interaction like dragging, rotation, scaling and flicking as well as complex, customized gesture interaction. So that interaction can be performed with more than two fingers, TUIO AS3 allows whole-hand interaction to be used for standard interaction. For tangible interaction, callbacks can be registered for different tangibles in order to react when a certain tangible is put on the tabletop.

For development purposes, TUIO AS3 supports a sophisticated simulation of multi-touch and tangible input with a mouse.

3.1 TUIO AS3 Architecture

As can be seen in figure 1, TUIO AS3 is based on multiple tiers.

Fig. 1 TUIO AS3 architecture

Interaction on a tabletop system is recognized by the system's tracker and sent to the TUIO AS3-based application via TUIO. With TUIO AS3, TUIO connections can be physically established via three protocols: UDP, TCP or the Flash-specific exchange format LocalConnection (LC). TUIO messages are based on OSC [12]. Hence, a TUIO connection is set up via OSC and the appropriate network protocol adapter to the tracker. TUIO messages sent via this connection are handled by the OSC manager and forwarded to the TUIO tier, which feeds these messages to the event management tier. Depending on whether the message is a touch or a fiducial message, an appropriate TouchEvent or FiducialEvent is created and dispatched to the actual application or to the high-level gesture tier. Instead of using a TUIO connection, TUIO messages can be simulated via the mouse with the MouseSimulation module or via the Windows 7 multi-touch capabilities with the Windows7Touch module. In TUIO AS3, Windows7Touch is implemented as a TUIO message provider, which is transparent to the developer. Thus, an application based on TUIO AS3 can be used with any TUIO message provider, with any Windows 7 system with touch hardware, and with a mouse. However, Windows 7 does not support TUI interaction, mouse simulation cannot provide the directness of touch interaction, and real multi-touch interaction with a mouse is not possible.
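The provider idea can be pictured with the following conceptual ActionScript 3 sketch; the interface name and method are invented for illustration and are not TUIO AS3's own classes. The point is only that the event management tier depends on a single provider abstraction, so a network connection, the MouseSimulation module or the Windows7Touch module can be swapped in without touching application code.

// Conceptual sketch only -- invented names, not part of the TUIO AS3 API.
// Whatever delivers TUIO data (network connection, mouse simulation,
// Windows7Touch) would satisfy one common provider contract.
package {
    public interface ITuioMessageProvider {
        // Starts the provider; decoded TUIO/OSC bundles are pushed to the
        // handler, which the event management tier turns into TouchEvents
        // and FiducialEvents.
        function start(onBundle:Function):void;
    }
}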

From a development point of view, working with multi-touch differs significantly from working with mouse events, as there can be multiple touches simultaneously. Hence, a multi-touch application cannot simply listen for some kind of touch down event and afterwards for a touch move event, as both events may belong to different touches. If a touch down event is performed on a user interface element, the element must remember the touch event id in order to figure out whether subsequent touch events like move or up belong to the same touch.


Fig. 2 TUIO AS3 callback system

To simplify touch event handling, a UI element can register a callback for a touch id (displayed as TouchCallbacks in figure 1 and illustrated in more detail in the upper half of figure 2) after the initial touch down has happened. Every time the touch is moved or eventually removed, the appropriate callback function is notified. It will be shown later in this paper that this simplification does not go far enough in terms of multi-touch interaction, as the handling of multiple touches can be a tedious task for developers.
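The following sketch illustrates the bookkeeping that the callback system spares the developer. It uses Flash's built-in flash.events.TouchEvent and its touchPointID purely for illustration; TUIO AS3 dispatches its own TouchEvent type, but the id-tracking problem is the same.

// Illustration of manual touch-id tracking with Flash's native touch events.
// Without a per-id callback, the element itself must remember which touch
// started the interaction and filter all later events against that id.
package {
    import flash.display.Sprite;
    import flash.events.TouchEvent;
    import flash.ui.Multitouch;
    import flash.ui.MultitouchInputMode;

    public class DraggableBox extends Sprite {
        private var activeTouchId:int = -1; // id remembered from touch down

        public function DraggableBox() {
            Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;
            addEventListener(TouchEvent.TOUCH_BEGIN, onDown);
            addEventListener(TouchEvent.TOUCH_MOVE, onMove);
            addEventListener(TouchEvent.TOUCH_END, onUp);
        }

        private function onDown(e:TouchEvent):void {
            if (activeTouchId == -1) activeTouchId = e.touchPointID;
        }

        private function onMove(e:TouchEvent):void {
            if (e.touchPointID != activeTouchId) return; // another finger
            x = e.stageX;
            y = e.stageY;
        }

        private function onUp(e:TouchEvent):void {
            if (e.touchPointID == activeTouchId) activeTouchId = -1;
        }
    }
}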

For tangible interaction, a similar approach is possible: a UI element can listen for a fiducial event and react to it. Additionally, TUIO AS3 supports a global callback system in which callback handlers can be registered for every type of tangible (displayed as FiducialCallbacks in figure 1 and illustrated in more detail in the lower half of figure 2). As soon as a certain tangible is placed on, moved, rotated or removed from the table, the appropriate callback function is called.

3.2 Multi-Touch Interaction

Complementary to the event and callback system, TUIO AS3 provides a high-level interaction API that allows user interface elements to be enhanced with standard multi-touch interactivity like dragging, rotation and scaling with just one line of code. Additionally, it is possible to define and track custom gestures like two-finger taps or one-finger-down-one-finger-move gestures. These gestures can be defined with a simple gesture grammar. Both kinds of interactivity are introduced in the following.

3.2.1 Standard Gesture Interactivity Drag, scale and rotate gestures are used frequently for gestural interaction with multi-touch user interfaces. TUIO AS3 provides simple means for developers and designers to enhance user interface elements with these interaction metaphors.

Usually, in multi-touch gesture frameworks a drag gesture is performed by putting exactly one finger on top of a UI element and moving the finger around. Scale and rotate gestures are usually implemented as two-finger gestures.

Fig. 3 Standard Gesture Interactivity

For a rotate gesture, two fingers on top of a UI element are turned clockwise or counter-clockwise, and for a scale gesture two fingers are moved towards each other (for shrinking) or away from each other (for enlargement). However, this kind of implementation does not support interaction that is semantically related: for instance, a user could use whole-hand gestures and put all fingers of the left and right hand on top of an image to move the image around or to modify the image's scale. To make standard interaction as flexible as possible, TUIO AS3 makes use of the barycenter of all touches that prevail on a UI element. Flash is frame-based, and the barycenter $c_j$ of all touches in frame $j$ is calculated as follows, where $t_i^j \in \mathbb{R}^2$ are the $n_j$ touches in frame $j$:

$$c_j = \frac{1}{n_j} \sum_{i=1}^{n_j} t_i^j$$

Dragging (or translation) $v_j$ in frame $j$ can be calculated as the displacement of $c_j$ between the successive frames $j-1$ and $j$:

$$v_j = c_j - c_{j-1}$$

Scaling $s_j \in \mathbb{R}$ in frame $j$ is calculated relatively between the successive frames $j-1$ and $j$ from the average length $\bar{s}_j$ of the vectors $u_i^j = t_i^j - c_j$:

$$s_j = \frac{\bar{s}_j}{\bar{s}_{j-1}}, \qquad \text{where} \qquad \bar{s}_j = \frac{1}{n_j} \sum_{i=1}^{n_j} \lVert u_i^j \rVert$$


Accordingly, the rotation $\alpha_j \in [0, 2\pi)$ can be identified by calculating the difference of the angle between $(1, 0)^T$ and the vector $m_j = (x_j, y_j)^T$ in two successive frames $j-1$ and $j$:

$$\alpha_j = \bar{\alpha}_j - \bar{\alpha}_{j-1}, \qquad \text{where} \qquad \bar{\alpha}_j = \arccos\!\left(\frac{x_j}{\lVert m_j \rVert}\right) \qquad \text{and} \qquad m_j = \frac{1}{n_j} \sum_{i=2}^{n_j} \left(t_i^j - t_1^j\right)$$
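To make the per-frame computation concrete, the following is a small illustrative sketch in ActionScript 3 (not code from the TUIO AS3 library) that derives the drag vector, scale factor and rotation angle of the formulas above from the touch positions of two successive frames. The signed Math.atan2 form is used in place of the arccos expression.

// Illustrative sketch, not TUIO AS3 source: per-frame gesture parameters
// from the touch positions of two successive frames.
import flash.geom.Point;

// barycenter c_j of the touches in one frame
function barycenter(touches:Vector.<Point>):Point {
    var c:Point = new Point();
    for each (var t:Point in touches) {
        c.x += t.x / touches.length;
        c.y += t.y / touches.length;
    }
    return c;
}

// average distance of the touches to the barycenter
function averageSpread(touches:Vector.<Point>, c:Point):Number {
    var sum:Number = 0;
    for each (var t:Point in touches) {
        sum += Point.distance(t, c); // ||t_i - c||
    }
    return sum / touches.length;
}

// angle of m = (1/n) * sum_{i>=2} (t_i - t_1) against the x-axis
function meanAngle(touches:Vector.<Point>):Number {
    var m:Point = new Point();
    for (var i:int = 1; i < touches.length; i++) {
        m.x += (touches[i].x - touches[0].x) / touches.length;
        m.y += (touches[i].y - touches[0].y) / touches.length;
    }
    return Math.atan2(m.y, m.x); // signed variant of the arccos form
}

// drag v_j, scale s_j and rotation alpha_j between frames j-1 and j
function gestureDelta(prev:Vector.<Point>, curr:Vector.<Point>):Object {
    var cPrev:Point = barycenter(prev);
    var cCurr:Point = barycenter(curr);
    return {
        drag:     new Point(cCurr.x - cPrev.x, cCurr.y - cPrev.y),
        scale:    averageSpread(curr, cCurr) / averageSpread(prev, cPrev),
        rotation: meanAngle(curr) - meanAngle(prev)
    };
}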

Figure 3 exemplifies the flexibility of the standard interaction model. The first example illustrates two touches r1 and r2 that perform a standard two-finger gesture with the barycenter R between them. If both fingers are placed on top of a UI element, dragging can be performed by moving both fingers at once in the same direction. Scaling and rotation can be carried out with the usual two-finger gestures. The second example shows whole-hand interaction: by placing a whole hand on a UI element, the element can be dragged by moving the hand, scaled by spreading the fingers and rotated by turning the hand. The third example demonstrates two-handed interaction: in addition to the barycenter between the hands, the barycenters of the single hands have been drawn for illustration purposes. Similar to two-finger interaction, dragging can be performed by moving both hands at once, scaling by moving both hands away from or towards each other, and rotation by moving both hands in a circular fashion. However, it is also conceivable that a user performs a rotation gesture using the index and middle fingers of both hands. It does not matter how many fingers or hands are used as long as they perform the basic gesture.

From a development point of view, standard gesture interactivity can be used with every UI element that uses the TouchControl provided by TUIO AS3. A TouchControl can be added to a UI element with one line of code:

var touchControl:TouchControl = new TouchControl(this);

If TUIO AS3 were based on a purely event-driven model (for instance, if drag, scale and rotate events were globally dispatched on UI elements), TUIO AS3 would need to calculate every potential kind of gestural interaction for every combination of touches in the application for every UI element. Hence, TouchControl administers a touch list for each UI element that uses it, which circumvents the need to globally calculate potential interaction; only local interaction is detected.
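For instance, inside a Sprite-based UI element the call from above could look as follows. This is a minimal usage sketch: only the new TouchControl(this) call is taken from the paper, while the import path and the class name TouchableImage are assumptions.

// Minimal usage sketch; the package path of TouchControl is assumed.
package {
    import flash.display.Sprite;
    import org.tuio.TouchControl; // import path assumed

    public class TouchableImage extends Sprite {
        private var control:TouchControl;

        public function TouchableImage() {
            // enables multi-finger/whole-hand drag, rotate and scale
            // on this element, as described above
            control = new TouchControl(this);
        }
    }
}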

Fig. 4 Example for Complex Custom Gesture Interactivity [13]

3.2.2 Complex Custom Gesture Interactivity TUIO AS3 provides a GestureManager on which gestures that should be tracked globally can be registered. Figure 4 exemplifies a few gestures that can be custom built: number 1 shows a one-still-one-move gesture, number 2 a two-finger tap and number 3 a two-finger swipe gesture.

A simple grammar allows custom gestures to be defined: all that needs to be done is to declare the order of events, for instance for a three-finger move gesture:

1. Event: TOUCH MOVE, containerAlias: 'A'
2. Event: TOUCH MOVE, containerAlias: 'B'
3. Event: TOUCH MOVE, containerAlias: 'C'
4. Event: TOUCH UP, containerAlias: 'A', die: true
5. Event: TOUCH UP, containerAlias: 'B', die: true
6. Event: TOUCH UP, containerAlias: 'C', die: true

The first three steps are mandatory: three different touches must be placed on the application. If any of the steps 4 to 6 occurs, the gesture ends. One possible encoding of such a definition is sketched below.
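As an illustration, the listing above could be encoded as plain data. The following sketch is only a possible representation of the grammar, not the GestureManager API itself, whose exact signatures are not given in the paper.

// Illustrative encoding of the gesture grammar as plain ActionScript data
// (not the TUIO AS3 GestureManager API): each step names an event type,
// a touch alias, and whether the step terminates the gesture (die).
var threeFingerMove:Array = [
    { event: "TOUCH_MOVE", containerAlias: "A" },
    { event: "TOUCH_MOVE", containerAlias: "B" },
    { event: "TOUCH_MOVE", containerAlias: "C" },
    { event: "TOUCH_UP",   containerAlias: "A", die: true },
    { event: "TOUCH_UP",   containerAlias: "B", die: true },
    { event: "TOUCH_UP",   containerAlias: "C", die: true }
];
// A definition like this would then be registered with the GestureManager
// so that the gesture is tracked globally and a handler is called once the
// three moving touches are matched.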

To ease the task of creating custom gestures, abstract base gesture classes are provided on which new gestures can be based: for instance, a two-finger move gesture is provided on which a two-finger rotate or scale gesture could be based.

3.2.3 Naive Physics Naive physics properties can be added to a UI element by adding a FlickControl to it, which uses momentum as a physical behavior. This means that an element keeps floating after it has been released according to its momentum, which depends on its velocity at the time of release. Thus, a UI element can be flicked across the surface.
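The momentum behavior can be pictured with the following conceptual sketch (not the FlickControl source): the element keeps the velocity it had at release and is damped every frame until it comes to rest; the friction constant is an assumed value.

// Conceptual momentum sketch, not the FlickControl implementation.
package {
    import flash.display.Sprite;
    import flash.events.Event;

    public class FlickableBox extends Sprite {
        private var vx:Number = 0; // velocity at release, x component
        private var vy:Number = 0; // velocity at release, y component
        private const FRICTION:Number = 0.95; // per-frame damping (assumed)

        // called when the last touch is lifted from the element
        public function release(velocityX:Number, velocityY:Number):void {
            vx = velocityX;
            vy = velocityY;
            addEventListener(Event.ENTER_FRAME, onFrame);
        }

        private function onFrame(e:Event):void {
            x += vx;
            y += vy;
            vx *= FRICTION;
            vy *= FRICTION;
            if (Math.abs(vx) < 0.1 && Math.abs(vy) < 0.1) {
                removeEventListener(Event.ENTER_FRAME, onFrame);
            }
        }
    }
}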

3.3 TUI Interaction

For tangible interaction, two approaches are possible. On the one hand, a user interface element can listen for fiducial events and react appropriately to them. This means that if any tangible has been placed on top of a UI element, a FiducialEvent is dispatched onto the UI element that contains the fiducial id (a fiducial id stands for a certain type of tangible), the position and the rotation of the tangible. On the other hand, when working with tangibles, a global approach can make more sense.


Fig. 5 Development workflow for multi-touch software [8]

The application itself might want to react to different tangibles in different ways: if, for instance, a tangible with the shape of a keyboard is placed on the tabletop, a keyboard should appear next to the tangible. Hence, in TUIO AS3 it is possible to register global callbacks for different fiducial ids. As soon as a tangible with a fiducial marker whose id is registered as a callback is put on the tabletop, the appropriate global callback function is called (see the lower half of figure 2).
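Conceptually, the global callback registry can be pictured as a map from fiducial ids to handler functions. The sketch below models that behavior in plain ActionScript; it is not the TUIO AS3 registration API, and the fiducial id 42 for a keyboard-shaped tangible is an invented example value.

// Conceptual model of a global fiducial-callback registry (not the actual
// TUIO AS3 API). Fiducial id 42 standing for a keyboard tangible is invented.
import flash.utils.Dictionary;

var fiducialCallbacks:Dictionary = new Dictionary();

// register: when the keyboard tangible appears, show a keyboard next to it
fiducialCallbacks[42] = function(x:Number, y:Number, rotation:Number):void {
    trace("show on-screen keyboard at", x, y, "rotated by", rotation);
};

// would be invoked by the fiducial event tier whenever a tangible is added
function onFiducialAdded(fiducialId:int, x:Number, y:Number, rotation:Number):void {
    var callback:Function = fiducialCallbacks[fiducialId];
    if (callback != null) {
        callback(x, y, rotation);
    }
}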

3.4 Development

Khandkar et al. [8] describe the development workflow of tabletop applications as illustrated in figure 5: the actual development of the application is carried out on a desktop computer, where the results are tested and debugged with the help of a simulator application. As soon as a developer is confident that the results of the development are working, the code/application is transferred to the tabletop system, where the application is tested and debugged in the actual setup. Notably, only small development changes are performed on the tabletop system; for bigger changes the results are transferred back to the desktop computer.

3.4.1 Tools Applications for TUIO AS3 can be developed with common Flash tools like Adobe Flash Builder or Adobe Flash CS5.

3.4.2 Simulation As testing is mainly carried out on a workplace computer, it is important to offer a simulation of touch and tangible interaction that is as complete and intuitive as possible and tailored to mouse and keyboard interaction. To make interaction as direct as possible, TUIO AS3 provides in-app simulation; there is no need for an external simulator application.

For single-touch simulation, users can simply click and drag the mouse in the application. Figure 6 shows a simulated touch represented by a grey circle with additional textual debug information in the upper left. Multi-touch simulation can be achieved via shift-clicking.

Fig. 6 Screenshot of simulation information in a TUIO AS3-based application

If the shift key is pressed while the mouse is clicked, the touch stays in the application even after the mouse has been released, making the touch permanent. Afterwards, a permanent touch can be dragged around. In order to move multiple touches simultaneously, touches can be grouped by pressing the ctrl key while clicking on them. A grouped touch is marked with a red dot. Figure 6 illustrates five grouped touches that represent a whole-hand gesture. If a touch that belongs to a group is dragged, the whole group of touches is moved together. However, this approach only allows the synchronized movement of all touches in one direction. In order to move touches in different directions, for instance to simulate whole-hand interaction like scaling or rotation, additional keys can be used: if the s key is pressed while moving a group of touches, the touches are moved towards or away from the barycenter of the touches, causing a scale gesture. If the r key is pressed, the touches are rotated around the barycenter, causing a rotation gesture. To simulate flicking as it would occur after real multi-touch interaction, pressing the space bar while moving a group of touches releases all touches at once.

To simulate TUI interaction, a tangible can be added to an application by right-clicking with the mouse and choosing a fiducial id from the context menu. A tangible is represented by a rectangle (see figure 6 on the right). A tangible object can be dragged around, and by pressing "r" while dragging it, the object is rotated around its center. Shift-clicking a tangible removes it.

As TUIO AS3 also supports the Windows 7 touch capabilities, simulation can also be carried out on multi-touch hardware that supports this API. Meanwhile, a broad spectrum of Windows 7 multi-touch hardware is available.


Fig. 7 Screenshot of the flicking testing application

Fig. 8 Screenshot of the InfinityBank application

4 Examples and Discussion

4.1 Examples

Several projects have been developed based on TUIO AS3. Amongst others, an application that tests flicking behavior has been created (see figure 7). This application uses the TouchControl as well as the FlickControl to provide the interaction to be tested. Additionally, the application enhances TUIO AS3's naive physics with a snapping behavior control: a touchable object can be given the property of attracting other touchable objects that enter its surroundings. This example shows how the interaction API of TUIO AS3 can provide basic interactivity alongside the creation of new interactivity.
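The snapping behavior can be sketched roughly as follows; this is an illustration of the idea rather than the project's source, and the radius and strength values are assumptions.

// Rough sketch of a snapping behavior: every frame, touchable objects that
// enter the magnet's surrounding are pulled towards its centre.
import flash.display.DisplayObject;
import flash.events.Event;
import flash.geom.Point;

function attachSnapBehavior(magnet:DisplayObject, candidates:Array,
                            radius:Number = 80, strength:Number = 0.2):void {
    magnet.addEventListener(Event.ENTER_FRAME, function(e:Event):void {
        for each (var obj:DisplayObject in candidates) {
            var d:Number = Point.distance(new Point(magnet.x, magnet.y),
                                          new Point(obj.x, obj.y));
            if (d > 0 && d < radius) {
                // pull the object a fraction of the remaining distance per frame
                obj.x += (magnet.x - obj.x) * strength;
                obj.y += (magnet.y - obj.y) * strength;
            }
        }
    });
}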

The InfinityBank project is another example of a TUIO AS3-based application (see figure 8). It can be used by a bank counselor to plan a customer's retirement provisions collaboratively with the customer. Amongst other things, InfinityBank demonstrates how sophisticated visualization widgets can be used in combination with TUIO AS3's interaction API.

4.2 Discussion

TUIO AS3 offers a toolkit for the rapid prototyping of tabletop interaction. In addition to an architecture that translates TUIO messages into an event and callback system, TUIO AS3 offers a high-level interaction API with support for gestural and tangible interaction as well as development and simulation tools. Both will be discussed in the following.

4.2.1 Interaction The gestural interaction API for multi-touch supports two approaches: on the one hand, interaction with the standard gestures drag, scale, rotate and flick is supported; on the other hand, there is a complex gesture API that can be extended by a developer.

While both interaction models are necessary, only one interaction model can work with a given UI element at a time. If, for instance, a UI element is draggable, scalable and rotatable via the standard interaction API, it is currently not possible to apply a complex gesture like a three-finger scroll gesture to it. The element would interpret it as a dragging gesture, because the standard interaction API treats any number of fingers as a drag, scale and rotate gesture in order to allow a user every gesture approach as standard interaction. In future implementations it is conceivable that the number of fingers for performing standard gestures can be constrained by the developer in order to allow additional complex gestures on a UI element.

The TUI API allows a developer to use tangible interaction either with an event model or with a callback system. The event model might be more suitable for local interaction, as the event is directly dispatched onto the UI element underneath the tangible. Thus, tool tangibles are conceivable, like a color correction tangible that can be placed and used directly on top of an image. The callback system could be applied for tabletop-wide interaction: a callback can be registered for a certain tangible when the application starts in order to provide a global reaction when the appropriate tangible is placed on the tabletop. This could be used, for instance, for a global UI element like a semantic magnet that attracts all red elements.

4.2.2 Development and Simulation Multi-touch simulation via mouse enhances existing approaches with new concepts like real simultaneous multi-touch interaction. Thus, gestures like scaling and rotation are also supported by allowing grouped touches to be moved in different directions at once through a combination of keyboard and mouse input.

However, currently only one group of touches is supported. If, for example, two-handed interaction should be supported, at least two groups of touches that can be moved at once would be necessary. The actual multi-touch input also differs between mouse simulation and real multi-touch interaction: on a multi-touch setup, touches are constantly moving a bit, no two gestures are performed in exactly the same way, and gestures are not carried out as precisely as with mouse and keyboard simulation. For instance, with TUIO AS3's standard gesture implementation, a rotation gesture cannot be performed on a real setup without additionally causing a weak scaling gesture and a noticeable dragging gesture. With the current simulation, every gesture is simulated separately.


One approach could be to add some kind of fuzziness to the mouse simulation input. Alternatively, standard and complex gestures could be recorded on a multi-touch setup and replayed in the simulator for testing purposes. For instance, if developers wanted to test a rotation gesture on a UI element, they would choose "rotate gesture" from a menu, and one rotate gesture from a set of recorded rotate gestures would be performed on the element.

Tangible simulation via mouse has not been possible so far; it allows basic tangible interaction like dragging and rotation of tangibles to be tested. Currently, the adding of tangibles is rather technical: to add a tangible to a running application, the appropriate fiducial id of the tangible must be chosen from a context menu. As different tracking systems allow a lot of fiducial ids to be used (for instance, reacTIVision supports up to 216 fiducials by default), choosing the appropriate id from the list can take a while. Additionally, a developer has to know the fiducial id of every tangible in the application. As the callback system for tangible interaction is conceived as a system where each kind of tangible has its own callback class, it would make sense to list in the context menu the callback names to which fiducial ids are registered in the system, in order to provide quick access to relevant tangibles via a meaningful name.

5 Conclusion

TUIO AS3 offers a toolkit for the rapid development of multi-touch and tangible user interface (TUI) interaction in Adobe Flash. TUIO AS3's architecture supports TUIO messaging, the Windows 7 touch capabilities and mouse interaction. The toolkit comprises APIs for multi-touch and tangible interaction and, for development purposes, multi-touch and tangible simulation via mouse and keyboard. TUIO AS3's multi-touch API comprises a low-level event- and callback-based interaction API and a high-level gestural interaction API that supports standard gestures like drag, scale and rotate as well as complex gestures like three-finger scroll or one-finger-down-one-finger-move gestures. The standard gesture system calculates the appropriate interaction from an arbitrary number of fingers on a UI element. Thus, in addition to common two-finger gestures, whole-hand and two-hand interaction can also be performed to drag, scale and rotate a UI element. The complex gesture system already supports several gestures, but it can also be extended easily by a developer with a simple grammar or by building on existing gestures. The TUI API supports both event-based and callback-based development of tangible interaction. Multi-touch and TUI input can be simulated with mouse and keyboard directly in a TUIO AS3-based application, without the need for an additional simulator application. TUIO AS3 allows whole-hand rotation and scaling gestures to be simulated by moving different touches simultaneously in different, appropriate directions.

TUI simulation supports standard manipulation like dragging and rotation of tangibles.

References

1. Adobe Systems. Animation software, multimedia software — Adobe Flash Professional CS5. http://www.adobe.com/products/flash.

2. Echtler, F. Library for Tangible Interactive Surfaces for Collaboration between Humans. http://tisch.sourceforge.net/.

3. Fraunhofer Institute for Industrial Engineering. MT4j - Multitouch for Java. http://www.mt4j.org.

4. Hansen, T. E., Hourcade, J. P., Virbel, M., Patali, S., and Serra, T. PyMT: a post-WIMP multi-touch user interface toolkit. In Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces (New York, NY, USA, 2009), ITS '09, ACM, pp. 17–24.

5. Ideum. GestureWorks - Multitouch Authoring for Flash and Flex. http://gestureworks.com.

6. Kaltenbrunner, M., and Bencina, R. reacTIVision: a computer-vision framework for table-based tangible interaction. In TEI '07: Proceedings of the 1st international conference on Tangible and embedded interaction (New York, NY, USA, 2007), ACM, pp. 69–74.

7. Kaltenbrunner, M., Bovermann, T., Bencina, R., and Costanza, E. TUIO - A Protocol for Table Based Tangible User Interfaces. In Proceedings of the 6th International Workshop on Gesture in Human-Computer Interaction and Simulation (GW 2005) (2005), pp. 1–5.

8. Khandkar, S. H., Sohan, S. M., Sillito, J., and Maurer, F. Tool support for testing complex multi-touch gestures. In Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces (2010).

9. Kiriaty, Y. MultiTouch Capabilities in Windows 7. http://msdn.microsoft.com/en-us/magazine/ee336016.aspx, 2009.

10. nuigroup. Touchlib - Home, October 2008.

11. Shen, C., Ryall, K., Forlines, C., Esenther, A., Vernier, F. D., Everitt, K., Wu, M., Wigdor, D., Morris, M. R., Hancock, M., and Tse, E. Informing the Design of Direct-Touch Tabletops. IEEE Computer Graphics and Applications 26, 5 (2006), 36–46.

12. Wright, M., Freed, A., and Momeni, A. OpenSound Control: state of the art 2003. In NIME '03: Proceedings of the 2003 conference on New interfaces for musical expression (Singapore, Singapore, 2003), National University of Singapore, pp. 153–160.

13. Wroblewski, L. Lukew - touch gesture reference guide. http://www.lukew.com/ff/entry.asp?1071, 2010.

14. Wu, M., and Balakrishnan, R. Multi-finger and whole hand gestural interaction techniques for multi-user tabletop displays. In UIST '03: Proceedings of the 16th annual ACM symposium on User interface software and technology (New York, NY, USA, 2003), ACM, pp. 193–202.

