
Visual Effects Management in a Mobile User Interface

Andreas Larsson
[email protected]

Alexander Klintström
[email protected]

April 10, 2006
Master's Thesis in Computing Science, 2*20 credits

Supervisor at CS-UmU: Pedher Johansson
Examiner: Per Lindström

Umeå University
Department of Computing Science

SE-901 87 UMEÅ
SWEDEN


Abstract

This thesis investigates how visual effects can be handled in a mobile graphical system and in what way the utilization of visual effects can be simplified. Potential problems are identified and possible solutions are presented. The thesis then explores the possibility of having visual effects and transitions in the user interface managed at a high abstraction level, using an event-driven architecture to control the visual manipulation of graphical objects. How the visual manipulation is carried out is controlled by an XML interface: events on certain objects are mapped to visual effect actions, also defined in XML. The proposed architecture is a client/server-based approach where clients can load XML files describing visual behavior and then trigger events on objects when their state changes. The XML interface is designed with great focus on simplicity; with designers as target users, the syntax is kept similar to the terminology that designers use.


Acknowledgments

We would like to thank our external supervisor, Peter A Nilsson at Sony Ericsson, for making this very interesting thesis possible and for all the support and ideas contributed along the way. Thanks to our internal supervisor, Pedher Johansson at Umeå University, for helping with our report. Also, a big thanks to Srdan Boskovic and everyone else at the Graphics department at Sony Ericsson for being helpful and contributing to interesting discussions during the coffee breaks.

Lund, March 20, 2006

Alexander Klintström

Andreas Larsson


Contents

1 Introduction
   1.1 Background
   1.2 Objectives
   1.3 Gain
   1.4 Outline

2 Mobile Graphics
   2.1 Mobile User Interfaces
   2.2 Compositor
      2.2.1 Quartz Compositor
      2.2.2 X Composite Extension
      2.2.3 Desktop Window Manager
      2.2.4 OpenGL ES
   2.3 Vector Graphics
      2.3.1 SVG Tiny
      2.3.2 OpenVG

3 Visual Effects Engine
   3.1 Visual Effects Today
      3.1.1 Realization Process
      3.1.2 Problems
   3.2 Approach
      3.2.1 Fundamental Requirements
      3.2.2 Graphical System Overview
      3.2.3 Involved Units
      3.2.4 Case Studies
   3.3 Architectural Design
      3.3.1 Client Interface
      3.3.2 XML Parser
      3.3.3 Internal Data Representation
      3.3.4 Internal Structure
   3.4 Client Usage
      3.4.1 Quick Start
      3.4.2 User Interface Integration
      3.4.3 Examples

4 Visual Effects Markup Language
   4.1 XML
   4.2 FXML Tags
      4.2.1 fxml
      4.2.2 include
      4.2.3 object
      4.2.4 event
      4.2.5 effect
      4.2.6 defineEffect
      4.2.7 position
      4.2.8 scale
      4.2.9 opacity
      4.2.10 positionKeyframe
      4.2.11 scaleKeyframe
      4.2.12 opacityKeyframe
      4.2.13 defineConstants
      4.2.14 constant

5 Result and Conclusion
   5.1 Limitations
   5.2 Further Work

References

A Acronyms

B IVisualEffectsEngine.idl

C VisualEffectsEngine_types.idl

D Visual Effects DTD


List of Figures

1.1 Triangular relationship among involved units and their exchange of information.
2.1 A simple illustration of the compositor approach in a window system.
2.2 The compositor approach as it is used in a modern window system.
2.3 SVG Tiny is a subset of SVG Basic, and SVG Basic is a subset of SVG.
3.1 Illustration of the layering in the graphics system.
3.2 The Pop-up user interface object.
3.3 The Top Menu user interface object.
3.4 System overview with the new architecture in place.
3.5 The three main parts of the architecture.
3.6 UML diagram over the internal IDL structure of the architecture.
3.7 A pop-up sliding in from above and out below.
3.8 A Top Menu explode animation.
4.1 Different anchor points for the scale base effect.


Chapter 1

Introduction

The market for mobile devices is becoming more and more demanding when it comes to the originality and uniqueness of user interfaces [8, 13]. A user interface can become more unique and original by manipulating its appearance; in some mobile devices this manipulation is achieved with themes. Themes are common and sufficient for making a mobile device personal, but when it comes to full customization (manipulation of both appearance and behavior), themes are not enough and need to be extended in some way.

The goal of this thesis is to define an architecture for event-driven visual effects that can be used in a mobile phone user interface. This new architecture should be able to improve the customization of user interfaces and simplify the realization process of visual effects.

1.1 Background

Today, mobile phones have enough computing power to perform advanced visual manipulation of graphical objects. This opens up the possibility of letting objects in the user interface not only look, but also behave, in a certain way. Several different techniques and customization processes are being investigated and used to fulfill the requirements for a unique and original user interface, from the basic themes concept to Flash and 3D user interfaces [11].

When examining the user interfaces of currently available mobile phones from Sony Ericsson Mobile Communications [1] (SEMC), visual effects are already a reality. However, the current management of visual effects is rather static, and altering an existing visual effect is quite demanding, as source code must be rewritten. It would be desirable to find a way to eliminate the need to alter source code and, by doing so, improve the development process.

Another problem with the management of visual effects is that it only applies to the built-in user interface system and not to the whole graphical system. This means that other graphics-related technologies that live inside the mobile phone do not get the same behavior. Examples of such technologies are web browsers, Java [17], Flash [9] and media players. This lack of an overall picture may create discrepancies between components: the quality of some graphical components is very high, but when moving somewhere else in the application flow, the quality deteriorates radically.

It is therefore interesting to find a way to relate graphical components among applications that use different technologies. Besides relating the components to one another, it is equally important to be able to show that an object has entered a certain state. For instance, in the case of two input objects presented on the screen at the same time, it would be desirable to have some kind of graphical notification when interaction focus shifts from one input object to the other.

There are several advantages to relating components: smoother transitions among the components (for instance, a fade-in effect on a Flash top menu component when entering from a screen idle mode that has a Java applet running in the background), and visual effects on active graphical components that may improve usability (for instance, a visual delete effect instead of the text feedback message currently used in products when deleting a list element).

1.2 Objectives

Some mobile phones are already equipped with Graphics Processing Units (GPUs), and it can be expected that, in the near future, the use of GPUs in mobile phones will increase along with the need to run demanding applications. Not only do GPUs make it possible to have hardware-accelerated 3D graphics, they also enable acceleration of the user interface, making it feasible to do more advanced graphical manipulation without increasing the load on the Central Processing Unit (CPU). A GPU will in fact relieve some of the workload from the CPU and let it concentrate on other things. However, there are still some problems with current mobile GPUs, making them less usable in the user interface [11].

Along with graphical power, new demands arise for a more complex graphical system containing more visual effects. These demands create dependencies among designers, user interface architects and graphics developers, which are the groups of people behind creating and realizing visual effects. These three groups must agree on what, when and how visual effects are managed.

This report will refer to these groups as units called Design, User Interface (UI) and Graphics. The relationship among these three units is triangular, as illustrated in Figure 1.1.

The information flow among the three units can be described in the following way. Design needs to be handed an interface that defines what they can do with the graphics and when. The what part needs to be agreed on by finding out what the graphics can do and what Design would like to be able to achieve. Next, Design needs to know when they can invoke these effects. This part of the interface will be defined by finding out what information UI can deliver and what information Design would like to use. Finally, Graphics needs to know what effect to perform and where. This information needs to come from Design and UI respectively.



Figure 1.1: Triangular relationship among involved units and their exchange of information.

This thesis project aims to add a fourth part, right in the middle of the triangle, as also illustrated in Figure 1.1, making the communication smoother among Design, UI and Graphics. What the different units can expect to gain from this new architecture is discussed in Section 1.3.

Once the interface design is agreed upon, by investigating the requirements and abilities of the involved units, a prototype should be implemented to demonstrate the functionality. This implementation covers an interface towards designers, in Extensible Markup Language (XML) format, that defines visual effects on graphical components. It also includes implementing an interface towards the user interface and other potential clients. All communication towards the graphics will pass through existing frameworks, so there is no need to implement an interface there.

Eventually, the work put into this project could improve the usability and customization of mobile user interfaces.

1.3 Gain

Here follow some thoughts on what the units described earlier can expect to gain from this new architecture.

– UI should be relieved of code and logic directly associated with visual effects. They should also not have to rewrite code when demands for new effects arrive, only when new objects and events are requested.

– Graphics is able to provide a higher abstraction for visual effects, and the code associated with visual effects can live inside the unit. The unit can easily provide new effects on existing objects; only an agreement between Design and Graphics is needed.

– Design will be handed a tool to define visual effects on objects. They will become more involved in the actual creation of visual effects, and the tool can be referred to when requesting objects, events and base effects. Designers will immediately be able to see the visual effects and how performance is affected on the mobile phone.


1.4 Outline

Chapter 2 presents a survey of the mobile graphics used on handheld devices, with focus on mobile user interfaces and how they compare to a desktop computer environment. The concept of having a compositor is explained, along with possible connections to graphics hardware. Then there are some thoughts about vector graphics and how it can be used to create a better user interface. Chapter 3 describes the realization of the Visual Effects Engine architecture, from idea to a working prototype. This architecture is programmable through an XML interface and is supposed to handle visual effects and transitions in a mobile graphical system with the help of an animation interface towards a compositor. Chapter 4 presents the Visual Effects Markup Language (FXML), which is used together with the Visual Effects Engine to define the visual effects. Chapter 5 presents the results achieved with this project and summarizes our final conclusions.


Chapter 2

Mobile Graphics

This chapter presents mobile graphics techniques which can be used for creating a richeruser interface on small handheld devices.

Mobile graphics is heading in much the same direction as desktop computer graphics. The compositor approach, which is currently breaking its way into all the popular desktop systems, is already being exploited on mobile devices [11]. This approach opens up opportunities to take graphics hardware that was first intended for accelerating games and use it to enrich the user interface. Just as games and multimedia have driven the evolution of desktop graphics technologies, the same is currently happening with graphics on mobile devices, but at a much higher rate.

Another hot topic in desktop computer graphics is the breakthrough of using vector graphics in the user interface. There are several advantages to vector graphics: it makes better use of the pixels, as it is scalable and can be rendered at any size with the best possible quality. The advantages become even more significant in a mobile user interface, where the limited screen resolution makes the pixels more valuable.

When new standards for mobile graphics are prepared, much of the work has already been done for desktop computers, and it is often just a matter of stripping down an existing desktop standard to make it fit the smaller mobile footprint [15]. In this way, things that turned out to be bad solutions in the existing desktop standard can be left out of the mobile version, thereby learning from previous mistakes.

2.1 Mobile User Interfaces

One big difference between mobile and desktop graphics is resolution and screen size: mobile devices have the physical limitation of always being as small as possible, and the fact that desktop screens have become larger over time only makes this difference even bigger. This influences how user interfaces are designed on mobile devices in comparison with desktop systems. For instance, in a mobile user interface, applications tend to take up the entire screen area, as most of them run in fullscreen, as opposed to an application in a desktop environment, where it is often up to the client to decide the size of the application and whether it should be in fullscreen or not.

This difference becomes important when discussing visual effects in mobile user interfaces, as visual effects become more desirable within the application rather than on the application itself. Visual effects are also an important feature of mobile user interfaces, as they can compensate for a small screen by increasing the understanding of how interaction is carried out in the user interface [11].

2.2 Compositor

Traditionally, graphical window systems let their clients draw their content directly to the display frame buffer. Where and when the clients should draw is controlled by the window system with assistance from a window manager; the exact role of the window manager differs among window systems. In any case, a client gets notified when its window, or some area of it, needs to be redrawn; the client then draws its content, erasing whatever was there before. This makes it very hard to enrich the user experience with transparency and visual effects between windows. It is still possible to do some transparency and visual effects, but it gets complicated when clients must be aware of the rendering mechanism. True transparency is, however, not possible, as windows occluded by a non-fully-opaque window do not get updated.

With the compositor approach, introduced in recent desktop window systems [5, 4, 18], clients draw their content to off-screen pixmaps in the window system. The compositor, which is a part of the window system, then takes these pixmaps and draws them to the display frame buffer, as illustrated in Figure 2.1.


Figure 2.1: A simple illustration of the compositor approach in a window system.

By adding an extra alpha channel to every off-screen pixmap, describing per-pixel transparency, it is possible to have windows in all possible shapes and levels of transparency. How this information is used in the compositing stage was formally defined by Thomas Porter and Tom Duff in 1984 [14]. Typically, the rendering is carried out in back-to-front order and all windows are rendered using the "over" operator:

$$C_{\mathrm{result}} = C_{\mathrm{under}} \cdot (1 - \alpha_{\mathrm{over}}) + C_{\mathrm{over}} \cdot \alpha_{\mathrm{over}}$$
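As a small worked example (the numbers are chosen for illustration and are not from the thesis): compositing a fully white window at 50% opacity over a black background gives, per color channel,

$$C_{\mathrm{result}} = 0 \cdot (1 - 0.5) + 1 \cdot 0.5 = 0.5,$$

i.e. a mid-gray pixel. Since rendering proceeds back to front, the output of one "over" operation becomes $C_{\mathrm{under}}$ for the next window in the stack.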

By having a control interface to the compositor, it is possible to do advanced per-pixel manipulation and transformations in the compositing stage. The only limitation on which effects can be carried out is the fact that visual effects are very CPU-consuming. However, the compositor approach fits perfectly with the operations provided by recent GPUs. Instead of having clients render to system memory, they can render their content to textures stored within the graphics memory. Once the client's content is in the graphics memory, visual manipulation of textures can be seen as free compared to letting the CPU do all the rendering.


Figure 2.2: The compositor approach as it is used in a modern window system.

Currently, the compositor technique is being adopted by all the popular desktop window systems: the Quartz Compositor in Mac OS X, the X Composite Extension in the X Window System, and the Desktop Window Manager in Microsoft Windows.

When it comes to compositing in mobile graphics, things become more complicated due to limited resources in both computing power and graphics hardware. The main 3D graphics standard for mobile graphics is OpenGL ES, and the initial version, which is currently deployed in mobile graphics hardware, has some limitations with great impact if the intention is to render a user interface. This is because OpenGL ES was first developed with games as the main purpose of use. The limitation on texture sizes is one of the problems: a texture in OpenGL ES cannot be larger than 64×64 texels, and its dimensions must be powers of two (2, 4, 8, 16, etc.). This makes the handling of content data very clumsy, and some tiling mechanism needs to be implemented to make efficient use of the graphics memory. OpenGL ES does have support for texture compression, which could be utilized to keep down the size of textures. However, this is not usable in a user interface implementation, where textures need to be updated very often, as opposed to games, where a texture can be uploaded once and then not change during the entire life cycle of the game. These problems do not exist in version 2.0 of OpenGL ES, which is a much more complete standard and thereby requires much more from the hardware. This is why there is no such hardware on the market yet.

2.2.1 Quartz Compositor

Apple was early: the Quartz Compositor was released in March 2001 along with Mac OS X v10.0, although Quartz Extreme, released with Mac OS X v10.2 in August 2002, was the first version that used graphics hardware for the compositing [5]. This boosted performance remarkably, and ever since, Apple has been the leader in eye candy and visual effects in the desktop user interface.


2.2.2 X Composite Extension

With the X Window System, the X Composite Extension was released in October 2003, defining a protocol extension [4] that enables clients to become compositing managers and redirect other clients' output to off-screen pixmaps. These pixmaps can then be used in the compositing stage when the screen needs to be repainted. In this way, an ordinary client can act as a compositing manager. This opens up the possibility of changing the way the screen is composited by changing the compositing manager, in the same way as it is possible to change the window manager in the X Window System. The compositing is carried out by rendering all visible redirected windows to one single visible window that is not redirected.

The X community is currently working on ways to let compositing managers reach down to the hardware, preferably through an OpenGL interface, and utilize it for compositing [10]. The trick is to let clients connect windows to OpenGL textures. This is done through a new extension in the GLX Application Programming Interface (API), which makes it possible to render the entire screen using the OpenGL API.

2.2.3 Desktop Window Manager

Desktop Window Manager (DWM) is the name of the compositor that is supposed to be delivered along with the next version of the popular Windows operating system, Windows Vista [18].

2.2.4 OpenGL ES

OpenGL ES (OpenGL for Embedded Systems) is a subset of the desktop OpenGL 3D graphics API [7]. The main purpose of OpenGL ES was to create a flexible low-level interface between software and graphics hardware. OpenGL ES is a royalty-free API defined and promoted by the Khronos Group, an industry consortium that focuses on creating open APIs for graphics and multimedia.

OpenGL ES 1.0 was created based upon OpenGL 1.3. To decrease complexity, much of the functionality was left out, and a little was added. One addition is the support for profiles: version 1.0 of OpenGL ES includes two profiles, Common and Common Lite. The Common Lite profile is the most stripped-down one and is a subset of the Common profile.

OpenGL ES introduces the use of fixed-point types alongside floating point for vertex coordinates and attributes, to better support small embedded platforms, which often lack hardware floating-point operations. The Common profile supports both fixed-point and floating-point types, while the Common Lite profile only supports the new fixed-point types.
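As a brief illustration of the fixed-point idea (the signed 16.16 layout of the GLfixed type is taken from the OpenGL ES specification, not from this thesis; the example value is our own): a real number $x$ is stored in a 32-bit integer scaled by $2^{16}$,

$$x_{\mathrm{fixed}} = \mathrm{round}(x \cdot 2^{16}), \qquad \text{e.g. } 1.5 \mapsto 98304,$$

so fractional values can be processed with plain integer arithmetic on platforms without a floating-point unit.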

Another significant removal is the old glBegin/glEnd call semantics, in favor of vertex arrays, which are a far more efficient way of passing data through the API. Primitives such as quads and polygons were also left out, because they added unnecessary complexity to the API when they can easily be made up of triangles, the only supported polygonal primitive.

2.3 Vector Graphics

Using vector graphics in mobile applications has some advantages; for example, images in vector format can be smaller than the same images in a raster-based format such as GIF. A raster-based image format encodes the color of each pixel, while a vector-based format only contains drawing instructions that determine the color of the pixels. Another big advantage of vector graphics is scalability: all images in vector format are scalable, which is good when dealing with mobile devices. Scalability enables clean transformations on images in vector format because drawing instructions are resolution-independent.

Using vector graphics does not mean that there is no use of bitmaps. The display on a mobile device is a raster device, meaning that vector graphics has to be translated into bitmaps that can be displayed. This translation is done at the last possible moment, once all sizes and the resolution are specified.

A common approach is to utilize vector graphics in the rendering of text, thereby achieving scalable fonts, which can provide characters at any size. Scalable fonts are sometimes called outline fonts, because the most common method of representing scalable fonts is to define the outline of each character.

Hardware-accelerated vector graphics in mobile devices should be possible in the near future through the use of OpenVG [7]. If hardware-accelerated vector graphics becomes a reality, it would be most desirable to have a mobile user interface that is completely vector-based, so that the hardware acceleration benefits the entire interface once it is switched on.

2.3.1 SVG Tiny

Scalable Vector Graphics (SVG) [6] is a vendor-independent 2D vector graphics standard defined by the World Wide Web Consortium (W3C) [3]. SVG comprises an XML-based format and a programming API which can be used by graphical applications.

SVG Tiny is one of two profiles created to meet the demand for an SVG better suited to displaying vector graphics on small devices. Mobile devices also differ in characteristics such as CPU speed, memory size and color support. Because of this, the W3C [3] defined two SVG profiles for mobile devices: one low-level profile, SVG Tiny (SVGT), suited for example to mobile phones, and one high-level profile, SVG Basic (SVGB), suited for example to PDAs.

SVG Tiny is designed to allow SVG to render on a mobile device and takes into consideration the memory, CPU power and bandwidth limitations. So, with the use of SVGT, one can use SVG in a mobile phone.

Figure 2.3: SVG Tiny is a subset of SVG Basic, and SVG Basic is a subset of SVG.

Mobile devices differ in screen resolution, and when using an image in a raster-only format, for example PNG or JPEG, there must be one image for every resolution for the result to be consistent. With SVG, there only needs to be one file describing the image with the use of geometric objects, for example lines and curves. These geometric objects are then rasterized to the desired resolution.
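For illustration, a minimal SVG Tiny document of this kind could look as follows (a hypothetical example, not taken from the thesis); the drawing instructions are resolution-independent and are rasterized to whatever size the display requires:

    <svg xmlns="http://www.w3.org/2000/svg" version="1.1"
         baseProfile="tiny" viewBox="0 0 100 100">
      <!-- Coordinates live in an abstract 100 x 100 space and
           scale cleanly to any screen resolution. -->
      <circle cx="50" cy="50" r="40" fill="#3060c0"/>
      <line x1="10" y1="90" x2="90" y2="10" stroke="black" stroke-width="2"/>
    </svg>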

The content of an SVG can be changed using features like rotations and dynamic scaling. SVG can have video or audio elements specified using xlink:href to link to the content, and in most cases the timing features of SMIL [16] are used to start and stop at the right time. SVG is a time-based language, meaning that the frame rate of an animation is calculated at the time the animation is shown on screen, which eliminates the need for frame dropping.

SVG can be used for almost anything, for instance maps and user interfaces. To view SVGs, one needs an implementation of the standard. Some of the most popular SVGT and SVGB implementations come from Ikivo [2] and Bitflash [12].

To conclude, SVG should be used to create user interfaces for mobile devices, as it can provide a good user experience.

2.3.2 OpenVG

OpenVG is a royalty-free open standard API designed by the people behind OpenGL ES, the Khronos Group [7]. OpenVG is a step towards hardware-accelerated vector graphics: an API for low-level 2D vector graphics rendering (version 1.0 of the specification was released on August 1, 2005).

OpenVG is aimed primarily at mobile phones, PDAs, game consoles [7] and other devices with small screens (at least 128 × 128 non-indexed RGB color pixels with 4 or more bits per channel [7]), but it should be possible to use OpenVG on any device capable of supporting OpenGL ES 1.1 [7]. In the future, hardware manufacturers should be able to provide acceleration for OpenVG functionality.


OpenVG will enable manufacturers to create better-looking user interfaces using vector graphics that should also be less dependent on energy-consuming CPUs. Vector graphics provides an easy way to scale content, with high-quality rendering at different screen sizes, without the use of multiple bitmaps. Because of this scalability, it becomes easier to port content from one device to another.

Today's desktops use vector graphics through packages such as Flash and SVG. OpenVG is suited to accelerating both Flash and SVG sequences; for instance, OpenVG must provide efficient ways to implement all SVGT features [7].


Chapter 3

Visual Effects Engine

This chapter describes the realization of the Visual Effects Engine, a prototype archi-tecture for handling visual effects and transitions in a mobile graphical system.

The current ways of handling visual effects are explained and related problems areidentified. The approach to these problems and the proposed solutions are presented. Adesign of the architecture and its interface is described. Client usage of the architectureis described with focus on the user interface.

3.1 Visual Effects Today

There are currently two different ways to apply visual effects to elements in the graphical system. One way, the older way of doing things, is to utilize a graphics library which operates on canvases (arrays of pixels). This method requires direct access to the frame buffer while performing animations. This requirement imposes some limitations on how visual effects can be carried out, and it pretty much eliminates the possibility of having concurrent animations running on separate objects occupying the same screen area. The other, newer way is to utilize an animation interface towards the compositor in the window system. This makes it possible to animate different properties of windows and sub-windows.

In current user interfaces, visual effects and transitions are handled in a somewhat static fashion. Changing the visual behavior of an object in the user interface requires the software for that component to be modified, even for small changes like tweaking the speed of an animation. This means that when designers come up with new demands on how graphical objects should behave, software needs to be rewritten and verified. This becomes a very tedious process, as many units get involved.


3.1.1 Realization Process

When visual effects are realized, roughly three different groups of developers are involved: designers, user interface developers and graphics developers. A big part of the development is about communication among these groups; here follows an explanation of how this communication could look.

1. Designers gather information from Graphics on what is possible to do with thecurrent graphical system.

2. Designers start the design phase and produce a prototype in some high-level graphical tool, e.g. Flash™. This prototype is then delivered to the user interface developers.

3. User interface developers analyze the prototype and divide it into several smaller problems, which are then adjusted to fit their graphical components. Some functionality is often left out when it becomes too difficult to integrate into the system.

4. User interface developers then request tools and specific solutions from graphicsdevelopers to realize the visual effects.

5. Graphics developers develop and deliver tools and specific solutions for user inter-face developers.

6. User interface developers present the result to designers.

7. Designers then compare the result with the delivered prototype; if they are not satisfied, further communication and work are needed.

3.1.2 Problems

Here follows a list of the problems identified with this way of handling visual effects.

– Overhead in communication. As described in Section 3.1.1, the realization of visual effects involves three groups of people: designers, user interface developers and graphics developers. When designers come up with new demands on the user interface that involve visual effects, instead of requesting visual effects directly from the graphics developers, they must always make their requests via the user interface developers. Another aspect is that different groups of developers often have their own terminology; sometimes the same word means two different things depending on who you ask.

– Designers often want all or nothing. When a designer has requested a certain behavior for a graphical object and only a part of it is realizable, it may be better to leave it all out, as a partial realization results in inconsistency in the user interface. This is difficult for a developer to know when the whole problem is broken down into small pieces and realized one at a time.


– Short-term solutions. When problems are broken down and solved by different developers, it can be hard to see the whole picture. The result is often a non-general solution, which can be hard to reuse when demands for changes arise. Lack of time is also a contributing factor.

– Graphics-related code ends up in the user interface developers' code base, and it is then up to them to maintain it. This is a problem, as the code is often written by graphics developers as a solution to a specific problem.

– User interface developers must deal with extra logic related to visual effects. For example, when an object should slide in from the right, the object must first be positioned to the right, outside the screen, and then be animated to its final position.

– Transitions between applications, and sometimes between different technologies, are not possible, as there is no framework that handles visual effects between arbitrary graphical objects.

3.2 Approach

The first thing decided was which component in the graphical system this new architecture should operate on. After investigating the graphical system and taking all fundamental requirements into consideration, it was obvious that the best components to operate on were the windows in the window system. Choosing a higher level, such as user interface components, would mean that only applications within the user interface could utilize the architecture. On the other hand, choosing a lower level and bypassing the window system, thus operating directly against the graphics library, would make it much more difficult to control the visual effects, as there is no satisfying way to point out, from a high level, which graphical element on the screen should be manipulated.

As there are several groups of developers who could make use of this new architecture, it was important to get a feel for what their expectations were, which problems they saw as the biggest, and what type of interface they could work with.

Then some case studies were made, partly to identify tricky situations but also to get a feel for how our proposed architecture could produce the same visual experience as the current user interfaces used in SEMC phones.

3.2.1 Fundamental Requirements

This section describes requirements that were stated at an early stage of the design phase; many of them are results of the problems associated with the way visual effects are currently handled.

– Utilizing visual effects through this architecture should be possible for all applications and processes in the system, not only for the user interface. This requirement has a great impact on which component in the graphical system the architecture should operate on.


– Designers should be able to make their own visual effects. It is therefore importantthat defining visual effects becomes an easy task, not requiring any programmingskills.

– Clients of the architecture should be able to share visual effects with other clients, making it possible to inherit default behavior for certain objects and extend it with self-defined behavior when needed.

– The architecture should have a well-defined interface; a requirement was that it be implemented using the Interface Definition Language (IDL).

3.2.2 Graphical System Overview

The graphical system consists of several layers. Each layer provides some important functionality to higher layers by abstracting details of the underlying layer. This overview presents a very simplified, more theoretical version of the graphical system and focuses on the layers used in the user interface; see Figure 3.1.


Figure 3.1: Illustration of the layering in the graphics system.

The user interface can be seen as one application among others on top of the graphical system. It consists of user interface components that are created and destroyed dynamically when navigating the user interface; such a component could, for example, be a list or a dialog. These components in turn consist of one or several windows, used for displaying their content on the screen through the window system.

The window system provides basic windowing functionality in the system. It runs in a separate process with its own context, encapsulating the underlying compositor. The compositor has a control interface, accessible through the window system interface; the compositor approach is described in Section 2.2. The compositor communicates with the hardware through some kind of graphics library.


3.2.3 Involved Units

The units Design, User Interface and Graphics are the three units discussed in this thesis, consisting of designers, user interface developers and graphics developers respectively. At an early stage of the design phase there were some meetings with designers and developers. The intention of these meetings was to discuss their wishes and expectations for the architecture, which then had a great impact on how the architecture was shaped.

Graphics

Graphics is probably the unit responsible for maintaining this architecture if it becomes reality. Their expectation was therefore to be able to provide an architecture that abstracts the management of visual effects; visual effects code should then only need to exist within Graphics.

User Interface

User Interface is a large unit and is responsible for many modules in the system. This architecture could have a great impact on how visual effects and transitions are managed in the future. A big issue for User Interface is the question of responsibility: they would like to see this architecture take over some of the extra work generated by having visual effects in the user interface. There is also extra logic directly related to visual effects that they would gladly hand over to Graphics; for instance, when a list item is to be animated from the edge of the screen to its destination position, an extra positioning of the object is required. Preferably, User Interface wished the architecture to take over all logic in the list layout, which is probably the most complex object in the user interface. However, that would be the job of a layout manager and was not the intention of this architecture.

Design

One main goal of this architecture was to make designers more involved in the creation of visual effects by giving them a tool to define visual effects in a high-level language.

The first impression from the meeting with the designers was that they had a completely different way of looking at visual effects and transitions: the focus was entirely on what they wanted to be able to do, and not so much on what is possible.

After explaining the intention of the architecture and what it was supposed to solve, some of the questions could be answered. The designers liked the idea of being able to define visual effects by combining a set of base effects, with the freedom to specify their own keyframes for each base effect.

Here follows the list of base effects that the designers wanted the architecture to handle; a sketch of how some of them could be combined into a complete effect follows the list.


– Move - Animate the position of an object.

– Scale - Animate scaling of an object.

– Opacity - Animate the level of transparency.

– Rotate - Animate rotation.

– Skew - Animate the skewing of an object.

– Colorize - Animate a color change of an object.

– Blur - Animate the amount of blur on an object.
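The following sketch shows how two of these base effects could be combined into one keyframed effect definition, using tag names from the FXML tag set described in Chapter 4 (defineEffect, position, opacity and their keyframe tags). The attribute names and values are illustrative assumptions, not the actual FXML syntax; see Chapter 4 and Appendix D for the real definition.

    <defineEffect name="slideFadeIn">
      <!-- Animate the position: start above the destination and slide
           down over 300 ms (attribute names are assumed). -->
      <position>
        <positionKeyframe time="0"   x="0" y="-100"/>
        <positionKeyframe time="300" x="0" y="0"/>
      </position>
      <!-- Fade from fully transparent to fully opaque over the same interval. -->
      <opacity>
        <opacityKeyframe time="0"   value="0.0"/>
        <opacityKeyframe time="300" value="1.0"/>
      </opacity>
    </defineEffect>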

Another big question was whether the designers could accept the XML format that was supposed to be the base of the high-level language in which visual effects are defined. To be more specific: could they take this format and start writing their own visual effects once the architecture is ready? The answer from the designers was that they were familiar with XML but uncertain whether they could start writing effects with it; a tool for generating such XML files was desirable, preferably a graphical tool with drag-and-drop functionality. Such a tool is outside the scope of this thesis, although it would probably not be too hard to create.

3.2.4 Case Studies

The goal of the studies was to identify elements in the user interface and the events that could be used to generate visual effects. Most of the cases studied were based on visual effects currently used in the SEMC user interface. When a case had been studied and confirmed, the next step was to come up with other effects that could be achieved by just modifying the current effect definitions. Studies of the Pop-up and the Top Menu, also known as the "Desktop", are described below.

Pop-up

Figure 3.2: The Pop-up user interface object.


The Pop-up is a really simple object, consisting of just one window. The events that would be significant to generate are probably an "Enter" on creation and a "Leave" on destruction. However, a Pop-up is used in many different situations, so it is probably not enough to only identify it as a "Popup"; for example, an "AlarmPopup" for the alarm and an "InfoPopup" for an information dialog.

Top Menu

The Top Menu is really an ordinary list with a different layout, where the list items are arranged in a grid. It consists of the following objects, windows and sub-windows.

Figure 3.3: The Top Menu user interface object.

– Main Window - The window containing all the following components of the Top Menu.

– List Item - One for every entry in the Top Menu, visible only when the entry is not selected. When the entry gets selected, this item is replaced by the Selected List Item with a nice scale-and-fade transition.

– Selected List Item - One for every entry in the Top Menu, only visible for the selected entry, as described above.

– Background Highlight - Visible behind the selected entry; moved smoothly when the selected entry changes.

When the Top Menu is created, an explode effect is carried out for all visible list items. The list items are animated from the position of the selected list item to their final positions, scaled up and faded in at the same time. Defining this visual effect with XML seemed like a big challenge and had a great impact on how the architecture should operate.

Then the question is which events should be generated, and on which objects. Here follows a list of visual effects that should be generated on objects, with some proposals for events that could be used.


Intro effect - To be able to achieve some kind of intro effect, all visible items must get the "Enter" event in the same function call, and there must also be a way to find out which item was selected. The highlight should also get the "Enter" event, but not in the same call, as it should be seen as a separate object.

Change items - When the Top Menu is navigated, items get selected and unselected. To be able to make a transition from the unselected item into the selected item, both of these objects must get two different events, for example "Hide" and "Show". The same goes for the entry, which should make the transition from the selected item to the unselected one.

Highlight move - This should probably be done with a "Move" event, which must also contain the new position. Then there is the case when the highlight should move over the edge of the Top Menu and enter on the opposite side. Should this be taken care of by the effect definition, or is it better to generate a different event? Such an event could for example be a "MoveWrap" event.

3.3 Architectural Design

The general idea behind the architecture is to see windows as objects on which events can be thrown to trigger visual effects. Objects, events and effects are all identified by arbitrary strings, so scalability is maintained and no unnecessary limitations are introduced.

The architecture is based on a client/server model and should be seen as a "black box" in the system, running in its own process context. Communication with the architecture is carried out through a well-defined proxy interface, allowing clients to register and get a client id, which is then used when loading files describing visual effects and triggering them on graphical elements.


Figure 3.4: System overview with the new architecture in place.


The idea is that every client using the architecture should be able to register XML files defining visual effects and the relations among objects, events and visual effects. Every newly registered file overrides definitions in previously registered files until it is unregistered.

When effects are triggered, an object name and an event name are passed along, which are then used to find the right visual effect to apply. Object-event relations and visual effects are defined and stored separately in the architecture, making it possible to have relations and visual effects defined in separate files.
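As an illustration of this separation, here is a sketch of a relation definition that maps an event on an object to the slideFadeIn effect sketched in Section 3.2.3. The tag names (fxml, include, object, event, effect) are those listed in Chapter 4, while the attribute names and the file name are illustrative assumptions.

    <fxml>
      <!-- Effect definitions can live in a separate, shared file. -->
      <include file="popup-effects.xml"/>
      <!-- When an "Enter" event is triggered on an "InfoPopup" object,
           apply the slideFadeIn effect. -->
      <object name="InfoPopup">
        <event name="Enter">
          <effect name="slideFadeIn"/>
        </event>
      </object>
    </fxml>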

Object-event relations and visual effect definitions can also be shared among clients through a provide/inherit mechanism. This mechanism is controlled through the client interface and makes it possible to set provide and inherit labels. These labels are used when effects are triggered and relations and visual effects are looked up: if no relation or visual effect is found within the client, the inherit label is used to see if some other client provides that label. If the label is provided, the relation or visual effect is searched for in the providing client's definitions.

Where in the system the architecture is located is roughly illustrated in Figure 3.4, which also shows how clients communicate both with the architecture and with the window system.

The architecture can be divided into three parts: the client interface, the FXML parser and the internal data representation, as illustrated in Figure 3.5. These parts are described in Sections 3.3.1, 3.3.2 and 3.3.3 respectively. However, this is a simplified view of the architecture; for a more detailed description of the internal structure, see Section 3.3.4.


Figure 3.5: The three main parts of the architecture.

3.3.1 Client Interface

The client interface towards the Visual Effects Engine consists of seven function calls defined in IVisualEffectsEngine, an IDL interface towards the architecture. These functions are described below, and the full interface definition is included as Appendix B. All special types defined and used with the IVisualEffectsEngine interface are defined in IDL in the file VisualEffectsEngine_types.idl, included as Appendix C.


RegisterClient

This function is used to register a client with the Visual Effects Engine. A client id is returned, which should then be used in all further communication with the Visual Effects Engine.

RVoid RegisterClient(
    [out] TVisualEffectsClientID* pClientID);

UnregisterClient

This function unregisters a client; a valid client id should be passed along with the function call.

RVoid UnregisterClient(
    [in] TVisualEffectsClientID clientID);

RegisterEffectFile

With this function it is possible to load an XML file and retrieve a handle for that registration. The first parameter is a client id, identifying which client the file should belong to. The next parameter is a string pointing out the directory in which the file is located, and the following parameter contains the name of the file. If the registration succeeds, a handle is returned, which can be used to unregister the file later on.

RVoid

RegisterEffectFile(

[in] TVisualEffectsClientID clientID,

[in] TChar* pDir,

[in] TChar* pFile,

[out] TVisualEffectsHandle* pVisualEffectsHandle);

UnregisterEffects

This function is used to unregister effect files previously registered. The first parameter is the client id, identifying the client, followed by the handle of the effect file that should be unregistered.


RVoid

UnregisterEffects(

[in] TVisualEffectsClientID clientID,

[in] TVisualEffectsHandle visualEffectsHandle);

TriggerEffects

This is the most important function towards the Visual Effects Engine, as it is the function that makes things start moving on the screen. It triggers visual effects on windows; which visual effect is carried out depends on the effects registered and the combination of object name and event name passed along with the function call. The event group and event data parameters also affect the result.

RVoid

TriggerEffects(

[in] TVisualEffectsClientID clientID,

[in, size_is(nWindowHandles)] TWindowHandle* pWindowHandles,

[in] FUint8 nWindowHandles,

[in] FUint8 activeWindowIndex,

[in] TChar* pObject,

[in] TChar* pEvent,

[in] TVisualEffectsEventGroup eventGroup,

[in] TVisualEffectsEventData* pEventData);

The first parameter is the client id, identifying the client. Then there is an array of window handles for the windows that should receive the visual effects, and the following parameter contains the number of windows passed. The active window parameter is used when several windows are passed and one of them is special in some way, enabling effects to be defined that treat the active window specially. For example, when all items in a list are passed, the item that has focus could be set as active.

The next two parameters specify the object and event name as arbitrary strings, which are then used to find the right visual effect to apply.

The event group parameter is used to group events together, allowing the Visual Effects Engine to perform the visual effects correctly. The event groups are described below and the definition in IDL is included as Appendix C.

NONE - No particular grouping.

MOVE - A move event, which requires a position.

MAP - A map event, meaning that the object receiving the visual effect should become visible.

MAP ON POSITION - Same as MAP, but with a position where the object should end up.


UNMAP - An event making the object disappear. This will keep the object visible until the visual effect has finished, even if it gets explicitly hidden in the meantime.

UNMAP AND DESTROY - Same as UNMAP; the only difference is that the object will not be destroyed until the visual effects have finished.

The last parameter is event data, which is used when the specified event group needs some extra information. Currently, only a position needs to be passed along with the function call. The position can be defined as either relative or absolute. However, the position is always a position within the parent's local coordinate system.
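As a sketch of how this could look in code (using the variable names from the Quick Start in Section 3.4.1; the object and event names are invented examples), a move event with an absolute target position could be triggered like this:

TVisualEffectsEventData eventData;

/* Absolute target position, within the parent's coordinate system. */
eventData.position.method = VISUAL_EFFECTS_POSITION_METHOD_ABSOLUTE;
eventData.position.x = 20;
eventData.position.y = 40;

result = IVisualEffectsEngine_TriggerEffects(
    pIVisualEffectsEngine,
    visualEffectsClientID,
    &windowHandle,
    1,    /* one window */
    0,    /* which is also the active one */
    L"MyObject",
    L"Move",
    VISUAL_EFFECTS_EVENT_GROUP_MOVE,
    &eventData);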

ProvideLabel

This function helps clients share their relations and effects with other clients by making them available through a label. For example, a client could register a number of effect files and then set the provide label to "providing client", which other clients could then use by setting the inherit label to the same string, using the InheritLabel function.

The first parameter passed is the client id, identifying the client, and the second parameter is the label, which is an arbitrary string.

RVoid

ProvideLabel(

[in] TVisualEffectsClientID clientID,

[in] TChar* pLabel);

This functionality is, however, not fully implemented in the prototype architecture, but the architecture is well prepared for it, so it should not be difficult to get it working.

InheritLabel

This function is used to inherit relations and effects from another client, by specifying a label that has been provided with the ProvideLabel function.

The first parameter passed is the client id, identifying the client, and the next parameter is the label as an arbitrary string.

RVoid

InheritLabel(

[in] TVisualEffectsClientID clientID,

[in] TChar* pLabel);

As with the ProvideLabel function, this functionality is not fully implemented in the prototype architecture.
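A minimal sketch of the intended usage once the functionality is completed; the label string and the two client ids are invented examples:

/* In the providing client, e.g. a theme server: */
result = IVisualEffectsEngine_ProvideLabel(
    pIVisualEffectsEngine,
    providerClientID,
    L"DefaultTheme");

/* In an inheriting client, e.g. an application: */
result = IVisualEffectsEngine_InheritLabel(
    pIVisualEffectsEngine,
    appClientID,
    L"DefaultTheme");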


3.3.2 XML Parser

The XML parser is based on a Simple API for XML (SAX) parser, provided by the XML Toolkit module in the SEMC system. A SAX parser reads XML data as a stream and invokes events through a callback interface when tags are discovered. It is then up to the user to decide what to do with the data.

The parser can parse XML files written in FXML, described in Chapter 4, and store them in the internal data representation, described in Section 3.3.3. As FXML was designed to be easily extended, the same property is equally important in the design of the parser. With the current design, the parser can easily be extended with new attributes and base effects.

3.3.3 Internal Data Representation

Visual effects are triggered on objects in real time, making the time it takes to find the right visual effect an important aspect. Time is also an issue when keyframes are generated before the visual effect can be applied.

As almost all data in the architecture is identified by strings, search trees are used for storing much of the data, making the solution scale when the amount of data increases. However, the worst-case complexity of such a tree is O(n) for a search operation, which means that every node is visited; storing the data in hash tables would perhaps be a better solution.

With FXML it is possible to describe visual effects in a very flexible manner, which has a great impact on how visual effects are stored within the architecture. When the FXML file is parsed, there is no way of knowing anything about the final keyframes that will be passed down to the compositor, as all visual effects become unique for every object. For example, it is possible to specify a relative position on the X-axis of 10%, which means the position is 10% of the width of the parent object plus the current position of the object. It is also not possible to have the FXML files parsed every time a visual effect is to be carried out; it would be far too expensive.

The solution is to have an internal representation of the visual effects, where attribute values are calculated on the fly when visual effects are applied.
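The following is a simplified sketch of this idea; the names and types are ours for illustration and not the engine's actual internals. A keyframe value is stored symbolically and resolved against the target object when the effect is triggered:

/* Sketch: a symbolic keyframe value resolved per object at trigger time. */
typedef enum
{
    VALUE_PIXELS,             /* e.g. x="20px" */
    VALUE_PERCENT_OF_PARENT,  /* e.g. x="10%", relative to the parent width */
    VALUE_LEFT_OF_PARENT      /* e.g. x="LeftOfParent" */
} TValueKind;

typedef struct
{
    TValueKind kind;
    int amount;               /* pixels or percent, depending on kind */
} TSymbolicValue;

static int
ResolveX(const TSymbolicValue* pValue,
         int parentWidth, int objectWidth, int objectX)
{
    switch (pValue->kind)
    {
    case VALUE_PIXELS:
        return pValue->amount;
    case VALUE_PERCENT_OF_PARENT:
        /* 10% means 10% of the parent width plus the current position. */
        return objectX + (parentWidth * pValue->amount) / 100;
    case VALUE_LEFT_OF_PARENT:
        /* Just outside the parent's left edge. */
        return -objectWidth;
    }
    return objectX;
}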

3.3.4 Internal Structure

The internal structure is built up mainly of IDL components, as illustrated in Figure 3.6. Only two of the components are exposed to the clients: the VisualEffectsEngine and the VisualEffectsEngineManager. The VisualEffectsEngine is the main component in the architecture and implements the client interface towards the architecture. It also acts as a separate process in the system with its own context, meaning that all components within it are allocated in its memory space, enabling sharing of data between clients in a controlled manner.

The VisualEffectsEngineManager, which is the other component exposed to the clients, is used to establish connections to the architecture and the VisualEffectsEngine component.


Figure 3.6: UML diagram of the internal IDL structure of the architecture.


This connection becomes a proxy connection, as the VisualEffectsEngine resides in a different context and context switches are required for all communication.

Besides the external manager described above, there is also an internal manager: the VisualEffectsManager component, which is used for creating all internal components in the architecture.

For every new client connection to the VisualEffectsEngine component there is a VisualEffectsClientData component associated with that connection. The VisualEffectsEngine component has a VisualEffectsParser component, which is used to parse FXML files when clients register effects definitions with the architecture. For every FXML file parsed, a VisualEffectsData component is created and stored inside the corresponding client's VisualEffectsClientData component. This VisualEffectsData component contains a VisualEffectsFX component for every effects definition inside the parsed FXML file. However, VisualEffectsFX is an abstract component and the actual components stored are the components inheriting from it. These components, one for every base effect, are VisualEffectsFXPosition, VisualEffectsFXScale and VisualEffectsFXOpacity, plus VisualEffectsFXContainer, which is a container holding other VisualEffectsFX components. The container is used when an effects definition consists of several base effects.

3.4 Client Usage

Using this architecture is quite a simple task. All that is needed is a file in the file system complying with FXML, as described in Chapter 4, and a window handle to a window in the Window System. Then a connection needs to be created to the Visual Effects Engine; after that, the FXML file can be loaded and effects can be triggered on windows. How this is done in C code is explained in Section 3.4.1.

Even though the architecture has a simple interface and is easy to use, this does not necessarily mean that integration with the graphical system is going to be an easy task, especially not with the user interface. However, the user interface is probably the client that needs this architecture the most; therefore, some research has been made on how to integrate the Visual Effects Engine in the user interface, see Section 3.4.2.

3.4.1 Quick Start

This section explains the basic steps it takes for a client to make a connection with the Visual Effects Engine and start triggering visual effects.

All communication with the Visual Effects Engine passes through the IVisualEffectsEngine interface, defined in the file IVisualEffectsEngine.idl. Some required types are defined in VisualEffectsEngine_types.idl, see Appendix B and C.

To utilize the Visual Effects Engine, the first step is to include some header files. The files IVisualEffectsEngineManager.h and CVisualEffectsEngineManager.h are only needed when the proxy connection is created.

#include "CVisualEffectsEngineManager.h"

#include "IVisualEffectsEngineManager.h"

#include "IVisualEffectsEngine.h"

Next, some variables need to be defined, for instance a reference to the Visual Effects Engine and a client id.

RVoid result;

ECM_DECLARE_IPTR(IVisualEffectsEngineManager,

pIVisualEffectsEngineManager);

ECM_DEFINE_IPTR(IVisualEffectsEngine,

pIVisualEffectsEngine);

TVisualEffectsClientID visualEffectsClientID;

TVisualEffectsHandle visualEffectsHandle;

TWindowHandle windowHandle;

Now it is time to create a proxy connection towards the Visual Effects Engine. This is done via a Visual Effects Engine Manager that first needs to be created. After the proxy interface is retrieved, the manager can be destroyed.

result = IShell_CreateInstance(

OSE_GetGlobalShell(),

&CID_CVisualEffectsEngineManager,

&IID_IVisualEffectsEngineManager,

(IRoot **)&pIVisualEffectsEngineManager);

result = IVisualEffectsEngineManager_GetVisualEffectsEngine(

pIVisualEffectsEngineManager,

&pIVisualEffectsEngine);

ECM_RELEASE_IPTR(pIVisualEffectsEngineManager)

The next step is to register the client with the Visual Effects Engine and retrieve a client id. This id is then used in all further communication with the Visual Effects Engine.

result = IVisualEffectsEngine_RegisterClient(

pIVisualEffectsEngine,

&visualEffectsClientID);

Before any effect can be triggered, at least one FXML file needs to be registered. This code registers the file visualeffects.xml located in the Other directory in the phone.


result = IVisualEffectsEngine_RegisterEffectFile(

pIVisualEffectsEngine,

visualEffectsClientID,

L"/tpa/user/Other",

L"visualeffects.xml",

&visualEffectsHandle);

Now it is time to trigger an effect on a window, although the windowHandle should first be initialized and bound to a window; this is, however, outside the scope of this explanation. The call below associates a window with an object name, in this case "MyObject", and then throws the event "MyEvent" on it. It is then up to the Visual Effects Engine to find out which effect should be applied.

result = IVisualEffectsEngine_TriggerEffects(

pIVisualEffectsEngine,

visualEffectsClientID,

&windowHandle,

1,

0,

L"MyObject",

L"MyEvent",

VISUAL_EFFECTS_EVENT_GROUP_NONE,

NULL);
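To complete the example, the registrations should be undone when the client shuts down; a sketch using the same variables as above:

result = IVisualEffectsEngine_UnregisterEffects(
    pIVisualEffectsEngine,
    visualEffectsClientID,
    visualEffectsHandle);

result = IVisualEffectsEngine_UnregisterClient(
    pIVisualEffectsEngine,
    visualEffectsClientID);

ECM_RELEASE_IPTR(pIVisualEffectsEngine)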

3.4.2 User Interface Integration

The user interface is a very large and complex framework with its own objectification mechanism. To take full advantage of the Visual Effects Engine architecture, some modifications need to be made.

Besides modifying the user interface, user interface developers and designers must come to an agreement on which objects in the user interface should be able to have visual effects. They also need to agree on which identifications to give these objects and then determine which events to use on them. This is important because designers want to map objects and events to effects in an FXML file, and programmers must throw the events on the objects.

3.4.3 Examples

This section shows some examples of how visual effects on objects can be realized through FXML, which is explained in Chapter 4.


Pop-up

This example makes a pop-up slide in from above, stay centered for a while and then continue to slide out below. The following three pictures illustrate the visual effect, followed by the XML file defining it.

Figure 3.7: A pop-up sliding in from above and out below.

<?xml version="1.0"?>

<fxml version="1.0">

<object name="PopUp">

<event name="Enter">

<effect name="SlideInFromAbove"/>

</event>

<event name="Leave">

<effect name="SlideOutToBelow"/>

</event>

</object>

<defineEffect name="SlideInFromAbove">

<position>

<positionKeyframe time="0ms" y="AboveParent" />

<positionKeyframe interpolator="EaseOut" time="800ms" y="Destination" />

</position>

</defineEffect>

<defineEffect name="SlideOutToBelow">

<position>

<positionKeyframe time="0ms" y="Source" />

<positionKeyframe time="800ms" y="BelowParent" />

</position>

</defineEffect>

</fxml>

Top Menu

This example makes the Top Menu items move up from behind the highlighted item and then settle at their designated places. The following pictures illustrate the visual effect, followed by the XML file defining it.


Figure 3.8: A Top Menu explode animation.

<?xml version="1.0"?>

<fxml version="1.0">

<object name="TopMenu">

<event name="Enter">

<effect name="ExplodeTopMenuDelux"/>

</event>

</object>

<defineEffect name="ExplodeTopMenuDelux">

<position multiBehavior="BottomToTop" multiDelay="150ms" >

<positionKeyframe time="0ms"

x="ActiveObjectSource" y="ActiveObjectSource"

dx="12px" dy="10px" />

<positionKeyframe time="600ms" x="30%" y="AboveParent" dy="Height" />

<positionKeyframe time="1000ms" x="Destination" y="Destination" />

</position>

<opacity multiBehavior="BottomToTop" multiDelay="150ms" >

<opacityKeyframe time="0ms" level="0%" />

<opacityKeyframe time="200ms" level="100%" />

</opacity>

<scale multiBehavior="BottomToTop" multiDelay="150ms" >

<scaleKeyframe time="0ms" x="50%" y="20%" />

<scaleKeyframe time="600ms" x="120%" y="120%" />

<scaleKeyframe time="1000ms" x="100%" y="100%" />

</scale>

</defineEffect>

</fxml>


Chapter 4

Visual Effects Markup Language

Visual Effects Markup Language (FXML) is a custom markup language used by the Visual Effects Engine to describe how visual effects and transitions should behave. FXML is defined using XML, and a complete DTD definition of the language is included in Appendix D. FXML is meant to be a simple yet powerful markup language, with designers as the target users.

The following is an example of an FXML document; it makes a pop-up slide in from the right on the event "Enter" and slide out to the left on the event "Leave". The events should be thrown by the client at appropriate times: "Enter" when the window is to be mapped and "Leave" when it is to be unmapped.


<?xml version="1.0"?>

<fxml version="1.0">

<include>file.xml</include>

<!-- Speed constants -->

<defineConstants>

<constant name="VeryFast" value="90mms" />

<constant name="Fast" value="70mms" />

<constant name="Slow" value="50mms" />

<constant name="VerySlow" value="30mms" />

</defineConstants>

<!-- TextFeedback -->

<object name="TextFeedback">

<event name="Enter">

<effect name="SlideInFromRight"/>

</event>

<event name="Leave">

<effect name="SlideOutToLeft"/>

</event>

</object>

<!-- Effect Definitions -->

<defineEffect name="SlideInFromRight">

<position>

<positionKeyframe time="0ms" x="RightOfParent" />

<positionKeyframe time="800ms" x="Destination" />

</position>

</defineEffect>

<defineEffect name="SlideOutToLeft">

<position>

<positionKeyframe time="0ms" x="Source" />

<positionKeyframe time="800ms" x="LeftOfParent" />

</position>

</defineEffect>

</fxml>

4.1 XML

Extensible Markup Language (XML)[3] is a subset profile of Standard Generalized Markup Language (SGML, ISO 8879:1986). XML is designed to be easier to process and parse than SGML. SGML is used to define markup languages, e.g. HTML, that structure documents so they can easily be translated to different media. This is useful when documents are to be moved from one system to another without any loss of data.

In HTML every tag, attribute and attribute value is defined to describe something that has to do with a web page. It would be complicated to use HTML to describe something that is not a web page. In XML, on the other hand, it is only possible to determine that a tag is a tag and an attribute is an attribute; there are no predetermined relations between tags in the XML document. The meaning of a tag in XML is set when the application that uses the XML document parses it.

XML has the advantage that almost anything can be described with it. Some examples where XML is used are SMIL and XHTML (used for web pages). To write an application that uses data described with XML, one must first agree on how to describe the data, such as which tags and attributes to use. It is also a good idea to create a Document Type Definition (DTD) to validate the syntax of the XML documents used by the application; this validation is done when a document is parsed. So all that is needed is to agree on an XML structure, and to create a DTD for that structure and a parser.
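As a minimal, generic illustration of such an agreement (this is not FXML, just an example structure with an inline DTD that a validating XML parser can check a document against):

<?xml version="1.0"?>
<!DOCTYPE effects [
  <!ELEMENT effects (effect*) >
  <!ELEMENT effect EMPTY >
  <!ATTLIST effect name CDATA #REQUIRED >
]>
<effects>
  <effect name="FadeIn"/>
</effects>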

4.2 FXML Tags

This section explains the different tags available in FXML and how to use them.

4.2.1 fxml

Starting with the root tag fxml, which requires the version attribute to be set; the only version at this time is 1.0. Everything that is relevant for FXML must be encapsulated by an fxml tag.

<fxml version="1.0">

...

</fxml>

4.2.2 include

The include tag makes it possible to split up the FXML definition into several FXML files, e.g. if one wants to have all constant values, defined using constant tags, in a separate file. The included file has to be a valid FXML file, because when the parser encounters an include tag, it pauses the current file and parses the included file recursively. When done, it continues to parse the first file.

<include>some_fxml_file.fxml</include>


4.2.3 object

All properties of a certain object must be encapsulated by the object tag. Currently, the supported properties for an object are events, through the event tag.

<object name="arbitrary_string">

...

</object>

The object tag has the following attribute:

name - an identifier in the form of an arbitrary string that is matched against the corresponding string given in the TriggerEffects call, explained in Section 3.3.1.

4.2.4 event

The event tag is encapsulated by the object tag and contains information about what to do on a certain event. The currently supported actions are effects, through the effect tag.

<event name="arbitrary_string">

...

</event>

The event tag has the following attribute:

name - same as in object.

4.2.5 effect

The effect tag is encapsulated by event tags and defines which effect to apply. However, the effect name must be defined with a defineEffect tag in an effect file loaded in the Visual Effects Engine.

<effect name="arbitrary_string"/>

The effect tag has the following attribute:

name - an identifier in the form of an arbitrary string that is matched against the name attribute of a defineEffect tag.


4.2.6 defineEffect

This tag defines an effect by combining base effects. The currently supported base effects are position, scale and opacity. There can only be one of each base effect in an effect definition.

<defineEffect name="arbitrary_string">

...

</defineEffect>

The defineEffect tag has the following attribute:

name - an identifier in the form of an arbitrary string which is matched against the name of an effect tag.

4.2.7 position

This is the position base effect tag and it is used to define a position animation within a defineEffect tag. A position base effect is composed of positionKeyframe tags defining keyframe values.

<position multiBehavior="Parallel" multiDelay="1ms"

speed="1mms" path="Straight">

...

</position>

The position tag has the following attributes:

multiBehavior - defines how the effect is carried out when it is triggered on several objects. The currently defined behaviors concern different orderings of the objects:

Parallel - the effect is carried out in parallel; the multiDelay attribute is disregarded.

TopToBottom - the effect is carried out from top to bottom in the given order, with a delay in between defined by the multiDelay attribute.

BottomToTop - the opposite of the TopToBottom behavior.

ActiveFirst - the object marked as active in the TriggerEffects call (explained in Section 3.3.1) receives the effect first, continuing with the objects before and after it in the given order, and so forth, until every object has received the effect. A delay defined by the multiDelay attribute is used in between. This can be used if one wants the highlighted object in a list to be separated in some way from the rest of the objects in the list, e.g. in a cascaded move effect.


Random - the effect is carried out on the objects in random order, with a delay in between defined by the multiDelay attribute.

multiDelay - defines a time in milliseconds (ms) that objects will be delayed when the effect is carried out in an order other than parallel.

speed - defines the velocity of an object when the position effect is carried out. This is only used when a positionKeyframe tag is used without the time attribute; the time value for such a keyframe is then calculated from the distance and the speed (see the sketch after this list). The speed attribute can be specified in either pixels per second (pxs) or millimeters per second (mms). It is also desirable to be able to specify "fuzzy" values like Fast or Slow, but this is not handled in the Visual Effects Engine prototype. If implemented, it would be desirable to define these "fuzzy" values in a configuration file, with one configuration file for each new phone model and its specifications. To make the same visual effects work on a different model, one would then only be required to include a different configuration file.

path - defines how the path should be routed. The currently supported values are:

Straight - is the straight way from one point to another.

Shortest - the shortest way, where it is possible to cross over from one side of the screen and enter on the opposite side. This is not implemented in the Visual Effects Engine prototype.
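As referenced above, the following sketch shows a position effect where the last keyframe omits the time attribute, so its time is calculated from the distance to LeftOfParent and the given speed; the effect name is an invented example:

<defineEffect name="SlideOutLeftConstantSpeed">
  <position speed="50mms" path="Straight">
    <positionKeyframe time="0ms" x="Source" />
    <!-- No time attribute: the duration follows from the
         distance to LeftOfParent and the 50 mm/s speed. -->
    <positionKeyframe x="LeftOfParent" />
  </position>
</defineEffect>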

4.2.8 scale

This is the scale base effect tag and it is used to define a scale animation within a defineEffect tag. The scale base effect is composed of scaleKeyframe tags defining keyframe values.

<scale multiBehavior="Parallel" multiDelay="1ms"

speed="1mms" anchorPoint="Center">

...

</scale>

The scale tag has the following attributes:

multiBehavior - see multiBehavior in position tag.

multiDelay - see multiDelay in position tag.

speed - see speed in position tag.

anchorPoint - defines in which direction the scale effect will be carried out. The square in figure 4.1 represents an object and the dots represent the different anchor points that can be used to define a direction for a scale effect. The anchorPoint attribute can be one of the following:


Left - the left point is fixed and the object is scaled in the direction from the left point.

Right - the right point is fixed and the object is scaled in the direction from the right point.

Top - the top point is fixed and the object is scaled in the direction from the top point.

Bottom - the bottom point is fixed and the object is scaled in the direction from the bottom point.

Center - the center is fixed and the object is scaled in the direction from the center point.

TopLeft - the upper left corner is fixed and the object is scaled in the direction from that point. This is the default value.

TopRight - the upper right corner is fixed and the object is scaled in the direction from that point.

BottomLeft - the lower left corner is fixed and the object is scaled in the direction from that point.

BottomRight - the lower right corner is fixed and the object is scaled in the direction from that point.

Figure 4.1: Different anchor points for the scale base effect.

The anchorPoint functionality is not implemented in the Visual Effects Engine prototype. The thought was to achieve anchor point behavior with the help of a position base effect in combination with a scale base effect. This cannot be guaranteed to work in all cases, because the compositor only supports one animation of a certain property at a time. This means that it is not possible to do a position base effect at the same time as a scale effect with anchorPoint set to something other than TopLeft, so the default and only option is TopLeft.
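For illustration, the intended combination could have looked like the following sketch, which approximates a shrink around the center by pairing the scale with a compensating move. This is our sketch of the idea, not a supported feature, and it assumes that the dx and dy percentages are taken relative to the object's size:

<defineEffect name="ShrinkAroundCenterSketch">
  <scale>
    <scaleKeyframe time="0ms" x="100%" y="100%" />
    <scaleKeyframe time="400ms" x="50%" y="50%" />
  </scale>
  <position>
    <positionKeyframe time="0ms" x="Source" y="Source" />
    <!-- Shrinking to 50% around the center with a top-left anchor
         requires offsetting the position by 25% of the size. -->
    <positionKeyframe time="400ms" x="Source" y="Source" dx="25%" dy="25%" />
  </position>
</defineEffect>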

4.2.9 opacity

This is the opacity base effect tag and it is used to define an opacity animation within a defineEffect tag. The opacity base effect is composed of opacityKeyframe tags defining keyframe values.


<opacity multiBehavior="Parallel" multiDelay="1ms">

...

</opacity>

The opacity tag has the following attributes:

multiBehavior - see multiBehavior in position tag.

multiDelay - see multiDelay in position tag.

4.2.10 positionKeyframe

The positionKeyframe tag defines a keyframe for the position base effect within a position tag.

<positionKeyframe interpolator="EaseOut" time="0ms"

x="Source" y="Source" dx="0%" dy="0%" />

<positionKeyframe interpolator="EaseOut" time="100ms"

x="Destination" y="Destination" dx="0%" dy="0%" />

The positionKeyframe tag has the following attributes:

interpolator - defines in which way the interpolation to the keyframe is carried out. Supported values are:

Linear - this is the default interpolation method and it defines a linear interpolation from the previous keyframe.

EaseOut - defines an eased out interpolation from the previous keyframe.

None - no interpolation is done, which results in no animation from the previous keyframe.

time - defines the time of a certain keyframe in a base effect. This attribute can, however, be left out when the effect can calculate the time using the speed of the effect. The value of the time attribute can only be specified in milliseconds (ms), e.g. "100ms" for 100 milliseconds.

x - defines an absolute position on the X-axis. The supported values are both unit values and predefined fixed and dynamic positions. The supported units are millimeters (mm) and pixels (px), e.g. "10mm" or "20px". The following are the predefined values, which could easily be extended to make FXML more powerful:

Source - represents the current position of the object before the effect; this is also the default position of the first keyframe.


Destination - depending on how the effect is triggered by the client, this can map to the source position or to a designated destination specified by the client.

NearestEdge - sets the position to the nearest edge of the parent.

LeftOfParent - sets the position outside the parent object on the left.

RightOfParent - sets the position outside the parent object on the right.

y - defines an absolute position on the Y-axis in the same way as the x attribute does on the X-axis. However, there are some differences in the predefined values:

Source - same as for the x attribute.

Destination - same as for the x attribute.

NearestEdge - same as for the x attribute.

AboveParent - sets the position above and outside the parent object.

BelowParent - sets the position below and outside the parent object.

dx - defines a relative position on the X-axis from the last known position, given either by the x attribute of the current keyframe or by a previous keyframe. Supported values are unit values in millimeters (mm), pixels (px) and percentage (%), or the predefined value:

Width - adds or subtracts the width of the current object to or from the position; for subtraction, a minus sign (-) is added in front.

dy - defines a relative position on the Y-axis in the same way the dx attribute does on the X-axis. However, there is a difference in the predefined value (see the sketch after this list):

Height - adds or subtracts the height of the current object to or from the position; for subtraction, a minus sign (-) is added in front.
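As referenced above, a short sketch combining a predefined absolute value with a relative offset; the effect name is an invented example:

<defineEffect name="PeekFromBelow">
  <position>
    <!-- Start just below the parent, end at the destination
         nudged up by the object's own height. -->
    <positionKeyframe time="0ms" y="BelowParent" />
    <positionKeyframe time="500ms" y="Destination" dy="-Height" />
  </position>
</defineEffect>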

4.2.11 scaleKeyframe

The scaleKeyframe tag defines a keyframe for the scale base effect within a scale tag.

<scaleKeyframe interpolator="Linear" time="0ms"

x="100%" y="100%"/>

<scaleKeyframe interpolator="Linear" time="100ms"

x="50%" y="50%"/>

The scaleKeyframe tag has the following attributes:

interpolator - see interpolator in positionKeyframe tag.

time - see time in positionKeyframe tag.

x - defines a scale factor in percentage (%) on the X-axis.

y - defines a scale factor in percentage (%) on the Y-axis.


4.2.12 opacityKeyframe

The opacityKeyframe tag defines a keyframe for the opacity base effect within an opacity tag.

<opacityKeyframe interpolator="Linear" time="0ms"

level="100%"/>

<opacityKeyframe interpolator="Linear" time="100ms"

level="0%" />

The opacityKeyframe tag has the following attributes:

interpolator - see interpolator in positionKeyframe tag.

time - see time in positionKeyframe tag.

level - defines the level of opacity in percentage (%), with 0% being transparent and 100% being opaque.

4.2.13 defineConstants

All constant values are defined through the use of the constant tag and must be encapsulated by a defineConstants tag.

<defineConstants>

...

</defineConstants>

This option is not implemented in the Visual Effects Engine prototype, and suggestions on how constants in FXML should be used are appreciated.

4.2.14 constant

This tag binds an arbitrary string to any valid FXML value, which is useful when fuzzy values such as VerySlow, Slow, Fast and VeryFast are to be defined.

<constant name="VeryFast" value="90mms" />

This tag is not implemented in the Visual Effects Engine prototype.
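Since neither defineConstants nor constant is implemented, the following is only a sketch of how a defined constant might be referenced; whether constant names may appear wherever a value is expected is an open design question:

<defineConstants>
  <constant name="VeryFast" value="90mms" />
</defineConstants>

<!-- Hypothetical: the constant name used where a speed value is expected. -->
<defineEffect name="QuickSlide">
  <position speed="VeryFast">
    <positionKeyframe time="0ms" x="Source" />
    <positionKeyframe x="LeftOfParent" />
  </position>
</defineEffect>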


Chapter 5

Result and Conclusion

The goal of the thesis project was to define an architecture that could read visual effects definitions from XML files and then apply them to graphical objects when events are invoked. The main result is that we managed to implement a working prototype, the Visual Effects Engine, where it is possible to define visual effects behaviors in FXML, which is our own language, defined in XML. These behaviors can then be fed into the Visual Effects Engine, which will apply visual effects to graphical objects in the user interface when events arrive. The Visual Effects Engine prototype demonstrates that it is possible to have event-driven visual effects, defined with XML, in a graphical system used in a mobile phone.

However, this does not mean that everything works perfectly; there is still work to be done. For instance, it is fair to say that the interfaces towards the architecture are limited and only solve specific problems. This might be the case, but with the current resources available for controlling visual effects in the system and the requirements stated at the beginning, this is probably as good as it gets.

Performance is always a big issue in computer graphics. It was not stated as a problem at the beginning of the project, but it was still something we worried about. The result here was that performance was very good; compared to static visual effects, we could not make out any difference.

What made things complicated was the requirement of being able to relate several objects to one another, making visual effects behave differently depending on the state of these objects. Both the XML interface and the client interface have suffered from this requirement, and it had a great impact on how these interfaces ended up.

An important requirement on the XML interface was that it should be easy to learn and use, as the target users are designers. It is our opinion that FXML provides a very simple yet powerful way of defining visual effects and transitions; with only a few lines of XML it is possible to define very advanced behaviors. Much of the power comes from all the special attribute values available, such as "LeftOfParent". After using FXML for defining visual effects, we have already found many more special values that would be desirable.


On the question of whether this architecture would improve the development process for visual effects, the answer is that compared with the current static way effects are managed, this is a big breakthrough.

We hope that the work put into this thesis can serve as a reference for other similar solutions that also use XML to define visual effects.

5.1 Limitations

The Visual Effects Engine does not have any support for callbacks that notify clients when visual effects are done. Callbacks were more of a desire than a requirement and were therefore not implemented.

The XML interface towards the Visual Effects Engine does not consider all possible visual effects.

There are only three base effects: position, scale and opacity. The Visual Effects Engine will become more useful as the number of base effects increases.

5.2 Further Work

As the implemented solution is just a prototype, there is still much work to be done. Here follows a list of things that should probably be implemented and evaluated.

– Integration - The architecture should be fully integrated into a mobile user interface.

– Callback - A callback when animations are done would be desirable when dealing with real-time systems. The Visual Effects Engine should be able to notify clients via callback functions when an animation is done.

– Thread safety - Data in the architecture that could potentially be accessed by different clients simultaneously must be mutex protected. This is important when clients exist in different processes.

– More predefines - Add more predefined special values in FXML: positions, sizes and orderings of base effects.

– Path - It is desirable to be able to define a motion path animation in FXML and combine this with some vector logic to be able to perform "tricky" position animations in all directions.

– Keysplines - To define a set of control points in FXML that affect the interpolation of an animation.

– Fixed-point - The internal representation should probably be handled using fixed-point math (a minimal sketch follows this list). This was the intention from the start of the implementation phase, but no existing utilities were found in the development environment, so rather than implementing an unoptimized solution it was left out.
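As referenced in the list, the following is a minimal 16.16 fixed-point sketch of the kind of utilities that were missing; it uses standard C types and is not part of the existing codebase:

#include <stdint.h>

typedef int32_t TFixed;            /* 16 integer bits, 16 fraction bits */
#define FIXED_ONE (1 << 16)
#define INT_TO_FIXED(i) ((TFixed)((i) << 16))
#define FIXED_TO_INT(f) ((TFixed)(f) >> 16)

/* Multiply two 16.16 values; a 64-bit intermediate avoids overflow. */
static TFixed FixedMul(TFixed a, TFixed b)
{
    return (TFixed)(((int64_t)a * b) >> 16);
}

/* Interpolate linearly between two keyframe values, t in [0, FIXED_ONE]. */
static TFixed FixedLerp(TFixed from, TFixed to, TFixed t)
{
    return from + FixedMul(to - from, t);
}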


Bibliography

[1] Sony Ericsson Mobile Communication. Webpage, October 2005. http://www.sonyericsson.com.

[2] The Mobile SVG Company. Webpage, February 2006. http://www.ikivo.com.

[3] World Wide Web Consortium. Webpage, January 2006. http://www.w3.org.

[4] X Composite Extension. Webpage, January 2006. http://www.freedesktop.org/Software/CompositeExt.

[5] Quartz Extreme. Webpage, February 2006. http://www.apple.com/macosx/features/quartzextreme/.

[6] Scalable Vector Graphics. Webpage, January 2006. http://www.w3.org/Graphics/SVG/.

[7] The Khronos Group. Webpage, January 2006. http://www.khronos.org.

[8] Ann Light. Ease of Use is Key to Uptake of Mobile Data Services, suggests Study. Usability News, 1(1):1, November 2005.

[9] Mobile Macromedia Flash and Devices. Webpage, February 2006. http://www.macromedia.com/mobile/.

[10] OpenGL. Webpage, February 2006. http://opengl.org.

[11] Mikael Persson and Karl-Anders Johansson. Opportunities and challenges when 3D accelerating mobile user interfaces. Technical report, November 2005. http://www.ep.liu.se/ecp/016/011/ecp01611.pdf.

[12] BitFlash Mobile SVG Player and SDK. Webpage, February 2006.http://www.bitflash.com/.

[13] Timo Poropudas. Bitboys brings vector graphics for mobile devices. Nordic Wireless Watch, 1(1):1, March 2005.

[14] Thomas Porter and Tom Duff. Compositing Digital Images. Computer Graphics, 18(3):253–259, July 1984.

[15] Kari Pulli. The Rise of Mobile Graphics. Technology In-Depth, 3(1):14–15, 2004.


[16] The Synchronized Multimedia Integration Language (SMIL). Webpage, February 2006. http://www.w3.org/AudioVideo/.

[17] Sun Java 2 Platform, Micro Edition (J2ME). Webpage, February 2006. http://java.sun.com/j2me/.

[18] Microsoft Windows Vista. Webpage, January 2006. http://www.microsoft.com/Windowsvista/.


Appendix A

Acronyms

API Application Programming Interface.

CPU Central Processing Unit.


FXML Visual Effects Markup Language, an XML-based markup language used by the Visual Effects Engine to describe visual effects, see Chapter 4.

GPU Graphics Processing Unit, a computer chip dedicated to graphical computation.

IDL Interface Definition Language.

SAX Simple API for XML, a method to read data from an XML document.

SEMC Sony Ericsson Mobile Communication[1].

SVG Scalable Vector Graphics, see Chapter 2.

SVGT SVG Tiny, is the Tiny profile of SVG, see Chapter 2.

SVGB SVG Basic, is the Basic profile of SVG, see Chapter 2.

W3C World Wide Web Consortium, an international consortium where member organizations, a full-time staff and the public work together to develop Web standards.

XML Extensible Markup Language, see Section 4.1.


Appendix B

IVisualEffectsEngine.idl

/*********************************************************************

* ____ _____ _ *

* / ___| ___ _ __ _ _ | ____|_ __(_) ___ ___ ___ ___ _ __ *

* \___ \ / _ \| ’_ \| | | | | _| | ’__| |/ __/ __/ __|/ _ \| ’_ \ *

* ___) | (_) | | | | |_| | | |___| | | | (__\__ \__ \ (_) | | | | *

* |____/ \___/|_| |_|\__, | |_____|_| |_|\___|___/___/\___/|_| |_| *

* |___/ *

* *

*********************************************************************

* Sony Ericsson Mobile Communications AB, Lund Sweden *

* Copyright 2005 *

* Alexander Klintstrom ([email protected]) *

* Andreas Larsson ([email protected]) *

*********************************************************************/

/**

* @file

*

* Visual Effects Engine interface

*

* @author Alexander Klintstrom ([email protected])

* @author Andreas Larsson ([email protected])

*/

import "OPA_types.idl"

import "IRoot.idl"

import "WindowSystem_types.idl"

import "VisualEffects_types.idl"

[

uuid(2C14FC99-9C9E-415d-AE7C-BAC4F8E08792)

]

/**

*


*/

interface IVisualEffectsEngine : IRoot

{

/**

* Register a client.

* @param [out] pClientID identification of the client.

* @returns Status of the operation

*/

RVoid

RegisterClient(

[out] TVisualEffectsClientID* pClientID);

/**

* Unregister a client.

* @param [in] clientID identification of the client.

* @returns Status of the operation

*/

RVoid

UnregisterClient(

[in] TVisualEffectsClientID clientID);

/**

* Register effects for a client.

* @param [in] clientID identification of the client.

* @param [in] pDir The directory where current file is stored

* @param [in] pFile The name of the file

* @param [out] pVisualEffectsHandle A handle to the registered effects

* @returns Status of the operation

*/

RVoid

RegisterEffectFile(

[in] TVisualEffectsClientID clientID,

[in] TChar* pDir,

[in] TChar* pFile,

[out] TVisualEffectsHandle* pVisualEffectsHandle);

/**

* Unregister effects for a client.

* @param [in] clientID identification of the client.

* @param [in] visualEffectsHandle The handle to the effects

* @returns Status of the operation

*/

RVoid

UnregisterEffects(

[in] TVisualEffectsClientID clientID,

[in] TVisualEffectsHandle visualEffectsHandle);

/**

* Trigger (invoke) effects.

* @param [in] clientID identification of the client.

* @param [in] pWindowHandles An array of window handles.

* @param [in] nWindowHandles Number of window handles in the array

* @param [in] activeWindowIndex The index in the array that is active


* @param [in] pObject Which kind of object (window)

* @param [in] pEvent The event that occurred

* @param [in] eventGroup The event group the event belongs to

* @param [in] pEventData Additional data needed for an effect

* @returns Status of the operation

*/

RVoid

TriggerEffects(

[in] TVisualEffectsClientID clientID,

[in, size_is(nWindowHandles)] TWindowHandle* pWindowHandles,

[in] FUint8 nWindowHandles,

[in] FUint8 activeWindowIndex,

[in] TChar* pObject,

[in] TChar* pEvent,

[in] TVisualEffectsEventGroup eventGroup,

[in] TVisualEffectsEventData* pEventData);

/**

* Provides one client's effects and makes it possible for other

* clients to inherit those.

* @param [in] clientID Identification of the client

* @param [in] pLabel The label on clients effects

*/

RVoid

ProvideLabel(

[in] TVisualEffectsClientID clientID,

[in] TChar* pLabel);

/**

* Inherit one client's effects.

* @param [in] clientID Identification of the client

* @param [in] pLabel The label on clients effects

*/

RVoid

InheritLabel(

[in] TVisualEffectsClientID clientID,

[in] TChar* pLabel);

}


Appendix C

VisualEffectsEngine types.idl

/*********************************************************************

* ____ _____ _ *

* / ___| ___ _ __ _ _ | ____|_ __(_) ___ ___ ___ ___ _ __ *

* \___ \ / _ \| ’_ \| | | | | _| | ’__| |/ __/ __/ __|/ _ \| ’_ \ *

* ___) | (_) | | | | |_| | | |___| | | | (__\__ \__ \ (_) | | | | *

* |____/ \___/|_| |_|\__, | |_____|_| |_|\___|___/___/\___/|_| |_| *

* |___/ *

* *

*********************************************************************

* Sony Ericsson Mobile Communications AB, Lund Sweden *

* Copyright 2005 *

* Alexander Klintstrom ([email protected]) *

* Andreas Larsson ([email protected]) *

*********************************************************************/

/**

* @file

*

* Visual Effects Handler types

*

* See \ref I_VISUALEFFECTSHANDLER_TYPES for detailed description.

*

* @ingroup I_VISUALEFFECTSHANDLER_TYPES

*

* @author Alexander Klintstrom ([email protected])

* @author Andreas Larsson ([email protected])

*/

import "OPA_types.idl"


/**

* @defgroup I_VISUALEFFECTSHANDLER_TYPES VisualEffectsHandler types

*

* Types and defines used in VisualEffectsHandler methods.

*

*/

/**

* @addtogroup I_VISUALEFFECTSHANDLER_TYPES

* @{

*

*/

/**

* Handle to identify a set of effects, registered with RegisterEffectFile.

*/

typedef TUnsigned TVisualEffectsHandle;

/**

* Identifies a client, registered with RegisterClient.

*/

typedef TUnsigned TVisualEffectsClientID;

/**

* Some events need extra data to work, one example is the move event that

* needs a position to be able to move the window. For this to work some kind

* of data must be passed along with the event. Because of the dynamic way of

* defining events, a grouping mechanism is needed to group certain events as

* part of an event group.

*/

typedef enum

{

VISUAL_EFFECTS_EVENT_GROUP_NONE,

VISUAL_EFFECTS_EVENT_GROUP_MOVE,

VISUAL_EFFECTS_EVENT_GROUP_MAP,

VISUAL_EFFECTS_EVENT_GROUP_MAP_ON_POSITION,

VISUAL_EFFECTS_EVENT_GROUP_UNMAP,

VISUAL_EFFECTS_EVENT_GROUP_UNMAP_AND_DESTROY

} TVisualEffectsEventGroup;

/**

* Moves can either be absolute or relative.

*/

typedef enum

{

VISUAL_EFFECTS_POSITION_METHOD_ABSOLUTE,

VISUAL_EFFECTS_POSITION_METHOD_RELATIVE

} TVisualEffectsPositionMethod;

/**

* This is the data structure that is passed along with some events that is

* in need of extra information. The eventGroup indicates what kind of event

* that is handled. Currently the only special event that needs extra data is


* the move event group. The information to the move event group is then

* pointed out by ".data.move".

*/

typedef union {

struct position_s {

TVisualEffectsPositionMethod method;

FSint32 x;

FSint32 y;

} position;

} TVisualEffectsEventData;

/**

* @}

*/


Appendix D

Visual Effects DTD

<?xml version="1.0" encoding="UTF-8"?>

<!--

Visual Effects DTD

Authors: Andreas Larsson <[email protected]>

Alexander Klintström <[email protected]>

This DTD defines the XML interface towards the Visual Effects Engine,

that is a prototype for handling visual effects and transitions on objects

in the user interface.

-->

<!--

The fxml element is the root element in an effect definition XML file,

fxml is short for Visual Effects Markup Language when Visual Effects often

are shorted down to just FX. The fxml element require the version attribute

to be set, the only version at this time is the 1.0 version.

-->

<!ELEMENT fxml (include|object|defineEffect)* >

<!ATTLIST fxml

version (1.0) #REQUIRED

>

<!--

With the include element it is possible to split up the effect file in

several files. When parsing a effect file and the include tag is

encountered, the parser will pause the parsing of the current file and start

parse the included one recursively, when done it will continue parse the

first file.

-->

<!ELEMENT include (#PCDATA) >

<!--


Visual Effects Mappings

-->

<!--

The object element contain properties for a certain object, pointed out

by the name attribute. Current supported properties are events, through

the event element.

-->

<!ELEMENT object (event*) >

<!ATTLIST object

name CDATA #REQUIRED

>

<!--

The event element is located inside an object element and contain

information of what to do on a certain event, the type of event is pointed

by the name attribute. Current supported happenings are effects through the

effect element.

-->

<!ELEMENT event (effect*) >

<!ATTLIST event

name CDATA #REQUIRED

>

<!--

The effect element is located inside an event element and defines which

effect to apply, pointed out by the name attribute. The name attribute can

be an arbitrary string, however it must be defined in a defineEffect element

in an effect file loaded in the Visual Effects Engine.

-->

<!ELEMENT effect EMPTY >

<!ATTLIST effect

name CDATA #REQUIRED

>

<!--

This element defines an effect by a combination of base effects. The name

of the defined effect is pointed out by the name attribute. Current

supported base effects are position, scale and opacity which is also the

name of the corresponding element. Each base effect can only be used once

in a effect definition.

-->

<!ELEMENT defineEffect (position?, scale?, opacity?) >

<!ATTLIST defineEffect

name CDATA #REQUIRED

>

<!--

BASE EFFECTS

The current supported base effects are position, scale and opacity.


They all have these common attributes:

- multiBehavior, defines how the effect would be carried out when the effect

is triggered on several windows. Current defined behaviors

concentrating on different ordering of the objects:

- Parallel, the effect is carried out in parallel; the multiDelay attribute

is disregarded.

- TopToBottom, the effect is carried out from top to bottom in the given

order with a delay in between, defined by the multiDelay attribute.

- BottomToTop, opposite to the TopToBottom behavior.

- ActiveFirst, the object marked as active receives the effect first

continuing with the objects before and after in the order and so forth.

A delay is used in between defined by the multiDelay attribute.

- Random, the effect is carried out on objects in a random order with a

delay in between, defined by the multiDelay attribute.

- multiDelay, defines a time in milliseconds (ms) that objects will be

delayed when the effect is carried out in an order other than parallel.

-->

<!--

This is the position base effect element and is used to define a position
animation within a defineEffect element. The position base effect is built
up by positionKeyframe elements defining keyframe values. The position
element has four attributes, where two of them are common attributes for
all base effects, described above. The two remaining attributes are speed
and path:
- speed, defines the velocity of an object when the position effect is
carried out. This is only used when a positionKeyframe element is used
without the time attribute; the time value for such a keyframe is then
calculated using the distance and the speed. The speed attribute can be
specified in either pixels per second (pxs) or millimeters per second
(mms). It is also desirable to be able to specify fuzzy values like Fast
or Slow; however, this is not handled in the Visual Effects Engine
prototype.
- path, defines how the path is routed. The currently supported values are
Straight and Shortest. Straight is the direct path from one point to
another, and Shortest is the shortest path, where it is possible to exit
on one side of the screen and enter on the opposite side.

-->

<!ELEMENT position (positionKeyframe+) >

<!ATTLIST position

multiBehavior (Parallel|TopToBottom|BottomToTop|ActiveFirst|Random) "Parallel"

multiDelay CDATA #IMPLIED

speed CDATA #IMPLIED

path (Straight|Shortest) "Straight"

>
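<!--
A sketch of a slide-in effect using the position base effect; the effect
name and values are hypothetical. The first keyframe places the object
outside its parent on the left without animation; the second keyframe
leaves out the time attribute, so its time is calculated from the speed
attribute:

<defineEffect name="SlideIn">
  <position speed="300pxs" path="Straight">
    <positionKeyframe time="0ms" interpolator="None" x="LeftOfParent" y="Source"/>
    <positionKeyframe interpolator="EaseOut" x="Destination" y="Destination"/>
  </position>
</defineEffect>
-->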

<!--

This is the scale base effect element and is used to define a scale
animation within a defineEffect element. The scale base effect is built up
by scaleKeyframe elements defining keyframe values. The scale element has
four attributes, where two of them are common attributes for all base
effects, described above. The two remaining attributes are speed and
anchorPoint:
- speed, defines the speed used when scaling an object; the distance is
calculated as the difference in size between two keyframes. This is only
used when a scaleKeyframe element is used without the time attribute; the
time value for such a keyframe is then calculated using the distance and
the speed. The speed attribute can be specified in either pixels per
second (pxs) or millimeters per second (mms). It is also desirable to be
able to specify fuzzy values like Fast or Slow; however, this is not
handled in the Visual Effects Engine prototype.
- anchorPoint, defines in which direction the scale effect is carried out.
The default is TopLeft, which represents the top left corner of the
object; this is also the only behavior supported in the Visual Effects
Engine prototype. The limitation exists because, with the compositor, a
scale must be combined with a position effect to achieve any other anchor
point, and the compositor only supports one animation of a certain
property at a time. It is therefore not possible to run a position base
effect at the same time as a scale effect with anchorPoint set to anything
other than TopLeft.

-->

<!ELEMENT scale (scaleKeyframe+) >

<!ATTLIST scale

multiBehavior (Parallel|TopToBottom|BottomToTop|ActiveFirst|Random) "Parallel"

multiDelay CDATA #IMPLIED

speed CDATA #IMPLIED

anchorPoint (Left|Right|Top|Bottom|Center|TopLeft|TopRight|BottomLeft|BottomRight) "TopLeft"

>
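<!--
A sketch of a scale effect that shrinks an object to half its size; the
effect name and values are hypothetical, and the keyframe values are
assumed to be scale factors. With anchorPoint set to TopLeft, the top left
corner of the object stays fixed while the object shrinks:

<defineEffect name="Shrink">
  <scale anchorPoint="TopLeft">
    <scaleKeyframe time="0ms" x="1.0" y="1.0"/>
    <scaleKeyframe time="250ms" interpolator="EaseOut" x="0.5" y="0.5"/>
  </scale>
</defineEffect>
-->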

<!--

This is the opacity base effect element and is used to define an opacity
animation within a defineEffect element. The opacity base effect is built
up by opacityKeyframe elements defining keyframe values. The opacity
element has two attributes, both of which are common attributes for all
base effects, described above.

-->

<!ELEMENT opacity (opacityKeyframe+) >

<!ATTLIST opacity

multiBehavior (Parallel|TopToBottom|BottomToTop|ActiveFirst|Random) "Parallel"

multiDelay CDATA #IMPLIED

>
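<!--
A sketch of a fade-out effect that, when triggered on several objects, is
carried out from top to bottom with a delay between the objects. The
effect name and values are hypothetical; the multiDelay value is assumed
to use the same millisecond notation as the time attribute:

<defineEffect name="CascadeFadeOut">
  <opacity multiBehavior="TopToBottom" multiDelay="50ms">
    <opacityKeyframe time="0ms" level="1"/>
    <opacityKeyframe time="200ms" level="0"/>
  </opacity>
</defineEffect>
-->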

<!--

BASE EFFECT KEYFRAMES
The common attributes in all keyframe elements are the interpolator and
the time attributes:
- interpolator, defines in which way the interpolation to the keyframe is
carried out. Supported values are:
- Linear, the default interpolation method, defines a linear
interpolation from the previous keyframe.
- EaseOut, defines an eased-out interpolation from the previous keyframe.
- None, no interpolation is done, which results in no animation from the
previous keyframe.
- time, defines the time of a certain keyframe in a base effect. This
attribute can, however, be left out in the position and scale base
effects, where the time can be calculated using the speed of the effect.
The value of the time attribute can only be specified in milliseconds
(ms), e.g. "100ms" for 100 milliseconds.

-->

<!--

The positionKeyframe element defines a keyframe for the position base
effect within a position element. The positionKeyframe element has six
attributes, where two of them are common attributes for all keyframes,
described above. The other four attributes are:
- x, defines an absolute position on the X-axis. The supported values are
both unit values and predefined fixed and dynamic positions. The supported
units are millimeters (mm) and pixels (px), e.g. "10mm" or "20px". The
predefined values are:
- Source, represents the current position of the object before the
effect; this is also the default position of the first keyframe.
- Destination, depending on how the effect is triggered by the client,
this can map to the source position or to a designated destination
specified by the client.
- LeftOfParent, sets the position outside the parent object on the left.
- RightOfParent, sets the position outside the parent object on the right.
- y, defines an absolute position on the Y-axis in the same way as the x
attribute does on the X-axis. However, there are some differences in the
predefined values:
- Source, same as for the x attribute.
- Destination, same as for the x attribute.
- AboveParent, sets the position above and outside the parent object.
- BelowParent, sets the position below and outside the parent object.
- dx, defines a relative position on the X-axis from the last known
position, given either by the x attribute of the current keyframe or by a
previous keyframe. Supported values are both unit values, such as
millimeters (mm), pixels (px) and percentages (%), and fixed values:
- Width, adds or subtracts the width of the current object to or from
the position; for subtraction a minus sign (-) is prepended.
- dy, defines a relative position on the Y-axis in the same way as the dx
attribute does on the X-axis. However, there is a difference in the fixed
values:
- Height, adds or subtracts the height of the current object to or from
the position; for subtraction a minus sign (-) is prepended.

-->

<!ELEMENT positionKeyframe EMPTY >

<!ATTLIST positionKeyframe

interpolator (Linear|EaseOut|None) "Linear"

time CDATA #IMPLIED

x CDATA #IMPLIED

y CDATA #IMPLIED

dx CDATA #IMPLIED

dy CDATA #IMPLIED

>
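<!--
A sketch of a keyframe sequence using relative positioning; the values are
hypothetical. The first keyframe jumps to the source position; the second
keyframe moves the object to the right by its own width, with the time
calculated from the speed attribute:

<position speed="200pxs">
  <positionKeyframe time="0ms" interpolator="None" x="Source" y="Source"/>
  <positionKeyframe dx="Width"/>
</position>
-->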

<!--
The scaleKeyframe element defines a keyframe for the scale base effect
within a scale element. The x and y attributes define the scale values of
the object on the X-axis and the Y-axis at the keyframe. The interpolator
and time attributes are the common keyframe attributes described above;
the time attribute can be left out when the speed attribute of the scale
element is used.
-->

<!ELEMENT scaleKeyframe EMPTY >

<!ATTLIST scaleKeyframe

interpolator (Linear|EaseOut|None) "Linear"

time CDATA #IMPLIED

x CDATA #REQUIRED

y CDATA #REQUIRED

>

<!--
The opacityKeyframe element defines a keyframe for the opacity base effect
within an opacity element. The level attribute defines the opacity level
of the object at the keyframe. The interpolator and time attributes are
the common keyframe attributes described above.
-->

<!ELEMENT opacityKeyframe EMPTY >

<!ATTLIST opacityKeyframe

interpolator (Linear|EaseOut|None) "Linear"

time CDATA #REQUIRED

level CDATA #REQUIRED

>

<!--

Unsupported Elements
These elements are not supported by the Visual Effects Engine prototype;
however, they are an important feature for a complete solution and are
therefore included in this DTD as a basis for discussion.

-->

<!--

This element is used to define constants that can be used in the Visual
Effects Engine when effects are triggered. This element is NOT used in the
Visual Effects Engine prototype, although it is an important feature in a
final solution. The intention is that it should be possible to assign
arbitrary values to arbitrary constants through the constant element.

-->

<!ELEMENT defineConstants (constant+) >

<!--

This element assigns a value to a constant; both the value and the
constant name are arbitrary strings. This element must be located inside a
defineConstants element. This element is NOT handled by the Visual Effects
Engine prototype.

-->

<!ELEMENT constant EMPTY >

<!ATTLIST constant

name CDATA #REQUIRED

value CDATA #REQUIRED

>
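<!--
A sketch of how the defineConstants element is intended to be used; the
constant names and values are hypothetical:

<defineConstants>
  <constant name="DefaultFadeTime" value="200ms"/>
  <constant name="DefaultSpeed" value="300pxs"/>
</defineConstants>
-->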

