
Virtual Reality for Lighting Simulation in Events

João Cintra Torres Reis Araújo
joao.reis.araujo@ist.utl.pt

Instituto Superior Técnico, Lisboa, Portugal

May 2014

Abstract

Lighting in events such as concerts, theatres and nightclubs is progressing rapidly. The technology involved is what enables highly intense and powerful shows to stand out. To ensure great performances, technicians now use light simulation tools to design, program and preview their shows beforehand. Recent applications consist of elaborate interfaces with complex tools and render engines that provide rich visual feedback in advance, reducing production costs and stress prior to an event. However, these interfaces tend to be overwhelmingly sophisticated, requiring considerable learning effort. Also, despite the advancements on the visualization side, there is a lack of immersive 3D interaction capabilities, which could provide a far more realistic user experience.

This work tackles that challenge. We propose an immersive light design and pre-visualization interface, aiming to increase preview realism and to suit people with no experience in stage design. Our prototype, VRLight, couples a head-mounted display with a gesture-based interface for visualization and real-time interactive light simulation. The design and control tasks are split in order to increase creativity and focus on both sides. In the immersive environment, the director performs the design and pre-visualization routines, while complex control is carried out externally by the technician, using any light console of his or her preference. To validate this solution, a group of non-expert users was involved in a set of tests. Results showed that users with no prior knowledge could easily perform stage design tasks.

Keywords: VRLight, virtual reality, immersive visualization, stage lighting design, events.

1. Introduction

Light simulation is becoming a necessity in numerous industries. Fields such as architecture or product design now demand rigorous testing of the illumination in their products, where lighting designers' mistakes are no longer tolerated. Event lighting is no exception. In fact, concerts, theaters, festivals and nightclubs require hundreds of light fixtures to provide the intense shows we see nowadays.

In event lighting, the existing software to pre-program and pre-visualize lighting behavior consists of powerful rendering applications with an abundant set of tools to fulfil the designers' needs. However, these tools have complicated user interfaces that demand a massive learning effort and time-consuming tasks. Furthermore, new interface possibilities such as immersive 3D environments and gesture-based devices have yet to be tested.

There are three main concerns in event lighting production: stage design, light pre-visualization and control. The existing software helps designers achieve their goals, but in a very complex and time-consuming way. All the steps involved require a lot of interface knowledge, and the desirable level of abstraction between light programming and the creative design of the show is missing, which is somewhat demoralizing. On the other hand, once the light programming stage is complete, pre-visualization tools do provide great help, but still as flat 3D images on 2D screens.

This project proposes an immersive virtual reality (VR) solution for stage lighting simulation, using a head-mounted display (HMD), which enables an insider point of view of the simulation, never before tested in stage design. Together with gesture-based devices for interaction, this approach aims to deliver a more intuitive and realistic preview, reducing the learning effort with a simpler interface, in an environment with free movement capabilities for the best possible preview of the light show.

Throughout the following sections we present an overview of the state of the art in lighting simulation applications, light control, user interfaces for lighting and music, and immersive visualization and interaction technology. We follow with a detailed description of the proposed solution. Then, a non-expert user validation is presented, using data collected from test sessions. Finally, we present an overall discussion of our work, delineating conclusions and introducing perspectives for future work.

2. Related Work

Lighting simulation was developed to fulfil the need to preview the effect of light in several environments. Having an idea of the resulting illumination in certain industries, such as architecture or product design, made professionals more aware and prevented lighting issues in the design process [1], avoiding the expenses involved in lighting design mistakes.

To correctly understand stage design pre-visualization and what is missing in visualization and interaction capabilities in this field, it is important to cover all lighting-related topics, such as light control, existing lighting simulation applications and user interfaces, along with the immersive solutions yet to be explored.

2.1. Light control

Light control developments are growing fast. Nowadays, illumination control is provided in many different ways: regular light switches, dimmers, sensors, timers and so forth. Studies into new forms of light interaction [2] and automation are increasingly frequent in several industries, e.g. hotels [3]. With the appearance of new types of lighting and controllers, researchers are now investigating new possibilities in light interaction. Magielse and Offermans [4] studied new ways of giving freedom of control to the user in a comprehensive manner, using tangible and multi-touch interfaces. Event lighting, however, is by far the most complex area of light control, and the endless controllable parameters of light fixtures require a standard approach, based on the DMX-512 (DMX) communication protocol. DMX is a standard that describes a method of digital data transmission between controllers and lighting equipment and accessories.

Light consoles are the most common controllers available nowadays, and DMX software solutions are now taking over in some cases. These consoles can command dimmers, which change the intensity of the lights. Modern consoles can also control intelligent lighting (lights that can move, change colors and project gobo patterns; a gobo is a pattern inserted in the focal plane of a luminaire to project an image), fog machines, hazers, etc. In event pre-visualization, the existing tools can receive DMX input to allow the use of the final consoles in the simulation control.
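The DMX-512 data model that these consoles transmit can be illustrated with a minimal sketch. This is not code from any real DMX driver or from the prototype; `DMXUniverse` and its methods are assumed names, and only the protocol facts (512 channels, 8-bit levels, a leading start code) come from the standard.

```python
# Illustrative sketch of the DMX-512 data model (assumed names, no real driver):
# one universe holds 512 channels, each an 8-bit level in 0-255.

class DMXUniverse:
    """Minimal in-memory model of one DMX-512 universe."""

    SIZE = 512

    def __init__(self):
        self.channels = bytearray(self.SIZE)  # all levels start at 0 (blackout)

    def set_level(self, channel, value):
        # DMX channels are conventionally numbered 1..512.
        if not 1 <= channel <= self.SIZE:
            raise ValueError("DMX channel must be in 1..512")
        if not 0 <= value <= 255:
            raise ValueError("DMX level must be in 0..255")
        self.channels[channel - 1] = value

    def frame(self):
        # A real controller prepends a start code (0x00 for dimmer data)
        # and streams this frame repeatedly over the wire.
        return bytes([0x00]) + bytes(self.channels)


universe = DMXUniverse()
universe.set_level(1, 255)    # e.g. full intensity on a dimmer patched to channel 1
print(len(universe.frame()))  # 513 bytes: start code + 512 channel levels
```

A console or a pre-visualization tool such as VRLight simply interprets the same stream of channel levels, which is why hardware and software controllers are interchangeable.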

2.2. Lighting simulation applications

Lighting simulation is common in industries such as architecture or product design. These environments, however, are easier to preview, since they mainly require static pre-rendered graphics. Shikder [5] studied the performance and realism of four lighting simulation tools for ambient light in static environments: DIALux, Relux, AGI32 and Radiance. In spite of the realistic results, real-time light rendering is not possible with the software mentioned.

In stage lighting pre-visualization, pre-rendered graphics are not an option, due to the endless illumination possibilities of the equipment involved: colors, intensities, directions, effects and so on. Previewing the effect of lighting on a particular event, with real-time capabilities, is now possible with specialized software; Figure 1 illustrates a music concert pre-visualization and final show. Light designers now have the tools to pre-program a show before arriving at the venue, a process that can reduce production costs, on-site time and stress, while nurturing creativity.

Figure 1: U2 360 World Tour pre-visualization and final show combined.

2.3. Virtual lighting applications for event simulation

When analysing the main existing tools for pre-visualization, three stand out from the rest: ESP Vision, WYSIWYG and grandMA 3D. These applications provide tools from stage design to real-time lighting pre-visualization. The concert in Figure 1 was developed in WYSIWYG, using the internal computer-aided design (CAD) tools for the whole structure, and previewed in real time with a real console sending DMX input, after adding all light fixtures and mapping all the channels to the console.

This type of software can design a show from scratch and solve almost any challenge in today's light production, with powerful realism settings. However, the skills required to work with these interfaces, along with the time needed to achieve good results, make the simulation stage an intimidating process that is sometimes skipped altogether.

2.4. User interfaces for lighting and music in events

User interfaces for lighting control started as hardware only, with the consoles introduced earlier in this section. Recently, new software tools have come to support, or even replace, the lighting desks, by providing new interfaces and far more powerful ways to interact with lights, such as multi-touch controllers. Most hardware lighting consoles have been emulated, as is the case of the grandMA consoles, one of the world's top solutions, now with a virtual alternative called grandMA onPC. Multi-touch light consoles were also included in SmithsonMartin's Emulator (http://www.smithsonmartin.com/), the only touch-screen program that can merge the software to be controlled with the touch interface, making a program feel as if it were natively designed for multi-touch use. In related businesses such as music or VJing, multi-touch solutions are also emerging [6, 7], although some studies reveal higher levels of performance with tangible controls [8, 9], e.g. faders and buttons.

In spite of all the new control and interface options, DMX is still the standard communication protocol for lights. The fact that every pre-visualization package accepts DMX confirms that, after mastering a specific console, technicians tend to trust only their own equipment.

2.5. Immersive visualization and interaction

With powerful realism settings, endless light control solutions and guaranteed DMX support in every pre-visualization package, immersive visualization and interaction approaches are still missing: these tools still rely on regular screens and WIMP (Windows, Icons, Menus, Pointer) interaction. Immersive VR and gesture-based devices are used in several businesses, but only a few have been tested in stage lighting pre-visualization.

Visualization in VR environments can be broadly divided into two types: non-immersive and immersive. Immersive VR is based on the use of HMDs [10] or projection rooms, e.g. CAVEs [11], while non-immersive VR is based on the use of monitors. The notion of immersion rests on the idea of the user being inside the environment. HMDs can provide this insider feeling, which delivers an even more realistic perception of a simulation, and they have never been tested in stage lighting pre-visualization.

To support the immersion, post-WIMP interfaces such as gesture-based devices and hands-free interaction provide new ways to interact in virtual environments. Gesture-based devices are controllers that act as an extension of the body, so that when gestures are performed, some of their motion can be conveniently captured by software. Mouse gestures are the most well-known example. Devices like the Nintendo Wiimote, Sony PS Move or SpacePoint Fusion, however, use gyroscopes and accelerometers to allow a straightforward direct mapping between device movements and rotations and the corresponding effects in three-dimensional space. With hands-free interaction, users do not depend on a hardware extension of their body to navigate or interact in the environment. Using depth sensors, infrared cameras or microphones, devices like the Microsoft Kinect, for full-body tracking, or the Leap Motion, for the hands, can provide new degrees of freedom to the user.

The Leap Motion's capabilities were already tested in a stage lighting environment by Apostolellis et al. [12], in an attempt to prove that it would outperform the mouse for the integral tasks of positioning and rotating light fixtures. However, the study did not support the hypothesis, with the mouse performing significantly better, both in completion time and in angular and position errors.

In conclusion, there are powerful lighting simulation tools for event pre-visualization, but the learning curve is overwhelming, mostly due to interface complexity. Great interfaces are emerging for light control, such as multi-touch devices, but the main concern in pre-visualization is to accept any hardware or software console using the DMX communication protocol; with this option, every event light controller can perform in the preview stage. Finally, visualization still relies on regular screens and WIMP interaction, leaving immersive visualization and interaction yet to be explored.

3. VRLight

VRLight is a prototype developed from the investigation of existing lighting pre-visualization tools, immersive visualization devices and gesture-based interaction approaches. This virtual reality solution for lighting simulation in events combines all these new technologies, its two main highlights being simplicity and interaction.

3.1. Conceptual description

The prototype separates the artistic task from light control, allowing the director to focus only on choosing, placing and aiming light fixtures in the immersive environment. The entire light control backup is provided by a light technician, working with any light controller device of his or her preference. Figure 2 shows a concept overview of the work process.

Using a head-mounted display, the designer gets an immersive view of the virtual environment where all the action takes place. With a pre-modeled scene ready, e.g. a theater or a club, the user handles a gesture-based device for movement and light editing. By communicating with the light technician, who follows the process on a separate screen, the director can request any light parameter change to obtain real-time feedback on the setup's behavior, and can change fixtures or aim lights in different directions, all in real time and with free movement in the scenery. This unique approach lets the user intuitively experience the light show from any location: the crowd's perspective, the musicians' or any other desired view angle.

Figure 2: Conceptual overview.

3.2. Architecture description

VRLight was developed in the Unity3D game engine, thanks to its powerful 3D environment for modeling and its script-based logic for all operations, which allowed a great level of abstraction and powerful real-time rendering. The application runs on a single computer and connects to an input module, for interaction and light control, and an output module, for visualization on two different devices (Figure 3). All light fixture resources are stored in a data folder containing the light modules and their behaviors. A config file is used for both loading and saving the lighting rig for further work; this file keeps the fixture list, DMX channels and light orientations in the environment.

The input module is divided into two components, interaction and light control, carried out by a light designer (the director) and a light technician, respectively. To interact in the environment (light editing menus and user movement), the designer uses the SpacePoint Fusion. This gesture-based device was chosen thanks to its very stable gyroscope, which allows adding a cursor to the 3D environment using the pitch (X-axis), yaw (Y-axis) and roll (Z-axis) values. As far as light control is concerned, VRLight is ready to connect to any controller speaking the DMX communication protocol: from simple DMX software running on the same machine to complex light consoles with a DMX-to-USB device or a wireless connection to the computer, the application can read any desired input. DMX and the detailed connection to external controllers are explained in section 3.4.

Figure 3: Architecture overview.

The output module provides visualization platforms for the designer and the technician. It is composed of the Oculus Rift HMD, for immersive visualization, and an external monitor, for the light technician to get feedback from the simulation. The headset provides an insider view of the scene, with the immersive feeling allowing head movement in any direction and around any axis. The monitor displays a variety of views of the scene, allowing the technician to follow the preview from the most suitable angle.

3.3. Immersive visualization and interaction

VRLight provides a unique experience when it comes to lighting pre-visualization. Immersion in a modeled scenery, previewing the light with a VR headset, lets light designers step inside their venue, test the lights and ensure a perfect performance, as if they had been there before.

Visualization is achieved with the Oculus Rift headset, connected to the core using the Unity3D plug-in from the Oculus SDK. Because of the danger of hitting something while wearing this device, it is not feasible to walk around the room with it on, since an HMD shows nothing but the virtual environment. For this reason, the light designer interacts while sitting next to the light technician. Although this may seem to limit movement inside the scene, the user has two different movement options when using the SpacePoint gyroscope, explained in section 3.8.

Interaction in the immersive environment involves user movement, light fixture selection and aiming, e.g. selecting and aiming different spotlights at the members of a rock band and previewing the results from several perspectives. The SpacePoint is used as a pointer controlling a cursor drawn in the scene for the three tasks, with a different cursor model for each task, explained in section 3.7. The cursor position is obtained by shooting a raycast from the user's position in the direction of the SpacePoint's forward vector; the cursor is drawn at the hit point. By default, the device's zero orientation points North, meaning that if the user is sitting facing South, the cursor still points North when the program starts. To correct this issue, a calibration is automatically made by adding the needed rotation to the SpacePoint object every time a cursor is needed.
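The cursor placement and heading calibration can be sketched as follows. This is not the prototype's Unity code: the function names are assumptions, and the raycast is simplified to an intersection with a flat floor plane, which is enough to show the geometry.

```python
# Illustrative sketch (assumed names, simplified geometry): place a cursor where
# a ray from the user's position, along the device's forward vector, hits the
# scene, after compensating for the SpacePoint's magnetic-North zero heading.
import math

def forward_vector(pitch, yaw):
    """Unit forward vector from pitch/yaw in radians (roll does not affect aim)."""
    return (math.cos(pitch) * math.sin(yaw),
            -math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

def calibrated_yaw(device_yaw, seat_heading):
    # The device reports yaw relative to North; subtracting the user's seated
    # heading makes yaw = 0 mean "straight ahead", whichever way the user faces.
    return device_yaw - seat_heading

def cursor_on_floor(user_pos, pitch, yaw, seat_heading):
    """Intersect the aiming ray with the floor plane y = 0 (simplified raycast)."""
    dx, dy, dz = forward_vector(pitch, calibrated_yaw(yaw, seat_heading))
    if dy >= 0:          # pointing level or upward: no floor hit
        return None
    t = -user_pos[1] / dy
    return (user_pos[0] + t * dx, 0.0, user_pos[2] + t * dz)

# Eye height 1.7 m, facing South (heading = pi), pointing 45 degrees down:
print(cursor_on_floor((0.0, 1.7, 0.0), math.pi / 4, math.pi, math.pi))
```

In the actual application the ray is tested against the whole scene rather than a single plane, but the heading correction works the same way.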

3.4. Connection to external light controllers

Light control in show business is the process of changing the light behavior depending on several factors, such as the course of a theater scene or the accompaniment of music. Given the thousands of hardware and software options available for light control nowadays, which may depend on the light technician's preference or the scenery's complexity, our prototype soon showed the need to accept the universal communication system for light control: DMX, the standard protocol in stage lighting and effects.

Nowadays, DMX can be used to control both real and virtual lighting rigs (Figure 4), which has opened up new possibilities for software applications. Virtual consoles are now an option for light control, and the design and preview tasks can be supported by the very hardware or software that will later be used in the shows.

Before focusing on light controllers and the stage design process, it is important to understand how DMX works in pro lighting. A light fixture contains one or more controllable parameters, each listening to a different, but consecutive, DMX channel. The only thing that can be set up on the light equipment is the first channel to read from, which will control the first light function; the fixture then assigns the following channels to the remaining parameters. On a light console, typically, the leftmost channel is the zero channel.
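The start-address scheme described above can be sketched in a few lines. The fixture, its parameter names and `patch_fixture` are hypothetical examples, not part of the prototype; only the rule (consecutive channels from a configurable start address) comes from the text.

```python
# Illustrative sketch of DMX start-address patching (assumed names): a fixture
# occupies a run of consecutive channels starting at its configured address.

def patch_fixture(start_channel, parameters):
    """Map each fixture parameter to its absolute DMX channel."""
    return {name: start_channel + offset
            for offset, name in enumerate(parameters)}

# A hypothetical moving head with four controllable parameters, patched at 10:
mac = patch_fixture(10, ["pan", "tilt", "color", "intensity"])
print(mac)  # pan -> 10, tilt -> 11, color -> 12, intensity -> 13

first, last = min(mac.values()), max(mac.values())
print(f"channels {first}-{last}")  # the kind of range VRLight's menu label shows
```

This first/last range is exactly what the light editing menu displays, so the user always knows how many channels a fixture occupies.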

In VRLight, when adding or editing a light fixture, the user can also select the first channel to read from. In the light editing menu (explained further in section 3.7), in spite of choosing only one channel, the label output always shows the first and last channels, to inform how many parameters control each fixture. This is one of the most convenient features for stage light design, since the usual stress prior to an event is reduced by setting up the entire DMX mapping before the show.

3.5. Real-time light control

During light design, the ability to make light changes and obtain feedback is a big advantage. This feature is even more interesting when design and control happen at the same time. With real-time lighting preview, the director can confirm the final behavior while selecting a fixture's position or aiming a spotlight.

Figure 4: DMX communication.

In VRLight, real-time light control is achieved by reading external DMX input into the scene. Lights inserted in the environment automatically fetch data from the DMXReceiver. The script stays active during the whole interaction, ensuring that even in light editing mode the light technician can send any DMX input to the scene, if asked by the director. With this ability, the design can carry on without regular stops to preview lighting behaviors in the current positions or settings; light editing and preview all run at the same time, with no separate processes.

3.6. Virtual stage environment

VRLight works with pre-modeled sceneries (theater, TV set, nightclub, bar, etc.) and provides a variety of stage light fixtures to be installed in the virtual venue. For prototyping purposes, the stage model was designed to be simple and intuitive for the user, without disregarding realism. It consists of an open-air structure with three central bars (metal cylinders) for light fixtures, representing the usual stage lighting truss. The stage model is illustrated in Figure 5.

On the stage floor, a rock band stands on the wooden texture, serving as an example of a show demanding appropriate lighting. The initial environment should contain this and every asset of the final show, so that, with all the models placed, lighting is the only concern for the light designer. To complete the environment, other common elements were added to provide the most realistic feeling. The crowd facing the stage provides light feedback, such as how the light illuminates the audience or how the light show would look from the center of the crowd. The model is finally enriched with typical elements such as speaker arrays, stage monitor speakers and crowd barriers. All these assets diffuse and/or reflect the light, making them essential to a successful design.

VRLight's pre-modeled stage allows light fixtures in nine predefined positions, called slots. A slot is the parent object of a light fixture. When new equipment is selected in the light editing menu, a new instance of the fixture object is added as a child of the slot, and the slot stores all related options, such as the light editing menu and the data for the corresponding fixture. In this prototype, each of the structure's central bars contains three slots. These slots are the starting point for light design in the prototype.

Figure 5: VRLight's stage model.

3.7. Light design

Stage lighting has endless fixture options nowadays. From light projectors to lasers, with single behaviors or multiple effects, there is a solution for almost any design need. Our prototype contains a short but representative group of light models ready to be used, yet, given the countless options, it is scalable, accepting new fixtures through a very easy process: adding the 3D model, materials, behavior script and textures to the resources folder automatically makes the new fixture appear in VRLight's light editing menu list.

Five light fixtures were chosen to represent the main lighting solutions available nowadays, labeled in Figure 6. They cover light effects like flat light beams, strobing lights, lasers and shaped or moving beams. Regarding the visual feedback for each luminaire, different approaches were taken, depending on the unit. Fixtures with defined light beams (Par LED, Gobo Projector and Mac 250) were developed using volumetric light shaders to create the beam effect. An animated texture adds the smoke/fog animation, which helps make the beam visible; changing at a constant speed, the texture creates a wind effect on the smoke hit by the beam, adding realism to the visualization. Color, strobing and intensity effects were addressed by accessing the fixture's light materials and changing them according to the DMX value read from the DMXReceiver script.

Virtual fixtures have an attached behavior script which runs for as long as the equipment is installed in a structure slot. The script contains all the information needed to provide the unit's operations, translating DMX values into virtual lighting actions. In every program cycle, if a new DMX input is received by the DMXReceiver, all the light parameters are updated by the script.

Figure 6: VRLight’s fixtures.
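The translation a behavior script performs, from raw DMX bytes to light parameters, can be sketched as below. This is not the prototype's Unity script: `update_fixture`, the parameter layout and the 0-based channel indexing are assumptions chosen for illustration.

```python
# Illustrative sketch of a fixture behavior update (assumed names and channel
# layout): each cycle, raw DMX bytes become normalized light parameters.

def update_fixture(dmx, start_channel):
    """Translate five consecutive DMX bytes into light parameters."""
    intensity = dmx[start_channel] / 255.0           # dimmer: 0.0 - 1.0
    r = dmx[start_channel + 1] / 255.0               # color mixing channels
    g = dmx[start_channel + 2] / 255.0
    b = dmx[start_channel + 3] / 255.0
    strobe_hz = dmx[start_channel + 4] / 255.0 * 20  # 0 = steady, up to 20 Hz
    return {"intensity": intensity, "color": (r, g, b), "strobe_hz": strobe_hz}

universe = [0] * 512
universe[10:15] = [255, 255, 0, 0, 0]  # full red at start channel 10, no strobe
print(update_fixture(universe, 10))
```

In the prototype the resulting values are applied to the fixture's materials and shaders each frame, so the technician's console input is visible immediately.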

Light fixture choice, channel selection and light aiming are all done in the light editing menu (Figure 7), which is called by left-clicking the SpacePoint device while selecting the blue sphere attached to the desired slot. The menu offers a simple interface: the seemingly scarce set of editable features is due to the DMX protocol, which is responsible for all fixture behaviors in pro lighting, so any light parameter other than aiming (how the equipment is installed in the structure to point in a certain direction) is controlled on the console. The menu ended up needing only six buttons for three main sections: fixture selection, light aiming and menu closing. To aim a light fixture, a shooting-target cursor is provided, as a metaphor for where the light will shoot its beam. The corresponding fixture's light beam(s) follow the cursor wherever the user points in the scenery.

3.8. User movement

One of the most important pre-visualization goals in every industry is to provide as many preview angles as possible: enabling total control over the position from which to look at an object, drawing or lighting environment is the key to a successful preview. In VRLight, since immersion lets the designer step into the virtual environment with a headset and look in every direction, it was crucial to allow full movement inside the scene. The SpacePoint gesture-based device was the solution, delivering two different movement approaches: the Change Position Menu and Joystick Mode.

Figure 7: Light editing menu.

Using the SpacePoint device, the Change Position Menu is accessed by clicking the right button and closed by clicking anywhere outside it with any button. This menu (Figure 8) is split into two sub-menus. The left sub-menu, Movement, contains three options to set the user's exact location, rotation or height. The right sub-menu, Go To, provides predefined positions that move and rotate the user to the named location: Stage, Behind Crowd and Previous Position.

In the Movement sub-menu (left side), the Point and Go button activates a mode in which the user points to where they want to go, an approach also studied by Cabral et al. [13]. The SpacePoint cursor model is replaced with a 3D human stick figure standing on a blue disc with an arrow attached (Figure 9). This model represents the user and their next location and orientation. The user's height is also increased to allow a better view over the scenery and increase pointing precision.

The Joystick Mode is based on typical joystick hardware interaction. The SpacePoint's three axes are used to move (X and Z) or rotate (Y) the user. Holding the right button starts joystick mode and registers the device's orientation at the moment of the click. The difference between each subsequent SpacePoint orientation and that initial one provides the speed values in all directions, and the user's position is changed accordingly.
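This orientation-delta scheme can be sketched as follows. The axis mapping, gain and dead zone are illustrative assumptions; a dead zone is one plausible way to absorb small involuntary hand movements, not a feature the paper describes.

```python
def joystick_velocity(start_orientation, current_orientation,
                      gain=1.0, dead_zone=2.0):
    """Map the deviation (in degrees) of the device from its
    orientation at the moment the button was pressed into
    per-axis speeds: X and Z tilts move the user, Y twist rotates.
    Deviations inside the dead zone are ignored, damping small
    involuntary hand movements."""
    velocity = []
    for start, current in zip(start_orientation, current_orientation):
        delta = current - start
        velocity.append(0.0 if abs(delta) < dead_zone else delta * gain)
    return tuple(velocity)  # (move_x, rotate_y, move_z)

# Button pressed at (10, 0, 5); the hand has since tilted 15 degrees
# on X, with only tiny drift on the other axes:
v = joystick_velocity((10.0, 0.0, 5.0), (25.0, 1.0, 5.5))
```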

This approach showed some limitations not found in regular joysticks. Since the SpacePoint is held in the hand, attempting to rotate and move forward at the same time accidentally led to undesired Z rotation, caused by involuntary arm or hand movement. However, the option was not discarded, since the point and go method explained in the previous section is not ideal for very short movements; for example, when the drummer is obstructing the view and moving only half a meter to the left would solve the problem. With the point and go approach, the user would have to look and point at the ground half a meter to their left and be very precise in choosing the correct location to avoid new obstacles. In joystick mode, a slight rotation on the Z-axis starts moving the user left until the button is released. Fewer steps are involved, and the feedback while moving lets the user settle on the exact position and stop at any time by releasing the joystick button.

Considering the pros and cons of this movement approach, user tests were carried out to determine the need for such an option. Users with previous joystick experience agreed that it should remain available. Some tests were even completed using this mode only, despite its limitations.

Figure 8: Change position menu.

Figure 9: Point and go process.

4. Validation

In order to validate our proposal, it was necessary to conduct a set of user tests. Although the participation of lighting experts would have been valuable, none could attend during the testing period. Still, VRLight exposes new interaction paradigms that can be tested by anyone, and validating with non-experts demonstrates how simple the interaction can be, even without any knowledge in this field.

As a first prototype with a strong focus on immersion and interaction, VRLight is still far from the complexity and learning effort required by the available commercial software. Validation against existing tools could not be done in time and was set as future work in VRLight's development.

The tests were structured in two stages: three interaction tasks in VRLight's environment, followed by a survey rating the prototype's ease of use, with a comments section for optional suggestions.

4.1. Testing setup and user profile

Tests were conducted in a closed environment at Instituto Superior Técnico, on the Taguspark campus. A single laptop ran VRLight for visualization and interaction, together with external light console software sending DMX input to the virtual lighting rig. For visualization, an Oculus Rift headset was ready for the users' tasks as show directors, and an external monitor output the technician's view, with that control task carried out by our group. User interaction was done through the SpacePoint Fusion controller, and light control through an iPad running a multi-touch light console connected wirelessly to VRLight. Figure 10 illustrates this setup during a user test.
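The paper does not detail the transport between the console and VRLight's DMXReceiver; the sketch below assumes one plausible arrangement, raw universes of channel bytes sent as UDP datagrams over loopback. The class name, framing and port handling are assumptions (real consoles typically speak a protocol such as Art-Net or sACN on top of UDP).

```python
import socket

class DMXReceiver:
    """Minimal sketch of a DMX-over-UDP receiver: each datagram
    carries one universe of up to 512 channel values (0-255)."""

    def __init__(self, host="127.0.0.1", port=0):
        # Port 0 lets the OS pick a free port; self.port reports it.
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.sock.bind((host, port))
        self.port = self.sock.getsockname()[1]

    def receive(self, universe_size=512):
        data, _addr = self.sock.recvfrom(universe_size)
        return list(data[:universe_size])

# Demo on the loopback interface: a "console" sends one frame with
# channel 10 (index 9) at full.
receiver = DMXReceiver()
console = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
console.sendto(bytes([0] * 9 + [255]), ("127.0.0.1", receiver.port))
frame = receiver.receive()
```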

Figure 10: Testing setup.

A group of 20 users, aged 18 to 40 and mostly male (70%), took part in the test stage. Only 25% had tried HMDs before, all of them in Taguspark's development or testing environments. Regarding stage lighting knowledge, only two users (10%) had previous DMX experience, and none of the participants had ever used stage lighting pre-visualization software before. As for gesture-based devices, 55% had tried at least one, and 90% had used a joystick before.

Given the participants' low level of stage lighting skills, a brief DMX introduction was given before the tests, ensuring everyone had the knowledge needed to understand the light editing menu.

4.2. Tasks description

Three main tasks were chosen to validate VRLight's capabilities for stage lighting design. They serve two different purposes: user movement (first task) and light editing (second and third tasks). Users were asked to put on the Oculus Rift headset and hold the SpacePoint in their hand. Once inside the environment, each user was given one minute to get to know the menus and movement modes available in the prototype. The tasks were then explained, without removing the headset.

Task 1 - User movement - This task involved moving the user to four predefined positions, using any of the available movement options (Point and Go, Joystick mode, or both combined) in the fastest time possible. Informal tests during development had shown it would not be useful to test Point and Go against Joystick mode alone, due to the latter's limitations.

Task 2 - Adding light fixtures to an empty structure - The second task was created to validate the use of the light editing menu, especially the fixture-picking section. Although fixture selection is normally a creative choice, this was a timed test, so the list of fixtures to add to each slot was predefined to keep the task identical for every user. The list was communicated during the test by a member of our group, and each user's elapsed time was registered.

Task 3 - Aiming lights at the band members - The third timed test consisted of aiming nine Par LED projectors at the five members of the rock band standing on stage. With nine fixtures for five band members, users could choose which members would have more than one projector aimed at them. However, no fixture could be left unused, nor could a band member be left without illumination. Figure 11 shows the stage before and after the task.

4.3. Experimental results

At the end of each user test, all execution times and questionnaire answers were recorded. The time values were analyzed and provided information about the prototype's capabilities, in terms of average task completion time and best movement method. The questionnaire captured users' evaluation of the prototype's ease of use, mostly through Likert scales.

Task results

All participants completed the three tasks, each in under three minutes, showing VRLight's capability to support simple stage design without great difficulty. Time values could not yet be compared with other applications.

In the movement task, the method chosen to perform the test was registered to determine which option is fastest. Users could choose between Point and Go, Joystick mode, or both combined. Time results are charted in Figure 12. The main point of interest is the Point and Go method compared with its combination with Joystick mode for slight position corrections: users who combined both methods averaged 1:51, 21 seconds less than the average for users of Point and Go alone.

Tasks two and three revealed our prototype's ease of use in light fixture editing, with every user finishing each task in under 2:15 on average. With nine slots to add or aim fixtures, these values may not mean much without comparison against other tools, but they are significant enough to show that each slot takes only seconds to edit and that all users were capable of finishing the proposed tasks.

Figure 11: Task 3 - Before and after fixture aiming.


Figure 12: Average execution time for the movement task using three different approaches - Joystick Mode, Point and Go, and both combined (minutes and seconds).

Questionnaire results

In the questionnaire answered after the three tasks, users rated VRLight's features in terms of ease of use and importance to the interaction.

To survey the users' favorite movement method, a five-point Likert scale was used to rate each approach's ease of use, and the Wilcoxon test was used to check for statistically significant differences. Users strongly agreed that the Point and Go method was the favorite of the two (Z = -3.564; p < 0.001). However, when asked about combining the two approaches, 45% of the users agreed that both methods combined was the better option, confirming the advantage of using Joystick mode for small location corrections. A text section collected comments on both movement techniques, and 40% of respondents suggested sensitivity calibration for the Joystick mode.

Still on user movement, users were asked, on a four-point Likert scale, how important the view height and predefined positions were to the interaction in the prototype. All users (100%) considered these features important (or very important) in VRLight. In a comments section for missing predefined positions, 15% suggested a "first row" position in front of the stage.

In the survey's light editing section, on a four-point Likert scale (from difficult to very easy), menu access and button hits were rated very easy by 90% and 75% of users, respectively, and none of the users classified either as difficult. As for light aiming, some users (30%) considered the task difficult, mostly due to a bad location from which they were aiming the fixture during the test, as confirmed by the checkbox options in the following question, used to identify the biggest source of difficulty in the task.

5. Conclusions

Immersive visualization and interaction are still relatively new options in virtual reality systems. Several industries continue to rely on WIMP interfaces on regular screens for their design work.

The advantages of stepping inside the virtual environment through new interaction paradigms are not yet available in many design and pre-visualization solutions. Stage lighting simulation is one such case.

VRLight presents an immersive virtual reality solution for lighting pre-visualization in events. This prototype, with the interaction illustrated in Figure 13, introduces HMD visualization and gesture-based interaction devices to lighting simulation, a combination never tested before, and may represent an important step for designers and technicians in this field. At the same time, the solution keeps the best of available lighting technology by supporting the DMX communication protocol and a scalable approach that makes it easy to add new light fixture models to the software. VRLight accepts any type of light console, hardware or software, allowing the final gear to be used in the preview stage. This lets technicians perform console programming while directors pick the best fixtures for the stage structure, all with real-time pre-visualization.

Creativity and focus are increased when the director is freed from complicated interface issues, leaving light fixture selection and aiming as the sole focus of the design. With this communication approach, any light parameter change is requested by the director from the technician, who uses his console to send DMX input to the environment.

On the interaction side, the reduced complexity of the light editing and movement interfaces gives the user an increased sense of involvement in event production. The sense of being "inside" the venue, with the SpacePoint gesture-based device to move around and edit light equipment through simple interfaces and tools, raises stage lighting production to another level.

To validate VRLight's ease of use, we challenged non-experts to perform stage design tasks in our environment, on the premise that the prototype's simplicity should deliver an effortless VR solution for stage lighting design and pre-visualization. This goal was achieved, with all participants in the user test stage able to finish the three proposed tasks. Testing VRLight against existing tools is one of the future works set for this project.

Figure 13: VRLight interaction.

6. Future work

In the prototype presented in this thesis, there are some aspects that, while not the main focus of this work, are worth improving or could lead to interesting future work. These are: comparing VRLight with existing tools (once its feature set is suitable to set against other software), testing the prototype in a real scenery, adding more light fixtures and removing slot limits, wireless gesture-based devices, and the walk-in-place approach for movement. This last one, which has emerged recently in immersive solutions, may bring a new level of freedom to the user, simplifying the overall interface and enabling real user movement, which increases the feeling of immersion. These functionalities raise a whole new set of challenges to be tackled in the future.

References

[1] Carlos E. Ochoa, Myriam B. C. Aries, and Jan L. M. Hensen. State of the art in lighting simulation for building science: a literature review. Journal of Building Performance Simulation, 5(4):209–233, May 2011.

[2] Dzmitry Aliakseyeu, Bernt Meerbeek, Jon Mason, Harm van Essen, Serge Offermans, Alexander Wiethoff, Norbert Streitz, and Andres Lucero. Designing interactive lighting. In Proceedings of the Designing Interactive Systems Conference, DIS '12, pages 801–802, New York, NY, USA, 2012. ACM.

[3] J. Mason and D. Engelen. Beyond the switch: Can lighting control provide more than illumination?, 2010.

[4] Remco Magielse and Serge Offermans. Future lighting systems. In CHI '13 Extended Abstracts on Human Factors in Computing Systems, CHI EA '13, pages 2853–2854, New York, NY, USA, 2013. ACM.

[5] Shariful Shikder. Evaluation of four artificial lighting simulation tools with virtual building reference. In Proceedings of the 2009 Summer Computer Simulation Conference, SCSC '09, pages 430–437, Vista, CA, 2009. Society for Modeling & Simulation International.

[6] Pedro A. Lopes, Alfredo Ferreira, and J. A. Madeiras Pereira. Multitouch interactive DJing surface. In Proceedings of the 7th International Conference on Advances in Computer Entertainment Technology, ACE '10, pages 28–31, New York, NY, USA, 2010. ACM.

[7] Stuart Taylor, Shahram Izadi, David Kirk, Richard Harper, and Armando Garcia-Mendoza. Turning the tables: an interactive surface for VJing. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '09, pages 1251–1254, New York, NY, USA, 2009. ACM.

[8] Sebastien Cuendet, Engin Bumbacher, and Pierre Dillenbourg. Tangible vs. virtual representations: when tangibles benefit the training of spatial skills. In Proceedings of the 7th Nordic Conference on Human-Computer Interaction: Making Sense Through Design, NordiCHI '12, pages 99–108, New York, NY, USA, 2012. ACM.

[9] Rebecca Fiebrink, Dan Morris, and Meredith Ringel Morris. Dynamic mapping of physical controls for tabletop groupware. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '09, pages 471–480, New York, NY, USA, 2009. ACM.

[10] Beatriz Sousa Santos, Paulo Dias, Angela Pimentel, Jan-Willem Baggerman, Carlos Ferreira, Samuel Silva, and Joaquim Madeira. Head-mounted display versus desktop for 3D navigation in virtual reality: a user study. Multimedia Tools and Applications, 41(1):161–181, 2009.

[11] Carolina Cruz-Neira, Daniel J. Sandin, and Thomas A. DeFanti. Surround-screen projection-based virtual reality: the design and implementation of the CAVE. In Proceedings of the 20th Annual Conference on Computer Graphics and Interactive Techniques, pages 135–142. ACM, 1993.

[12] Panagiotis Apostolellis, Brennon Bortz, Mi Peng, Nicholas Polys, and Andy Hoegh. Poster: Exploring the integrality and separability of the Leap Motion controller for direct manipulation 3D interaction. In 3D User Interfaces (3DUI), 2014 IEEE Symposium on, pages 153–154. IEEE, 2014.

[13] Marcio Cabral, Gabriel Roque, Douglas dos Santos, Luiz Paulucci, and Marcelo Zuffo. Point and go: exploring 3D virtual environments. In 3D User Interfaces (3DUI), 2012 IEEE Symposium on, pages 183–184. IEEE, 2012.


