
Movement Visualizer for Networked Virtual Reality Platforms

Omar Shaikh*, Yilu Sun†, Andrea Stevenson Won‡

Virtual Embodiment Lab, Cornell University

ABSTRACT

We describe the design, deployment and testing of a module to track and graphically represent user movement in a collaborative virtual environment. This module allows for the comparison of ground-truth user/observer ratings of the affective qualities of an interaction with automatically generated representations of the participants’ movements in real time. In this example, we generate three charts visible both to participants and external researchers. Two display the sum of the tracked movements of each participant, and a third displays a “synchrony visualizer”, or a correlation coefficient based on the relationship between the two participants’ movements. Users and observers thus see a visual representation of “nonverbal synchrony” as it evolves over the course of the interaction. We discuss this module in the context of other applications beyond synchrony.

Keywords: Virtual Reality, Avatars, Nonverbal Behavior, Synchrony, Social Interaction.

Index Terms: I.3.7 [Computer Graphics] Three-Dimensional Graphics and Realism: Virtual Reality; K.8 [Personal Computing] Games

1 INTRODUCTION

With the increased availability of consumer networked virtual reality platforms come more opportunities to study the behavior of users as they engage in social interactions in virtual environments. The head and hand tracking available from most consumer devices allows participants to control their own avatars and see the movements of other users represented by avatars. In addition, researchers and designers (when granted the appropriate permissions) can access participants’ movement data and use these data to interpret the interactions and provide feedback. In the following paper, we describe a module designed to capture users’ movement data and present a visual representation of a simple algorithm designed to capture a specific behavior in real time.

Nonverbal behavior has been studied in the context of embodied virtual environments for many years [9]. While people’s behavior in virtual worlds can be intentionally transformed to alter social interactions [1], users often react to nonverbal behavior in virtual worlds similarly to how they would react to such behavior in the physical world. For example, eye gaze that is linked to turn taking during a conversation is preferred to random gaze [4], although behavioral realism may have the reverse effect when avatar appearance is less realistic [8].

Recording movement data is one method of capturing nonverbal behavior and gaining insight into affect [14]. Movement data is also revealing in social contexts [2, 7], such as the potentially prosocial consequences demonstrated in interpersonal synchrony [11] and the link established between synchrony and creative collaboration [15]. Thus, it is a rich source of potential feedback to users in social situations, and also an area of interest to researchers. In this application, we sought to provide novel real time analysis of tracked participant movements to both the participants and the observing researchers. In this study, we elected to use the free platform High Fidelity [6] as a networked virtual environment. However, the principles of designing and testing the application can readily be modified for other systems.

This project required a means to store the movement data, and a web interface to provide visual feedback that would allow us to tune the measures in real time. We created an external web server to store and sync data. The external web server overcomes the limitations of a purely client-based script: it maintains connections between the computer running the script and the computer storing the data, and it pipes out and stores the movement data. A database connected to the web server allows for complex queries of the piped data. Detailed setup instructions, along with an application programming interface (API), can be found on the GitHub repository [https://github.com/oshaikh13/SynchronousFidelity].
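The repository documents the full setup and API. As a rough, hypothetical illustration of this client-to-server flow, the sketch below shows a minimal logging endpoint that accepts one tracked frame per request and writes it to a local database; the route name, payload fields, and SQLite backing store are assumptions made for this example and are not the repository's actual interface.

```python
# Minimal sketch of a movement-logging endpoint (illustrative only; see the
# GitHub repository for the actual API). The route, payload fields, and the
# SQLite backing store are assumptions made for this example.
import sqlite3
from flask import Flask, request, jsonify

app = Flask(__name__)
DB_PATH = "movement.db"  # hypothetical local database file

def init_db():
    with sqlite3.connect(DB_PATH) as db:
        db.execute("""CREATE TABLE IF NOT EXISTS frames (
                          session_id TEXT, participant_id TEXT, timestamp REAL,
                          head_x REAL, head_y REAL, head_z REAL,
                          left_x REAL, left_y REAL, left_z REAL,
                          right_x REAL, right_y REAL, right_z REAL)""")

@app.route("/frames", methods=["POST"])
def store_frame():
    """Store one tracked frame (head and hand positions) sent by a client script."""
    f = request.get_json()
    with sqlite3.connect(DB_PATH) as db:
        db.execute(
            "INSERT INTO frames VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
            (f["session_id"], f["participant_id"], f["timestamp"],
             *f["head"], *f["left_hand"], *f["right_hand"]))
    return jsonify(status="ok")

if __name__ == "__main__":
    init_db()
    app.run(port=5000)  # researchers can query the stored frames separately
```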

2 CREATING A “SYNCHRONY VISUALIZER”

As a test case, we created a “synchrony visualizer” (Figure 1) that was designed to capture the behavior of two interactants, combine it, and feed this characterization back into the virtual scene. For this case, we summed the Euclidean distance each participant moved from frame to frame and used Pearson’s r between the two participants’ resulting movement series as a rough measure of synchrony. This method captures a general relationship between the participants’ movements without being overly granular [10]. Updating throughout the course of the interaction, the “synchrony visualizer” provided a concise but informative live movement summary of two individuals in a collaborative virtual environment using consumer systems.
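For concreteness, a minimal sketch of this rough measure is shown below, assuming each participant’s tracked data is available as an array of per-frame head and hand positions; the array shape, function names, and use of NumPy/SciPy are assumptions for illustration rather than the repository’s implementation.

```python
# Sketch of the rough synchrony measure described above. Assumes each
# participant's tracking data is an array of shape (n_frames, 3, 3):
# one row per frame, one (x, y, z) position for the head and each hand.
# Names are illustrative, not taken from the repository.
import numpy as np
from scipy.stats import pearsonr

def summed_movement(positions):
    """Per-frame movement: Euclidean distance travelled by the head and both
    hands between consecutive frames, summed over the three tracked points."""
    deltas = np.diff(positions, axis=0)         # (n_frames - 1, 3, 3)
    per_point = np.linalg.norm(deltas, axis=2)  # distance moved by each point
    return per_point.sum(axis=1)                # one movement value per frame

def movement_synchrony(positions_a, positions_b):
    """Pearson's r between the two participants' summed-movement series."""
    r, _ = pearsonr(summed_movement(positions_a), summed_movement(positions_b))
    return r
```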

Figure 1: The synchrony visualizer shown in-world, with users represented by two generic avatars.

*e-mail: [email protected] †e-mail: [email protected] ‡e-mail: [email protected]


2018 IEEE Conference on Virtual Reality and 3D User Interfaces, 18-22 March, Reutlingen, Germany. 978-1-5386-3365-6/18/$31.00 ©2018 IEEE

Three graphs reflected the movements of both participants and one measure of the relationship between their movement data. The first graph showed the Pearson’s r correlation of the total movement between the two participants over a defined interval. The second and third graphs showed the summed head and hand movements of the first participant (Participant A) and the second participant (Participant B), respectively. The x-axis represents the timestamp, and the y-axis represents the total change in a participant’s head and hand movement within ten seconds. Large changes in movement resulted in increases in the y values for the individual participants, and vice versa. However, changes in the combined graph were driven by the relationship between both participants’ summed movements over 30-second intervals. When participants stayed still, the live movement correlation hovered around zero, because the natural tremor of each participant’s head and hands was not correlated with that of his or her partner.

3 RESULTS

We created two different datasets to assess the accuracy of our system in tracking movements, and in assessing the usefulness of the simple correlation used to characterize nonverbal synchrony. We compared the “ground truth” of validating pairs, who intentionally synchronized their movements, to a control condition. In this control condition, we created dummy pairs by combining the movement datasets of two participants who did not actually interact. For example, Validating Pair 1 would consist of Participants 1A and 1B, but to create a dummy control pair, we would combine the movements of Participant 1A with the movements of Participant 2B, two people who never actually saw each other in the virtual world. As expected, the correlations between the movements of the validating pairs were much higher (M = .50, SD = 0.09) than the random control pairs (M = .00, SD = 0.08): (t(33.1) = 17.13, p < .001).
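This control comparison can be reproduced offline from stored movement series. The sketch below is one possible way to build the dummy pairs and compare correlations; the `sessions` structure and function names are assumptions for illustration, and the unequal-variance (Welch) t-test shown is inferred from the fractional degrees of freedom reported above.

```python
# Sketch of the dummy-pair control: a real pair is the two participants who
# actually interacted; a dummy pair crosses Participant A of one session with
# Participant B of a different session. Assumes `sessions` is a list of
# (movement_a, movement_b) summed-movement arrays; names are illustrative.
import numpy as np
from scipy.stats import pearsonr, ttest_ind

def validate_synchrony(sessions):
    real = [pearsonr(a, b)[0] for a, b in sessions]
    dummy = []
    for i, (a, _) in enumerate(sessions):
        for j, (_, b) in enumerate(sessions):
            if i == j:
                continue                 # skip the real pairing
            n = min(len(a), len(b))      # align series lengths across sessions
            dummy.append(pearsonr(a[:n], b[:n])[0])
    # Unequal-variance (Welch) t-test between real- and dummy-pair correlations
    result = ttest_ind(real, dummy, equal_var=False)
    return np.mean(real), np.mean(dummy), result.statistic, result.pvalue
```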
4 DISCUSSION

In this paper, we used simple correlation coefficients between participants’ summed movements as a measure of nonverbal synchrony. However, we note that there are many methods to characterize synchrony. Among these approaches are utilizing a dynamic model of “interlimb coordination” [13], creating interpersonal synergies from coupling two participants’ degrees of freedom in their movement system [12], utilizing dynamic time warping [3], and finding cointegration between participants [5].

As high-resolution data and low-latency requests are both required to make the live analysis of r accurate, both the web server and the database are best hosted on the local area network, or on a virtual private server with low latency. In these situations, a single frame of movement data should take 10 to 15 milliseconds to save. We also limited ourselves to a consumer system, which provides head and hand tracking as a default. However, this platform is easily modifiable to support more points of tracking, which would allow for other types of analysis.

Applications for this module are far-ranging. We used a single, simple method to capture one version of nonverbal synchrony as an example. As discussed above, other calculations of synchrony could be used. However, many other aspects of movement could also be characterized and visualized in this manner; for example, the expansiveness or speed of a participant’s gestures, or the tendency of participants to share turn-taking. In addition, the visualization does not need to be in the form of a chart. Researchers can represent changing synchrony by altering objects in the world, such as changing the color of the virtual sky, changing the appearance of users’ avatars, or providing audio cues as feedback to users. Visualizing aspects of the movement data in close to real time can also be informative by allowing aspects of the movement data to be easily compared to other observable variables such as conversation topic, tone of voice, etc. It also allows users to relate quantified movement to self-reported variables, such as fatigue, concentration, and engagement level. In these explorations, we hope that the documented API on GitHub may prove useful to other researchers.
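As one example of such near-real-time feedback, the live correlation chart described in Section 2 could be approximated offline with a windowed correlation over the stored movement series, as in the hedged sketch below; the frame rate and window length are assumptions, with the 30-second window mirroring the interval used in-world.

```python
# Offline approximation of the live correlation chart, assuming `move_a` and
# `move_b` are per-frame summed-movement series (as in the earlier sketch)
# sampled at a known frame rate. The 90 Hz default and 30 s window are
# assumptions; names are illustrative.
import numpy as np
from scipy.stats import pearsonr

def rolling_synchrony(move_a, move_b, frame_rate_hz=90, window_s=30):
    """Pearson's r computed over consecutive fixed-length windows."""
    win = int(frame_rate_hz * window_s)
    n = min(len(move_a), len(move_b))
    values = []
    for start in range(0, n - win + 1, win):
        r, _ = pearsonr(move_a[start:start + win], move_b[start:start + win])
        values.append(r)
    return np.array(values)   # one synchrony value per 30-second interval
```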

5 CONCLUSION

Our “synchrony visualizer” tracked and presented two individuals’ live movements and their correlation in a consumer networked virtual reality platform. We propose that automatically capturing and displaying such subtle nonverbal behaviors in real time can reveal many characteristics of an interaction in a way that informs both users and designers, leading to many opportunities to increase the range of embodied virtual social interactions.

REFERENCES

[1] Jeremy N. Bailenson, Andrew C. Beall, Jack Loomis, James Blascovich, and Matthew Turk. Transformed social interaction: Decoupling representation from behavior and form in collaborative virtual environments. Presence-Teleop Virt, 13(4): 428-441, 2004.

[2] Frank J. Bernieri. Coordinated movement and rapport in teacher-student interactions. Journal of Nonverbal Behavior, 12(2): 120-138, 1988.

[3] Donald J. Berndt and James Clifford. Using dynamic time warping to find patterns in time series. In KDD Workshop, 10(16): 359-370, 1994.

[4] Maia Garau, Mel Slater, Simon Bee, and Martina A. Sasse. The impact of eye gaze on communication using humanoid avatars. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '01), 309-316, 2001.

[5] Clive W. J. Granger. Time series analysis, cointegration, and applications. American Economic Review, 421-425, 2004.

[6] High Fidelity. Highfidelity.com, retrieved January 17, 2017.

[7] Marianne LaFrance and Maida Broadbent. Group rapport: Posture sharing as a nonverbal indicator. Group & Organization Studies, 1(3): 328-333, 1976.

[8] Kristine L. Nowak and Frank Biocca. The effect of the agency and anthropomorphism on users' sense of telepresence, copresence, and social presence in virtual environments. Presence-Teleop Virt, 12(5): 481-494, 2003.

[9] Soo Youn Oh and Jeremy N. Bailenson. Virtual and augmented reality. The International Encyclopedia of Media Effects, 2017. Retrieved July 1, 2017 from http://onlinelibrary.wiley.com/doi/10.1002/9781118783764.wbieme0172/full

[10] Fabian Ramseyer and Wolfgang Tschacher. Nonverbal synchrony in psychotherapy: Coordinated body movement reflects relationship quality and outcome. Journal of Consulting and Clinical Psychology, 79(3): 284-295, 2011.

[11] Miriam Rennung and Anja S. Göritz. Prosocial consequences of interpersonal synchrony. Zeitschrift für Psychologie, 224(3): 168-189, 2016.

[12] Michael A. Riley, Michael J. Richardson, Kevin Shockley, and Verónica C. Ramenzoni. Interpersonal synergies. Frontiers in Psychology, 2(38), 2011.

[13] Richard C. Schmidt and Michael J. Richardson. Dynamics of interpersonal coordination. In Coordination: Neural, Behavioral and Social Dynamics. Springer, 281-308, 2008.

[14] Alessandro Vinciarelli, Maja Pantic, Hervé Bourlard, and Alex Pentland. Social signals, their function, and automatic analysis: A survey. In Proceedings of the 10th International Conference on Multimodal Interfaces, ACM, 61-68, 2008.

[15] Andrea S. Won, Jeremy N. Bailenson, Suzanne C. Stathatos, and Wenqing Dai. Automatically detected nonverbal behavior predicts creativity in collaborating dyads. Journal of Nonverbal Behavior, 38(3): 389-408, 2014.


