RESEARCH REPORT

Synchronising movements with the sounds of a virtual partner enhances partner likeability

Jacques Launay · Roger T. Dean · Freya Bailes

Received: 19 December 2013 / Accepted: 22 April 2014

© Marta Olivetti Belardinelli and Springer-Verlag Berlin Heidelberg 2014

Abstract  Previous studies have demonstrated that synchronising movements with other people can influence affiliative behaviour towards them. While research has focused on synchronisation with visually observed movement, synchronisation with a partner who is heard may have similar effects. We replicate findings showing that synchronisation can influence ratings of likeability of a partner, but demonstrate that this is possible with virtual interaction, involving a video of a partner. Participants performed instructed synchrony in time to sounds instead of the observable actions of another person. Results show significantly higher ratings of likeability of a partner after moving at the same time as sounds attributed to that partner, compared with moving in between sounds. Objectively quantified synchrony also correlated with ratings of likeability. Belief that sounds were made by another person was manipulated in Experiment 2, and results demonstrate that when sounds are attributed to a computer, ratings of likeability are not affected by moving in or out of time. These findings demonstrate that interaction with sound can be experienced as social interaction in the absence of genuine interpersonal contact, which may help explain why people enjoy engaging with recorded music.

Keywords  Synchronisation · Virtual interaction · Rhythm · Affiliation

J. Launay · R. T. Dean · F. Bailes
MARCS Institute, University of Western Sydney, Sydney, NSW, Australia

J. Launay (✉)
Department of Experimental Psychology, University of Oxford, Tinbergen Building, South Parks Road, Oxford OX1 3UD, UK
e-mail: [email protected]

F. Bailes
School of Drama, Music and Screen, University of Hull, Kingston upon Hull, UK

Cogn Process, DOI 10.1007/s10339-014-0618-0

Introduction

Synchronisation has been argued to help establish affiliative bonds due to the co-occurrence of the performed movements of self with the perceived movements of another person (e.g. Sebanz et al. 2006; Sommerville and Decety 2006). Perception of the actions of another person engages regions of the brain similar to those involved in making that action ourselves (e.g. Rizzolatti 2005; Buccino et al. 2001; Fadiga et al. 1995; Watkins et al. 2003), meaning that synchronising movement can involve neural coupling between oneself and a co-actor (Semin and Cacioppo 2008). This process may promote the establishment of affiliative bonds felt towards a partner by helping us interpret their actions through imitation (Iacoboni 2005) and encouraging empathy towards them (Overy and Molnar-Szakacs 2009).

Evidence assessing the relationship between synchronisation and affiliative behaviour suggests that we relate moving in time with other people to social judgments about them. Perceptual studies demonstrate that agents who move in time with one another are judged as being closely affiliated (Miles et al. 2009b; Hagen and Bryant 2003; Lakens 2010). More participatory studies have shown that while we tend towards synchronising movements with others (Oullier et al. 2008; Kirschner and Tomasello 2009; Issartel et al. 2007), this tendency can be affected by prior knowledge about whether they are someone we should associate with (Miles et al. 2009a, 2011), and our experience of unintentional synchrony can influence how we judge others (Demos et al. 2012). This parallels findings from mimicry research, which has shown that we are more likely to mimic people with whom we have a positive relationship (Chartrand et al. 2005; Chartrand and Bargh 1999; Lakin et al. 2003; LaFrance 1979).

To directly assess a causal relationship between synchronised movement and affiliative behaviour, a recent study (Hove and Risen 2009) gave participants the instruction to tap their hand in time with a visual stimulus. During this tapping, the experimenter sat next to the participant and either moved in time with the same stimulus, or at a different time from it. Ratings of experimenter likeability were influenced by whether they moved in or out of time with participants, with higher likeability associated with synchronised movement. Similarly, experiments using forms of interaction that occur in everyday life have shown that walking in time with other people and moving along to musical stimuli can make us more likely to trust others in subsequent tasks (Wiltermuth and Heath 2009; Wiltermuth 2012), and also that moving in time with a partner can make us more likely to help them when they are placed in a difficult situation (Valdesolo and DeSteno 2011).

While visually perceived synchrony has an established relationship with affiliative behaviour, there is much less evidence about how sounds relating to the movement of another person can influence affiliation (Launay et al. 2013). It is known that when sounds are related to the movements of another person, they engage motor regions of the brain as observed movement can (e.g. Gazzola et al. 2006). However, associations between abstract sounds and movements are learnt less directly than these visual associations; while we are likely to associate our own hand movement with the hand movement of another person, this certainly would not be true of a computer-generated sound unless we believe that sound has been triggered by the movement of a person. As our interaction with agent-driven sounds (e.g. music) is increasingly mediated via new technologies (e.g. listened to on iPods), and unrelated to visual contact with a performer, we expect that the relationship between synchrony and affiliation should exist when people interact with a virtual agent and only hear sounds relating to that agent. Conversely, we do not expect interaction to have social effects when it is entirely attributed to a computer; this relationship should be mediated by a belief that there is an intentional human agent involved in performance. Social effects of musical sound may be mediated purely by attribution of agency to sound (cf. Steinbeis and Koelsch 2009), but in the current experiment, we look at sounds that are associated with movement and cannot determine whether this agency attribution alone is sufficient to encourage affiliative behaviour.

The aim of the current study was to determine whether synchronisation with a virtual partner could influence affiliative behaviour. We compare movements made in time with the sounds of a virtual partner with movements made out of time with those sounds. This is directly comparable to research investigating visually perceived synchronisation with a real partner, and means we can determine whether coincidence of action and the sounds of another person has the same influence as this visual experience.

Measuring affiliation

Experiencing closeness to another person may be manifest in a number of different ways, but here we are interested in the desire to engage in pseudoaltruistic behaviour with that partner (i.e. the extent to which a person is likely to engage in helpful behaviour expressed towards a partner with no expectation of subsequent rewards for their help). We used ratings of likeability as a basic correlate of affiliative behaviour, as these have been used extensively in similar experiments and are thought to represent overt positive sentiments towards another person (Chartrand and Bargh 1999; Hove and Risen 2009; Valdesolo and DeSteno 2011; Kokal et al. 2011). A second, less direct potential correlate of affiliation was also assessed: virtual eye contact (i.e. eye contact made towards a video of another person). Ideally, this measure could be assessed in a similar way in both human and computer conditions, as well as relating to affiliative intentions. The importance of eye contact in social behaviour is well documented (Modigliani 1971; Kleinke 1986; Baron-Cohen 1997; Shimojo et al. 2003; Guastella et al. 2008a, b; Pinkham et al. 2008; Moukheiber et al. 2009). Evidence suggests that more eye contact is likely to be made with someone with whom we wish to affiliate more (e.g. Kleinke 1986), that eye gaze towards images can be affected by preference (e.g. Shimojo et al. 2003), and that it is possible to influence the amount of eye contact that someone will make as a consequence of experimental task manipulation (Modigliani 1971; Guastella et al. 2008a). For these reasons, measuring eye contact made towards a video of a person may be a useful way to assess affiliative intentions felt towards that person.

We hypothesised that when people believed they were tapping along with another person, the group told to synchronise would rate their partner as more likeable than the group told not to synchronise, and that this would parallel a greater increase in virtual eye contact. However, this will only be the case if auditory information in the absence of visual contact with a person is sufficient to encourage a sense of social closeness. In Experiment 2, a group of participants were told they were tapping with a computer, and it was expected that this group would not rate their partner as more or less likeable on the basis of synchronisation. No human partners were used in these experiments in order to control for relationships that could develop during human–human interaction (e.g. leader–follower relationships: Konvalinka et al. 2010).

Method

All participants were psychology students recruited from the University of Western Sydney in exchange for course credit, and the study was approved by the University of Western Sydney Ethics Committee. Participants gave informed written consent before starting the experimental procedure.

Design

The design was between-subjects: participants told to synchronise with sounds were compared with those told not to synchronise (told to move at a different time from the sounds, but to maintain regularity and tap once for each tone they heard). Our primary dependent variable used to assess affiliation was ratings of the likeability of a partner, measured in a retrospective questionnaire administered to participants following virtual interaction. Eye gaze directed towards the eye region of a video of a person ("virtual eye contact") was an indirect, putative correlate of affiliative intentions and was taken as a repeated measure within subjects, before and after the synchronisation paradigm.

Stimuli

Over three different rounds of a tapping game, tones were sequenced using MAX/MSP and lasted 72 taps, involving organised anisochronic (i.e. irregular) sequences, created similarly to Madison and Merker (2002). The first and third rounds used an underlying tempo of 600 ms, and the middle round used an underlying tempo of 800 ms (intended to introduce variety rather than as an experimental manipulation). All of these sequences became increasingly isochronous (i.e. more regular) throughout the sequence. Each interval in the sequence of sounds was either shortened or lengthened from the underlying tempo, as determined by a Kolakoski sequence (with zeros in the sequence corresponding to shortened intervals, and ones corresponding to lengthened intervals). In the first trial, this anisochronous sequence started with shortened and lengthened intervals of 552 and 648 ms, and anisochrony decreased in intervals of 4 ms every eight taps made. The second trial started at 752 and 848 ms and changed similarly to the first trial. The final trial started with intervals of 564 and 636 ms and changed by only 3 ms every eight taps. The tapping paradigm lasted approximately 5 min for each participant.
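The MAX/MSP sequencing patch is not given in the paper; the sketch below is a loose Python reconstruction that generates the classic Kolakoski sequence over {1, 2} and maps its values to shortened and lengthened intervals. The {1, 2}-to-short/long mapping and the starting deviation of ±48 ms are assumptions inferred from the 552/648 ms first-trial figures, not details stated in the text.

```python
def kolakoski(n):
    """First n terms of the classic Kolakoski sequence over {1, 2},
    which is its own run-length encoding."""
    seq = [1, 2, 2]
    i = 2
    while len(seq) < n:
        next_val = 1 if seq[-1] == 2 else 2
        seq += [next_val] * seq[i]
        i += 1
    return seq[:n]

def trial_intervals(n_taps=72, base_ms=600, start_dev_ms=48, step_ms=4):
    """Inter-tone intervals for one trial: each interval is shortened or
    lengthened from the base tempo by a deviation that shrinks every
    eight taps, so the sequence becomes increasingly isochronous."""
    intervals = []
    for idx, k in enumerate(kolakoski(n_taps)):
        dev = max(start_dev_ms - step_ms * (idx // 8), 0)
        intervals.append(base_ms - dev if k == 1 else base_ms + dev)
    return intervals
```

With the defaults above, the first two intervals are 552 and 648 ms, matching the first trial; the middle round would correspond to base_ms=800, and the final trial to start_dev_ms=36 with step_ms=3.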

Procedure

Participants were told that the experiment was about rhythm and faces, and that they would be playing a tapping game with a partner, with whom they would either be told to synchronise or not to synchronise. First, all participants completed four practice rounds of both of these instructions, co-ordinated using MAX/MSP running on a MacBook Pro, in which they tapped along on a Roland Handsonic HPD-10 drum pad with an isochronous beat heard over Sennheiser HD 650 headphones, and were given visual feedback on-screen about their success: a circle on the screen flashed green if they tapped correctly or red if they tapped incorrectly. Although the instructions said that participants should "try to beat as regularly as possible on the drum pad", regularity of tapping was not assessed as part of the practice.

Following the practice session, the experimenter returned to the room to give a final brief on the experiment and to adjust participants' seating arrangements so that they were at the correct distance and height from a Tobii T120 Eye Tracker (approximately 65 cm from the tracking device). When the participant was alone in the room, the experiment continued, first running a calibration of the eye tracker, then tracking eye movements while playing a video (using the Python module "pygame"), which featured a female experimenter (unknown to the participants) giving instructions on what to do during the tapping task (still frames of the videos from Experiments 1 and 2 are shown in Figs. 1 and 2, respectively).

Fig. 1  Still frame from video in Experiment 1. Eye gaze data from one participant have been superimposed. The green rectangle indicates the window used to identify the eye region; red dots indicate right eye track events, and blue dots indicate left eye track events. The experimenter appearing in the image has given written informed consent for publication of their photograph (colour figure online)

Page 4: Synchronising movements with the sounds of a virtual partner enhances partner likeability

During the video, the position of each eye on the screen

was sampled from data collected by the eye tracker every

10 ms using T2T software (Filippin 2011). The whole

experiment was sequenced using Python v. 2.6.4.

Participants then played three rounds of the tapping

game, which should be long enough to influence social

tendencies towards a partner (c.f. Launay et al. 2013). Each

participant was either given the instruction to synchronise

or not synchronise, and this instruction was determined by

an automated script so that the experimenter was blind to

the condition that participants were in until the experiment

was underway. Participants were told that the first ten tones

they heard would be played by the computer, and following

this pacing sequence, the timbre of the tones changed, and

a picture of their supposed tapping partner appeared in the

corner of the screen, to give the impression that they had

started hearing the tones played by this partner. Both the

pacing timbre and that related to the partner were percus-

sive MIDI sounds.

Following the tapping game, the eye calibration was

repeated and was again followed by an instruction video

explaining what would happen next. During this video, eye

tracking data were recorded similarly to the first video. The

final part of the experiment displayed questions on the

computer screen and asked participants to use a scale from

1 to 7 to answer the following:

‘‘How synchronised were you with you partner?’’1

(1 = closely matched, 7 = very far apart)

‘‘How successfully did you follow the instruction

given?’’ (1 = successful, 7 = unsuccessful)

‘‘How likeable was your partner?’’

(1 = likeable, 7 = not likeable)

‘‘How interesting did you find the task?’’

(1 = interesting, 7 = uninteresting)

A message then appeared on the screen saying ‘‘In this

experiment, some participants tapped with another person

while others tapped with the computer’’, and then a final

question appeared:

‘‘Were you tapping with a person or the computer?’’

Finally, the experimenter returned and asked partici-

pants what they thought the experiment was about (to

check for demand characteristics). Participants were then

debriefed on the full aims of the study and were asked to

complete the Ollen Musical Sophistication Index (Ollen

2006), to assess musical background. The whole experi-

ment lasted approximately 30 min. The experimenter who

ran the sessions was not the person appearing in videos and

was not described as the tapping partner to participants.

Analysis

Likeability ratings were analysed without transformation using nonparametric tests, because the data are ordinal.

Each piece of eye tracking data was assessed according to whether or not it occurred inside the eye region of the person appearing in the video (a window around the eyes was determined before testing began and is given in Fig. 1 for Experiment 1 and Fig. 2 for Experiment 2). At each time point (before and after tapping), the proportion of tracked data that occurred within this region was calculated by dividing the number of eye track events within the eye region by the total number of eye track events recorded. This was performed for left and right eye tracking data separately, and the mean of these two was taken at each time point as the dependent variable of interest.
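This proportion measure can be sketched in a few lines of Python. The data structures here are illustrative assumptions (gaze samples as (x, y) pairs, the eye region as a left/top/right/bottom rectangle), not the paper's actual T2T output format.

```python
def in_region(point, window):
    """True if an (x, y) gaze sample falls inside a rectangular window
    given as (left, top, right, bottom) in screen coordinates."""
    x, y = point
    left, top, right, bottom = window
    return left <= x <= right and top <= y <= bottom

def eye_contact_proportion(left_events, right_events, eye_window):
    """Proportion of tracked gaze samples inside the eye region,
    computed separately for each eye and then averaged, as described
    in the Analysis section."""
    proportions = []
    for events in (left_events, right_events):
        hits = sum(in_region(p, eye_window) for p in events)
        proportions.append(hits / len(events))
    return sum(proportions) / 2
```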

In order to objectively measure the amount of synchronisation participants achieved, two commonly used measures (e.g. Tognoli et al. 2007; Kirschner and Tomasello 2009; Konvalinka et al. 2010) were calculated using circular statistics (Mardia and Jupp 2000). The synchronisation index (known as "R" in circular statistics) gives an indication of synchronisation strength regardless of phase (i.e. it identifies the degree to which taps and tones had matching periodicity, regardless of whether the two were actually occurring at the same time) and can range from 0 to 1, with higher values suggesting a greater degree of stability between tap and tone times. The mean circular asynchrony (known as "a" in circular statistics) is a measure of the mean distance between tap and tone times, and here ranged from −π to π radians (−180° to 180°) in conditions where participants were told to synchronise, and from 0 to 2π (0° to 360°) in conditions where participants were told not to synchronise, with values closer to zero suggesting that, on average, tap and tone times were closer together, and values close to ±π suggesting tap times occurring halfway between tone times. In synchronisation trials, taps were matched to the closest tone to calculate asynchronies, while in trials in which participants were instructed not to synchronise, taps were all matched to the previous tone heard. Circular statistics were used to calculate R and a per trial, using methods extensively outlined elsewhere (Mardia and Jupp 2000); these will now be referred to as the synchronisation index and mean circular asynchrony, respectively. The synchronisation index was averaged over all recorded tapping trials for each participant. The mean asynchrony was only used to determine whether participants had performed according to the instructions given.

Fig. 2  Still frame from video in Experiment 2. Eye gaze data from one participant have been superimposed. The green rectangle indicates the window used to identify the eye region; red dots indicate right eye track events, and blue dots indicate left eye track events. The experimenter appearing in the image has given written informed consent for publication of their photograph (colour figure online)

¹ In conditions where participants had been told not to synchronise, this question was preceded by "Given that your aim was not to synchronise,".
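The paper defers the full circular-statistics method to Mardia and Jupp (2000); the sketch below is a simplified illustration that converts each tap's asynchrony from its matched tone into a phase angle (dividing by the local inter-tone interval, which is an assumption about the normalisation) and derives R and the mean circular asynchrony from the resultant vector.

```python
import math

def circular_stats(tap_times, tone_times):
    """Synchronisation index R (resultant vector length, 0 to 1) and mean
    circular asynchrony a (radians) for one trial. Taps are matched to
    the nearest tone, as in the synchronisation condition."""
    angles = []
    for tap in tap_times:
        idx = min(range(len(tone_times)), key=lambda i: abs(tap - tone_times[i]))
        # Local inter-tone interval used to convert a time offset to a phase.
        if idx + 1 < len(tone_times):
            period = tone_times[idx + 1] - tone_times[idx]
        else:
            period = tone_times[idx] - tone_times[idx - 1]
        angles.append(2 * math.pi * (tap - tone_times[idx]) / period)
    sin_sum = sum(math.sin(a) for a in angles)
    cos_sum = sum(math.cos(a) for a in angles)
    n = len(angles)
    r = math.hypot(cos_sum, sin_sum) / n            # synchronisation index R
    mean_asynchrony = math.atan2(sin_sum, cos_sum)  # mean circular asynchrony a
    return r, mean_asynchrony
```

Taps landing exactly on the tones give R = 1 with a mean asynchrony of 0, while taps halfway between tones give R = 1 with a mean asynchrony near ±π, matching the interpretations above.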

In statistical tests, one-tailed tests have been used where we had specific predictions about differences between conditions and the expected direction of this difference. Two-tailed tests have been used for measures that were not expected to differ between conditions.

Experiment 1

In Experiment 1, the instruction videos that participants watched each lasted 22 s, and all participants were told that they would be tapping along with the person appearing in this video.

Thirty-four participants were tested: 18 in the "synchronise" condition (age: M = 22, SD = 6, 2 male) and 16 in the "don't synchronise" condition (age: M = 21, SD = 6, 2 male). One participant was excluded from the "synchronise" condition due to tapping between the tones instead of tapping at the same time as the tones, as assessed using the synchronisation index and mean circular asynchrony. Another participant was excluded from eye gaze analysis in the "don't synchronise" condition because the proportion of eye gaze in the eye region of the video decreased significantly more (greater than 2.5 SD) than the mean of the rest of the group; as this change was in the predicted direction, including this participant did not change the reported result, and only increased its significance. Due to a technical problem, tapping data were not recorded for two participants, so their synchronisation indices are not included in the analysis.

Results

Questionnaire data relating to synchronisation and likeability were assessed in order to check that the manipulation had worked, and whether the hypothesised relationship with likeability ratings existed (Table 1).

Wilcoxon 2-sample rank-sum tests show that ratings of synchronisation were significantly different between the "synchronise" group and the "don't synchronise" group (p < 0.001, one-tailed), with participants experiencing the conditions as expected. A significant difference between participants' ratings of how much they liked their partner (p < 0.01, one-tailed) supports the hypothesis that participants who synchronised during the tapping task rated their partner as more likeable than those who did not synchronise.

Differences between the two groups in ratings of success and interest in the task were not expected. However, a near significant difference in the ratings of success was apparent (p = 0.08, two-tailed), suggesting that participants who synchronised rated themselves as more successful than those who did not synchronise; this will be discussed further in Experiment 2. No difference was found in interest in the task (details in Table 1).

Additionally, Spearman's tests revealed a correlation between participants' ratings of likeability of their partner and the synchronisation index (rho = −0.36, p = 0.05, n = 33), suggesting that better synchronisation was associated with more favourable ratings of likeability (1 = most likeable). A near significant negative correlation between ratings of likeability and changes in eye gaze (rho = −0.34, p = 0.056, n = 32) suggests that rating a partner as more likeable was associated with more positive changes in virtual eye contact following the tapping task. However, there was no significant correlation between the synchronisation index and the change in virtual eye contact (rho = 0.30, p = 0.11, n = 32), so quality of synchronisation did not directly relate to changes in virtual eye contact made.
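The analysis software is not named in the text; as an illustration of the correlation used here, Spearman's rho is simply the Pearson correlation of the data's ranks (with tied values sharing their average rank) and can be computed in plain Python:

```python
def average_ranks(values):
    """1-based ranks, with tied values sharing their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over the run of values tied with the run's first element.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation computed on the ranks."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mean_x, mean_y = sum(rx) / n, sum(ry) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(rx, ry))
    sd_x = sum((a - mean_x) ** 2 for a in rx) ** 0.5
    sd_y = sum((b - mean_y) ** 2 for b in ry) ** 0.5
    return cov / (sd_x * sd_y)
```

A negative rho between likeability ratings (1 = most likeable) and the synchronisation index, as reported above, means better synchronisation went with numerically lower, i.e. more favourable, likeability ratings.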

Table 1  Results from comparisons of questionnaire data in Experiment 1

Rating                                    "Synch" mean (SD)   "Don't" mean (SD)   Test statistic
Synchronisation (1 = at the same time)    2.71 (1.15)         4.75 (1.29)         W = 34.5, p = 0.0001 (one-tailed)
Likeability (1 = most likeable)           1.41 (0.61)         2.88 (1.71)         W = 61, p = 0.002 (one-tailed)
Success (1 = most successful)             2.58 (1.62)         3.81 (1.97)         W = 87, p = 0.08 (two-tailed)
Interest (1 = most interesting)           2.35 (1.5)          2.62 (1.59)         W = 120, p = 0.56 (two-tailed)

A 2 (timepoint: before, after) × 2 (instruction: "synchronise", "don't synchronise") mixed ANOVA was performed on the eye gaze data to determine whether virtual eye contact changed as a consequence of the tapping condition participants were in. This revealed no main effects of timepoint, F(1, 30) = 0.31, p = 0.58, or instruction, F(1, 30) = 0.004, p = 0.95, but a significant interaction between the two, F(1, 30) = 10.1, p = 0.003, η²G = 0.02. This interaction expresses a decrease in virtual eye contact from before the tapping task to after it for participants in the "don't synchronise" condition, and an increase for those in the "synchronise" condition, as shown in Table 2.

Exactly half of the participants reported believing that they were interacting with a person, in both the "synchronise" and "don't synchronise" conditions. All the analyses reported above were repeated on both halves of the data separately (i.e. analysis was repeated for the subset of participants who believed they were interacting with a person, and for those who did not believe they were interacting with a person) and did not demonstrate any substantial differences between the two subsets. The similarity of results between these two groups demonstrates that this post-test self-report does not separate participants into substantially different populations. In Experiment 2, in order to further assess the effects of believing that an interaction partner is a computer, participants' beliefs about their partner were manipulated, with instructions given about the nature of the partner prior to interaction.

Experiment 2

Since the experiments in this study do not involve genuine interpersonal contact with a human partner, it is possible to manipulate the attribution of agency to sound and to determine how this can influence subsequent behaviour. In the absence of human interaction, successfully matching movement with sound could alone be sufficient to induce a more positive mood in participants and lead to increased affiliative intentions. Alternatively, it may be important for participants to believe that their movement is matched to that of another person in order for it to have these effects.

Experiment 2 was designed to replicate the results of Experiment 1 and to follow it up by testing whether the results were dependent on believing that the tapping interaction was with another human player. In addition to the groups told to synchronise and not to synchronise, a further factor was added, relating to the description of the "partner" involved in the tapping game. In one condition, this description was similar to Experiment 1 ("human" condition), while in the second condition, participants were told that they would be tapping with a computer partner ("computer" condition), and an image of a computer was displayed during the tapping task. Both conditions used a new set of video recordings, lasting 21 s, involving a different female experimenter.

In addition, a change was made to the instructions in Experiment 2, so that instead of being told to "synchronise" or "don't synchronise", participants were told to tap "with" or "between" the tones that they heard. These descriptions were used throughout the practice, experiment and instructions, and were explained with exactly the same details as in Experiment 1, but without the use of negative language characterising the "don't synchronise" instruction.

In total, 88 participants were tested. Participants who performed incorrectly (i.e. not as instructed, as assessed by the synchronisation index and mean circular asynchrony) were excluded from all data analysis (11 participants). In addition, for the analysis of eye tracking data, all participants who had fewer than 100 data points recorded for both eyes at any time point were excluded from analysis (10 participants); this could occur if participants did not attend to the screen during the tracking process, or could be due to calibration problems. Following this, outliers with a change in eye contact of more than 2.5 standard deviations above or below the group mean were also excluded (one additional participant). The final sample is detailed in Table 3.

Table 2 Summary of virtual eye gaze results. Values given are the mean and standard deviation of the proportion of eye gaze within the eye region of the video.

Experiment  Condition  Instruction         Before tapping  After tapping
1           -          Synchronise         0.37 (0.27)     0.41 (0.27)
1           -          Don't synchronise   0.41 (0.15)     0.34 (0.16)
2           Human      With                0.58 (0.26)     0.46 (0.25)
2           Human      Between             0.48 (0.18)     0.40 (0.19)
2           Computer   With                0.44 (0.22)     0.38 (0.24)
2           Computer   Between             0.67 (0.24)     0.60 (0.21)

Table 3 Number of participants included in each group in Experiment 2

Condition  Data type  "With" N (n male)  "With" Age: M (SD)  "Between" N (n male)  "Between" Age: M (SD)
Human      Ratings    21 (7)             23 (9)              20 (5)                20 (3)
Human      Eye data   19 (5)             22 (9)              17 (4)                21 (3)
Computer   Ratings    18 (2)             21 (3)              18 (3)                22 (6)
Computer   Eye data   17 (2)             21 (3)              13 (2)                23 (6)

Cogn Process
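The exclusion criteria above rely on the synchronisation index and the mean circular asynchrony, both standard quantities from directional statistics (Mardia and Jupp 2000). As a rough sketch of how such measures can be computed, a tap's timing can be mapped to a phase angle on the tone cycle and summarised by the mean resultant vector. The function name, the nearest-tone mapping, and the example timings below are illustrative assumptions; the study's exact pipeline is not reproduced here.

```python
import numpy as np

def circular_summary(tap_times, tone_times):
    """Sketch of a synchronisation index (mean resultant vector length, R)
    and mean circular asynchrony for one tapping trial, in the spirit of
    Mardia and Jupp (2000). Each tap is assigned to its nearest tone and
    converted to a phase angle within the tone cycle."""
    tap_times = np.asarray(tap_times, dtype=float)
    tone_times = np.asarray(tone_times, dtype=float)
    period = np.mean(np.diff(tone_times))            # inter-tone interval
    nearest = np.argmin(np.abs(tap_times[:, None] - tone_times[None, :]), axis=1)
    asynchrony = tap_times - tone_times[nearest]     # signed, in seconds
    phase = 2 * np.pi * asynchrony / period          # radians on the tone cycle
    vec = np.exp(1j * phase).mean()                  # mean resultant vector
    return np.abs(vec), np.angle(vec)                # R in [0, 1], mean angle

tones = np.arange(0, 6, 0.5)             # isochronous tones every 500 ms
taps = tones[1:11] + 0.05                # taps 50 ms after each tone
sync_index, mean_asynchrony = circular_summary(taps, tones)
```

Tightly phase-locked tapping yields R near 1; taps scattered evenly across the tone cycle drive R towards 0.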

Results

One-tailed Wilcoxon two-sample rank-sum tests were performed on the questionnaire data for synchronisation and likeability. Full details of the questionnaire data are given in Table 4.
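A rank-sum comparison of two independent groups of ratings can be sketched with SciPy's Mann–Whitney U test, which is equivalent to the Wilcoxon rank-sum test up to a shift of the statistic. The rating values below are invented for illustration; they are not the study's data.

```python
from scipy.stats import mannwhitneyu

# Hypothetical likeability ratings (1 = most likeable), one per participant
with_group = [1, 2, 2, 3, 1, 2]
between_group = [3, 4, 2, 3, 4, 3]

# One-tailed test of whether the "with" group gave lower (more likeable) ratings
stat, p = mannwhitneyu(with_group, between_group, alternative="less")
```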

In the human condition, significant differences were found between the "with" and "between" groups for ratings of synchronisation (p < 0.001, one-tailed) and likeability (p < 0.05, one-tailed; see Fig. 3), in agreement with Experiment 1. Twenty-seven participants out of 41 reported believing that they were interacting with a human.

Similar statistics were calculated for participants in the computer condition, and these demonstrated a significant difference between the groups for ratings of synchronisation (p < 0.001, one-tailed), but no difference between the group ratings of likeability, suggesting that this relationship depended on believing that the interaction was with a person. Only three participants out of 36 reported thinking their partner was a human in the computer condition.

When participants were told they were interacting with another person, a significant difference existed between ratings of success for those told to tap with tones compared with those told to tap between tones (p < 0.05, one-tailed), but not when people were told they were interacting with a computer. No difference was found in ratings of interest in the task in either the human condition or the computer condition (details in Table 4). Again, these results are largely in agreement with Experiment 1, although the higher ratings of success in the "with" condition warrant further investigation, described in the next section (Analyses of success).

Spearman's tests were again used to determine whether correlations existed between the synchronisation index and measures of affiliative behaviour (change in eye gaze and ratings of likeability). As in Experiment 1, a significant relationship existed between ratings of likeability and the synchronisation index for participants who were told they were interacting with a person (rho = -0.51, p = 0.0007, n = 41), but this was not the case for participants who were told they were interacting with a computer (rho = -0.09, p = 0.59, n = 36). Unlike in Experiment 1, there was no correlation between ratings of likeability and change in eye gaze in either the human condition (rho = 0.11, p = 0.51, n = 36) or the computer condition (rho = 0.08, p = 0.67, n = 30), suggesting that any changes in eye gaze did not relate to participants' subjective judgement of their partner. As in Experiment 1, no relationship existed between the synchronisation index and eye gaze change in either the human (rho = -0.13, p = 0.45, n = 36) or computer conditions (rho = 0.17, p = 0.36, n = 30).
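Correlations of this kind can be computed with Spearman's rank method. A minimal sketch follows; the numbers are invented, and note that because 1 = most likeable on the rating scale, a negative rho means that more stable synchronisation goes with higher likeability.

```python
from scipy.stats import spearmanr

# Hypothetical per-participant values
sync_index = [0.95, 0.85, 0.75, 0.60, 0.50]   # higher = more stable synchrony
likeability = [1, 2, 3, 4, 5]                 # 1 = most likeable

rho, p = spearmanr(sync_index, likeability)
```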

Eye data in the human condition were analysed in the same manner as in Experiment 1. This 2 (timepoint: before, after) × 2 (instruction: "with", "between") mixed ANOVA revealed only a main effect of timepoint, F(1, 34) = 21.5, p = 0.00005, η²G = 0.05, with no main effect of instruction, F(1, 34) = 1.25, p = 0.27, and no interaction between the two, F(1, 34) = 1.17, p = 0.29. This significant effect of timepoint indicates a decrease in eye contact from before the tapping phase to after the tapping phase in both conditions (details in Table 2). This means the result of Experiment 1 relating to changes in eye gaze was not replicated.²

Table 4 Results from comparisons of questionnaire data in Experiment 2

Condition  Rating                                    "With" M (SD)  "Between" M (SD)  Test statistic
Human      Synchronisation (1 = "at the same time")  2.81 (1.03)    4.10 (1.07)       W = 85, p = 0.0004 (one-tailed)
Human      Likeability (1 = "most likeable")         2.29 (1.06)    3.00 (1.21)       W = 140, p = 0.03 (one-tailed)
Human      Success (1 = "most successful")           2.67 (1.32)    3.60 (1.27)       W = 129.5, p = 0.02 (one-tailed)
Human      Interest (1 = "most interesting")         2.67 (1.31)    2.80 (1.51)       W = 204, p = 0.88 (two-tailed)
Computer   Synchronisation                           3.28 (0.67)    4.61 (1.03)       W = 48.5, p = 0.0001 (one-tailed)
Computer   Likeability                               3.06 (1.30)    2.89 (1.37)       W = 174.5, p = 0.70 (two-tailed)
Computer   Success                                   3.16 (1.39)    2.56 (1.29)       W = 204, p = 0.18 (two-tailed)
Computer   Interest                                  2.50 (1.15)    2.33 (1.19)       W = 178.5, p = 0.60 (two-tailed)

² Following null results relating to the change in virtual eye gaze in Experiment 2, further analyses were performed on the eye tracking data (on pupil dilation, variability of gaze within and outside the eye region of the image, and the time course of eye gaze change) to determine whether an effect was being masked by the virtual eye contact measure. These did not reveal any reliable differences between the "with" and "between" conditions. A third experiment was conducted as a replication of Experiment 1 using the minimum number of participants statistically required to demonstrate the effect identified in Experiment 1. As in Experiment 2, these results did not reveal any significant differences between eye gaze in the two instruction conditions and demonstrated a non-significant trend towards increases in eye gaze being associated with rating a partner as less likeable.


In the computer condition, a similar 2 (timepoint: before, after) × 2 (instruction: "with", "between") mixed ANOVA on eye data demonstrated a main effect of timepoint, F(1, 28) = 4.5, p = 0.04, η²G = 0.2, but also a main effect of instruction, F(1, 28) = 8.44, p = 0.007, η²G = 0.02, and no interaction between the two, F(1, 28) = 0.03, p = 0.87. The main effect of timepoint again suggests a decrease in virtual eye contact over the experiment. The main effect of instruction reflects the fact that participants in the "between" condition were already making significantly more virtual eye contact than participants in the "with" condition before this instruction was given, so this effect cannot be a consequence of the instruction.
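For reference, a 2 × 2 mixed ANOVA of this kind can be computed directly from its sums of squares. The balanced-groups implementation and example scores below are illustrative assumptions only (the study's groups were not all the same size, and the original analysis is not reproduced here).

```python
import numpy as np
from scipy import stats

def mixed_anova_2x2(group_a, group_b):
    """2 (between: group) x 2 (within: timepoint) mixed ANOVA for
    balanced groups. Each argument is an (n, 2) array of
    (before, after) scores. Returns {effect: (F, p)}."""
    data = [np.asarray(group_a, float), np.asarray(group_b, float)]
    n = data[0].shape[0]
    assert data[1].shape[0] == n, "groups must be balanced"
    y = np.stack(data)                      # (2 groups, n subjects, 2 timepoints)
    gm = y.mean()
    subj_m = y.mean(axis=2)                 # per-subject means
    grp_m = y.mean(axis=(1, 2))             # per-group means
    time_m = y.mean(axis=(0, 1))            # per-timepoint means
    cell_m = y.mean(axis=1)                 # group x timepoint cell means

    ss_group = 2 * n * np.sum((grp_m - gm) ** 2)
    ss_subj = 2 * np.sum((subj_m - grp_m[:, None]) ** 2)       # between error
    ss_time = 2 * n * np.sum((time_m - gm) ** 2)
    ss_inter = n * np.sum((cell_m - grp_m[:, None] - time_m[None, :] + gm) ** 2)
    resid = y - cell_m[:, None, :] - subj_m[:, :, None] + grp_m[:, None, None]
    ss_err = np.sum(resid ** 2)                                # within error

    df_sw = 2 * (n - 1)                     # subjects within groups
    df_err = 2 * (n - 1)                    # within-subject error
    f_group = ss_group / (ss_subj / df_sw)
    f_time = ss_time / (ss_err / df_err)
    f_inter = ss_inter / (ss_err / df_err)
    return {"group": (f_group, stats.f.sf(f_group, 1, df_sw)),
            "time": (f_time, stats.f.sf(f_time, 1, df_err)),
            "interaction": (f_inter, stats.f.sf(f_inter, 1, df_err))}

# Hypothetical (before, after) scores for two groups of four participants
res = mixed_anova_2x2([(10, 13), (11, 12), (9, 14), (10, 13)],
                      [(10, 12), (9, 13), (11, 13), (10, 14)])
```

With these example scores, only the timepoint effect is significant; `res["time"]` holds its (F, p) pair.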

In summary, Experiment 2 demonstrated the predicted relationship between the instructions given and likeability ratings. Eye gaze data did not show any differences according to the instructions given, and did not exhibit any relationship with likeability ratings. The synchronisation index correlated with likeability ratings, but only in the condition where participants were told they were interacting with another person.

Analyses of success

Experiment 2 revealed significant differences in participants' ratings of success if they were told they were tapping with another person. This is problematic, as it suggests that an effect of success may be combining with any effects caused by the synchronisation manipulation. It is therefore important to establish whether people were indeed more successful in the "with" condition than in the "between" condition, and this was done by comparing the synchronisation index between the "with" and "between" conditions. Results demonstrate that participants performed differently if they were told to tap with the tones (M = 0.88, SD = 0.07) compared with those told to tap between them (M = 0.60, SD = 0.27; Kolmogorov–Smirnov test: D = 0.61, p = 0.0006), with much larger variability in the behaviour of those told not to synchronise. This suggests that, in the human condition, the group told to tap with the tones were more successful than those told to tap between the tones.
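The two-sample Kolmogorov–Smirnov comparison of synchronisation indices can be sketched as follows; the index values are invented for illustration.

```python
from scipy.stats import ks_2samp

# Hypothetical synchronisation indices, one per participant
with_tones = [0.95, 0.90, 0.92, 0.88, 0.91, 0.89, 0.93, 0.90]
between_tones = [0.20, 0.50, 0.60, 0.40, 0.70, 0.30, 0.55, 0.45]

# D is the maximum distance between the two empirical CDFs, so it is
# sensitive to differences in spread as well as location
d_stat, p = ks_2samp(with_tones, between_tones)
```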

In the computer condition, there was no influence of synchronisation condition ("with" or "between") on ratings of success, and a more detailed analysis demonstrated why this was the case. While correlations show a relationship between the synchronisation index and ratings of success when participants were told they were tapping with a person (rho = -0.56, p = 0.0002, n = 41), this relationship did not exist when they were told they were tapping with a computer (rho = 0.03, p = 0.85, n = 36). However, a relationship of similar magnitude was identified between ratings of success and ratings of likeability in both conditions (human: rho = 0.43, p = 0.005, n = 41; computer: rho = 0.42, p = 0.001, n = 36). This critical finding suggests that the relationship between synchronisation and the experience of success was mediated by the belief that the interaction was with a person rather than a computer partner.

Discussion

The current study replicates Hove and Risen's (2009) findings relating to likeability ratings, but does so with a virtual partner, in the absence of visual contact with another person. Synchronisation with a partner led to higher ratings of likeability of that partner. This establishes that a relationship can exist between synchronisation with sounds attributed to another person and how that person is judged, even if they are not physically present, which has implications for the way that we understand associations between sound and the movements of other people. However, a reliable correlate of this was not identified in measures of eye gaze.

The key recurring result in this study was a difference in ratings of likeability found between the participants told to synchronise and those told not to synchronise, even when no negative instruction was given. On a general level, this supports the hypothesis that synchronising with other people can have a positive effect on how we feel about them (Hove and Risen 2009; Wiltermuth and Heath 2009; Valdesolo and DeSteno 2011). Importantly, however, the current study did not involve visual interpersonal contact with a partner during interaction, and compared two conditions in which participants were either moving at the same time as perceived events or in the gaps between perceived events. This supports the notion that the relationship between synchronisation and affiliative behaviour is not restricted to instances where people are moving in time with visually observable human motion, but can also exist in relation to sounds associated with that motion.

Fig. 3 Likeability ratings by condition and instruction in Experiment 2. Bars indicate standard error

Observing the movement of another person may evoke specific associations we have with making a similar movement ourselves. It has previously been shown that there are neural associations between perceiving human movements and producing similar movements (Gazzola et al. 2006; Engel et al. 2009). Most previous studies addressing affiliative effects of synchronisation (e.g. Hove and Risen 2009; Wiltermuth and Heath 2009; Valdesolo and DeSteno 2011) involved similar movements for all people involved, meaning a clear link could exist between enacted and observed movement. It is via this co-occurrence of perceived and performed movement that people are thought to experience a sense of social closeness to an interaction partner with whom they have synchronised (e.g. Sebanz et al. 2006; Overy and Molnar-Szakacs 2009). As the current study used sounds with no inherent or recently learnt associations with human movement, a relationship between perception and action could only exist if mediated by the belief that the sounds were created by similar movements of another person.

With regard to action–perception associations, the current result can be interpreted in two very different ways. A weak argument might state that the act of synchronising with any set of perceived events is sufficient to produce affiliative effects in the actor–perceiver. A stronger argument would say that the sounds used in the current experiment were (by virtue of being described as made by another person tapping) associated with the perceiver's own movement via action–perception networks. This association can create the link from action to perception, which may help produce some sense of union with the perceived other.

Ideally, the computer condition introduced in Experiment 2 would dissociate these two arguments. Differences in affiliative measures in both the human and computer conditions would suggest that associating movement with that of another person is not necessary to produce changes in social engagement. However, likeability ratings are not a useful measure in this instance, because people are unlikely to rate a computer as more or less likeable in any normal situation. While we cannot dismiss the weaker argument on the basis of our findings, the study by Hove and Risen (2009) showed that when people moved in time with a visual stimulus, but not with an experimenter, the kind of synchrony achieved did not affect how likeable the experimenter was reported to be. This suggests that belief in agency is relevant to relating synchronisation with affiliation.

This finding is particularly interesting for our understanding of why people enjoy music in the absence of interpersonal contact. Historically, music was always experienced in the presence of performers, and it is only in the last century or so that recorded music has been possible. It has often been suggested that music has the power to move us, both physically through dance and emotionally (Eitan and Granot 2006; Freedberg and Gallese 2007). However, this association with movement would most naturally come from the observation of, and engagement with, other people. Here, we have demonstrated that provided sound is associated with agency it can have social consequences, which might explain why music, as sound that has been triggered and organised by a person, is still engaging when experienced alone. We have not here disentangled the relative influence of attribution of agency and attribution of movement to sound, and this distinction remains important; when we enjoy music that we do not associate with human movement (e.g. computer music) but know has been created by a person, it may or may not have the social effects described here.

An important issue is that people did appear to experience different levels of success in the conditions where they were told to synchronise or not to synchronise. This is likely to be a confounding factor in experiments of this kind, as success may always influence positivity (Isen 1970). In the current experiment, despite the design of the task, synchronisation and success were not well distinguished. We must therefore conclude that the current result relating to likeability could include effects of both synchronisation and success. Critically, however, we found that success ratings only correlated with objectively measured synchronisation for participants who were told they were tapping with a person, and not for those who were told they were tapping with a computer. This is important because it suggests that success acts as a mediator of the relationship between synchronisation and affiliation in human interaction, but not after synchronisation with sound alone. While our current result relating to likeability may be a consequence of both synchronisation and success, we know that the experience of success relates to that of synchronisation with other people. Synchronisation is the critical factor influencing affiliation, but its effect may be partially mediated by the experience of success during human interaction.

Giving instructions to intentionally synchronise in the current study did not prevent the development of a relationship between synchronisation and affiliation, so this paradigm can be explored further in future work. Using the instructions "with" and "between" instead of "synchronise" and "don't synchronise" in the second experiment did not appear to have any qualitative effect on the likeability results. This means that the results cannot be explained simply by the positive and negative implications of the instructions given. During debriefing, no participant reported thinking that the experiment was about how synchronisation could relate to affiliative behaviour, so we can assume that general demand characteristics were not introduced by giving instructions on how to perform during the tapping task.

The current study did not find any reliable relationship between eye measures and affiliative behaviour. In Experiment 1, where the change in virtual eye contact did appear to differ between the two synchronisation groups, the measure demonstrated a correlation with ratings of likeability in the expected direction. However, in Experiment 2 and a further experiment not reported here (see footnote 2), this relationship did not exist, and the change in virtual eye contact was not significantly different between the two experimental groups. The correlation in the first experiment might therefore be incidental, and the significant difference in virtual eye contact identified between the two synchronisation groups could be an artefact of this relationship. Since this correlation was inconsistent, we conclude that the relationship between the eye gaze measure and affiliation was not reliable in the experimental conditions reported here, rather than that the decrease in virtual eye contact in the "synchronise" conditions of Experiment 2 indicates lower affiliative intentions.

Given that the literature suggested that eye contact might be an informative measure, it is likely that the current paradigm was too far removed from genuine interaction and did not encourage participants to engage with the video as they would with a real person. It is also feasible that eye gaze is only altered in certain conditions of social engagement, which may not be directly related to, or expressive of, liking. In the current study, participants were not told in advance that they should make any judgements about the person appearing in the video, which may have led them to attend to it less than they would to still images that they were required to evaluate in some way (as in e.g. Shimojo et al. 2003).

Conclusions

The current study identified a relationship between moving in time with sounds attributed to a human partner and ratings made of the likeability of that partner. Greater stability in synchronisation, as indicated by the synchronisation index, correlated with participants' judgements about their partner. When participants were informed that they were tapping along with a computer tapper, relationships between synchrony and the other dependent measures did not develop. This finding suggests that when people hear sounds associated with the movement of others, they are able to experience this in a social manner, without the need for any visual or physically experienced human presence.

References

Baron-Cohen S (1997) Mindblindness: an essay on autism and theory of mind. MIT Press, Cambridge

Buccino G, Binkofski F, Fink GR, Fadiga L, Fogassi L, Gallese V, Seitz RJ, Zilles K, Rizzolatti G, Freund HJ (2001) Action observation activates premotor and parietal areas in a somatotopic manner: an fMRI study. Eur J Neurosci 13(2):400-404

Chartrand TL, Bargh JA (1999) The chameleon effect: the perception-behavior link and social interaction. J Pers Soc Psychol 76(6):893-910

Chartrand TL, Maddux WW, Lakin JL (2005) Beyond the perception-behavior link: the ubiquitous utility and motivational moderators of nonconscious mimicry. In: Hassin RR, Uleman JS, Bargh JA (eds) The new unconscious. Oxford University Press, New York, pp 334-361

Demos AP, Chaffin R, Begosh KT, Daniels JR, Marsh KL (2012) Rocking to the beat: effects of music and partner's movements on spontaneous interpersonal coordination. J Exp Psychol Gen 141(1):49-53

Eitan Z, Granot RY (2006) How music moves. Music Percept 23(3):221-248

Engel LR, Frum C, Puce A, Walker NA, Lewis JW (2009) Different categories of living and non-living sound-sources activate distinct cortical networks. Neuroimage 47(4):1778-1791

Fadiga L, Fogassi L, Pavesi G, Rizzolatti G (1995) Motor facilitation during action observation: a magnetic stimulation study. J Neurophysiol 73(6):2608

Filippin L (2011) T2T Package. http://psy.ck.sissa.it/t2t/About_T2T.html

Freedberg D, Gallese V (2007) Motion, emotion and empathy in esthetic experience. Trends Cogn Sci 11(5):197-203

Gazzola V, Aziz-Zadeh L, Keysers C (2006) Empathy and the somatotopic auditory mirror system in humans. Curr Biol 16(18):1824-1829

Guastella AJ, Mitchell PB, Dadds MR (2008a) Oxytocin increases gaze to the eye region of human faces. Biol Psychiat 63(1):3-5

Guastella AJ, Mitchell PB, Mathews F (2008b) Oxytocin enhances the encoding of positive social memories in humans. Biol Psychiat 64(3):256-258

Hagen EH, Bryant GA (2003) Music and dance as a coalition signaling system. Hum Nat 14(1):21-51

Hove MJ, Risen JL (2009) It's all in the timing: interpersonal synchrony increases affiliation. Soc Cogn 27(6):949-960

Iacoboni M (2005) Neural mechanisms of imitation. Curr Opin Neurobiol 15(6):632-637

Isen AM (1970) Success, failure, attention, and reaction to others: the warm glow of success. J Pers Soc Psychol 15(4):294-301

Issartel J, Marin L, Cadopi M (2007) Unintended interpersonal co-ordination: "Can we march to the beat of our own drum?". Neurosci Lett 411(3):174-179

Kirschner S, Tomasello M (2009) Joint drumming: social context facilitates synchronization in preschool children. J Exp Child Psychol 102(3):299-314

Kleinke CL (1986) Gaze and eye contact: a research review. Psychol Bull 100(1):78-100

Kokal I, Engel A, Kirschner S, Keysers C (2011) Synchronized drumming enhances activity in the caudate and facilitates prosocial commitment, if the rhythm comes easy. PLoS ONE 6(11):e27272. doi:10.1371/journal.pone.0027272

Konvalinka I, Vuust P, Roepstorff A, Frith CD (2010) Follow you, follow me: continuous mutual prediction and adaptation in joint tapping. Q J Exp Psychol 63(11):2220-2230

LaFrance M (1979) Nonverbal synchrony and rapport: analysis by the cross-lag panel technique. Soc Psychol 42(1):66-70

Lakens D (2010) Movement synchrony and perceived entitativity. J Exp Soc Psychol 46(5):701-708

Lakin JL, Jefferis VE, Cheng CM, Chartrand TL (2003) The chameleon effect as social glue: evidence for the evolutionary significance of nonconscious mimicry. J Nonverbal Behav 27(3):145-162

Launay J, Dean RT, Bailes F (2013) Synchronization can influence trust following virtual interaction. Exp Psychol 60(1):53

Madison G, Merker B (2002) On the limits of anisochrony in pulse attribution. Psychol Res 66(3):201-207

Mardia KV, Jupp PE (2000) Directional statistics. Wiley, Chichester

Miles LK, Griffiths JL, Richardson MJ, Macrae CN (2009a) Too late to coordinate: contextual influences on behavioral synchrony. Eur J Soc Psychol 40(1):52-60

Miles LK, Nind LK, Macrae CN (2009b) The rhythm of rapport: interpersonal synchrony and social perception. J Exp Soc Psychol 45(3):585-589

Miles LK, Lumsden J, Richardson MJ, Macrae CN (2011) Do birds of a feather move together? Group membership and behavioral synchrony. Exp Brain Res 211(3-4):495-503

Modigliani A (1971) Embarrassment, facework, and eye contact: testing a theory of embarrassment. J Pers Soc Psychol 17(1):15-24

Moukheiber A, Rautureau G, Perez-Diaz F, Soussignan R, Dubal S, Jouvent R, Pelissolo A (2009) Gaze avoidance in social phobia: objective measure and correlates. Behav Res Ther 48(2):147-151

Ollen J (2006) A criterion-related validity test of selected indicators of musical sophistication using expert ratings. The Ohio State University. http://www.ohiolink.edu/etd/view.cgi?osu1161705351

Oullier O, De Guzman GC, Jantzen KJ, Lagarde J, Kelso JAS (2008) Social coordination dynamics: measuring human bonding. Soc Neurosci 3(2):178-192

Overy K, Molnar-Szakacs I (2009) Being together in time: musical experience and the mirror neuron system. Music Percept 26(5):489-504

Pinkham AE, Hopfinger JB, Pelphrey KA, Piven J, Penn DL (2008) Neural bases for impaired social cognition in schizophrenia and autism spectrum disorders. Schizophr Res 99(1-3):164-175

Rizzolatti G (2005) The mirror neuron system and its function in humans. Anat Embryol 210(5):419-421

Sebanz N, Bekkering H, Knoblich G (2006) Joint action: bodies and minds moving together. Trends Cogn Sci 10(2):70-76

Semin GR, Cacioppo JT (2008) Grounding social cognition: synchronization, coordination and co-regulation. In: Semin GR, Smith ER (eds) Embodied grounding: social, cognitive, affective and neuroscientific approaches. Cambridge University Press, Cambridge, pp 119-147

Shimojo S, Simion C, Shimojo E, Scheier C (2003) Gaze bias both reflects and influences preference. Nat Neurosci 6(12):1317-1322

Sommerville JA, Decety J (2006) Weaving the fabric of social interaction: articulating developmental psychology and cognitive neuroscience in the domain of motor cognition. Psychon Bull Rev 13(2):179-200

Steinbeis N, Koelsch S (2009) Understanding the intentions behind man-made products elicits neural activity in areas dedicated to mental state attribution. Cereb Cortex 19(3):619-623

Tognoli E, Lagarde J, DeGuzman GC, Kelso JAS (2007) The phi complex as a neuromarker of human social coordination. Proc Natl Acad Sci USA 104(19):8190-8195

Valdesolo P, DeSteno D (2011) Synchrony and the social tuning of compassion. Emotion 11(2):262-266

Watkins K, Strafella A, Paus T (2003) Seeing and hearing speech excites the motor system involved in speech production. Neuropsychologia 41(8):989-994

Wiltermuth SS (2012) Synchrony and destructive obedience. Soc Influ 7(2):78-89

Wiltermuth SS, Heath C (2009) Synchrony and cooperation. Psychol Sci 20(1):1-5

