
© Copyright JASSS

Alexis Kirke and Eduardo Miranda (2015)

A Multi-Agent Emotional Society Whose Melodies Represent its Emergent Social Hierarchy and Are Generated by Agent Communications

Journal of Artificial Societies and Social Simulation 18 (2) 16 <http://jasss.soc.surrey.ac.uk/18/2/16.html>

Received: 29-Nov-2013    Accepted: 26-Oct-2014    Published: 31-Mar-2015

Abstract

In this article a multi-agent system is presented which generates melody pitch sequences with a hierarchical structure. The agents have no explicit melodic intelligence and generate the pitches as a result of artificial emotional influence and communication between agents, and the melody's hierarchical structure is a result of the emerging agent social structure. The system is not a mapping from multi-agent interaction onto musical features, but actually utilizes music for the agents to communicate artificial emotions. Each agent in the society learns its own growing tune during the interaction process. Experiments are presented demonstrating that diverse and non-trivial melodies can be generated, as well as a hierarchical musical structure.

Keywords: Social Networks, Music, Emotion

Introduction

1.1 The generation of novel music is at the heart of many computer-aided composition (CAC) systems. Without some way of generating new material, a CAC will churn out the same material time after time. To avoid this, many systems utilize random numbers. A more recent alternative is the generation of complex structures which are ordered but unpredictable. Popular types of systems that generate structures with such complexity are found in the field of artificial life or A-Life (Brown 2002). A-Life investigates systems related to life, their processes, and evolution; it does this most often through computer simulations and models – for example cellular automata. Many A-Life systems have two elements in common which have made them attractive to composers for use in CAC: they generate complex data with order and structure, and they inspire composers by the variety of patterns in the data (Panzarasa & Jennings 2006). So although A-Life systems can generate unexpected behaviour, there is an inherent order – they are not solely random. This is often called emergent behaviour.

1.2 One field which has a large intersection with artificial life is multi-agent systems (MAS), which is one of the two key areas utilized in this article. Each agent in an MAS is a digital entity which can interact with other agents to solve problems as a group, though not necessarily in an explicitly co-ordinated way. What often separates agent-based approaches from normal object-oriented or modular systems is their emergent behaviour (Dahlstedt & McBurney 2006). The solution of the problem tackled by the agents is often generated in an unexpected way due to their complex interactional dynamics, though individual agents may not be that complex. As with the application of other A-Life systems in CAC, these social dynamics can be both artistically functional – for example each agent in an ensemble can contribute a motif or play an artificial instrument in a piece of music – or artistically motivational, inspiring an algorithmic composer to produce the music of artificial societies.

1.3 In this article a multi-agent system is presented which generates melody pitch sequences with a hierarchical structure. The agents have no explicit melodic intelligence and generate the pitches as a result of artificial emotional influence and communication between agents, and the music's hierarchical structure is a result of the emerging agent social structure. Another key element is that the system is not a mapping from multi-agent interaction onto musical features, but actually utilizes music for the agents to communicate artificial emotions. Each agent in the society learns its own growing tune during the interaction process.

Related Work

2.1 A number of systems with similarities to the one in this paper are now examined in detail. Before doing that, a brief overview of more general multi-agent music systems is given using Table 1. These are not examined in detail but the table is designed to give quick familiarity with a number of key issues found in musical multi-agent systems. The fields will now be explained. Complexity describes the level of processing in individual agents – how complex are they? Homog/Het indicates whether the agents in the MAS are homogeneous or heterogeneous – i.e. do agents all start out the same, or are some different? Comm indicates whether the agents communicate, and if so whether they do it synchronously or asynchronously; i.e. do they take it in turns to communicate and process, or do they do it concurrently? Initial Hierarchy describes whether there is a hierarchy of planning/control for the agents; are some agents dependent on others? Can some agents control others? Tune indicates whether the system generates multiple composition alternatives when it completes processing, or a single composition. Real-time describes whether, when the agents are activated, the music is generated in real-time. Size gives – where available and relevant – the number, or average number, of agents in the system. Finally Model/Func indicates whether the system is designed solely to model some element of music, or as a computer-aided composition system. Many of the above properties are also key defining features of non-musical MAS.


2.2 The system in this article is a non-realtime system which works with a small to medium number of agents – i.e. not hundreds of agents; it generates multiple tunes in parallel; and it is focused on computer-aided composition, not on modelling the composition process or musical culture.

Table 1: Musical Multi-Agent Systems

System                                   Complexity  Homog/Het  Comm   Tune  Initial Hierarchy  Realtime  Size  Model/Func
Swarm Music (Blackwell & Bentley 2002)   Low         Het        No     1     Flat               Y         21    F
Ant Colony Music (Clair et al. 2008)     Low         Homog      No     1     Flat               Y               F
Swarm Orchestra (Bisig & Neukom 2008)    Low         Homog      No     1     Flat               Y               F
Society of Music Agents (Beyls 2007)     Low         Homog      Sync   1     Flat               N               F
MMAS (Wulfhost et al. 2003)              Higher      Het        Async  1     Flat               Y         8     F
Musical Agents (Fonseka 2000)            Higher      Het        Async  1     Flat               Y               F
Andante (Ueda & Kon 2003)                Higher      Het        Async  1     Flat               Y               F
VirtuaLatin (Murray-Rust et al. 2005)    Higher      Het        Sync   1     Hierarchy          N         1     F
MAMA (Murray-Rust and Smaill 2005)       Higher      Het        Async  1     Hierarchy          Y               F
Kinetic Engine (Eigenfeldt 2009)         Higher      Het        Async  1     Hierarchy          Y               F
CinBalada (Sampaio et al. 2008)          Higher      Het        Async  1     Flat               N               F
AALIVE (Spicer et al. 2003)              Higher      Het        Async  1     Hierarchy          Y               F

2.3 The systems which are closest to the one in this article (and not listed in Table 1) are now examined in more detail. The Dahlstedt and McBurney (2006) system uses agents which have different explicit goals that represent different parts of the process of music composition. An example is given of an agent whose goal is to reduce sound object density if the population of the system's sound landscape becomes too cluttered; another is given of an agent who does the opposite. Both agents would take into account the musical context while doing this. The researchers explicitly intend to utilise emergence to generate interesting music. This is a similarity with the system in this article, though key differences are that the Dahlstedt and McBurney agents act on a single music composition together, whereas agents in this article each have their own repertoires which can develop in parallel, and do not have explicit and distinct goals.

2.4 Miranda's (2002) system generates musical motifs in a way designed to study the evolution of culture. In this case the agents use a two-way imitation procedure to bond socially. Agents can store a repertoire of tunes and have a basic biological model of an adaptive voice box and auditory system. Agents pick other agents to interact with randomly.

2.5 When two agents A and B interact the following process occurs: if agent A has tunes in its repertoire it picks one randomly and sings it; if not then it sings a random tune. These tunes are three notes long and do not grow in length. Agent B compares the tune from A to its own repertoire and if it finds one similar enough, plays it back to agent A as an attempted imitation. Agent A then makes a judgement about how good the imitation is. If it is satisfied with the imitation it makes a "re-assuring" noise back to agent B, otherwise it does not. Based on the success of the imitation agents A and B update their repertoires and their voice box settings to try to improve their chances of socially bonding in later interactions – e.g. by deleting or re-enforcing tunes, or making random deviations to their voice box parameters. The aim of the system is to see how the repertoire is generated and affected under such social pressures. As a result of the social bonding interactions a community repertoire was found to emerge.

2.6 Gong et al. (2005) produced a simple music generating system with a similar purpose to Miranda (2002) – investigating the emergence of musical culture. The agents start with a set of random motifs, with different agents being equipped with distinct but very simple aesthetic evaluation functions (for rhythm, pitch, etc.). An agent plays its tune to another agent and if the second agent finds the tune unpleasant, it modifies it (based on its musical evaluation), and plays it back to the first agent. If the first agent thinks the modified tune is better than its original, it deletes its original and stores the modified version. As agents interact this leads to "more pleasant" motifs emerging. Also, using an interaction-history measure, the social link between first and second agent is strengthened so that they are more likely to interact in the future. However if the first agent does not prefer the modified tune to its own version, it discards it and the link between the two agents is not strengthened. It was found that in the emergent social network the agents tended to cluster according to their aesthetic preference function. This system has a couple of similarities to the one in this article: it utilizes MAS social network/trust techniques (Ramchurn et al. 2004) to decide who interacts with whom, and in each interaction agents vary their repertoire based on their opinion of the other agent's repertoire. The key differences are that agents in this article have no explicit evaluative melodic intelligence and can extend the number of notes in their repertoire; and finally the social network in this article is used to generate hierarchical music structure within an agent's repertoire, not to experiment with the clustering of agents according to their repertoires.

2.7 The A-Rhythm (Martins & Miranda 2007) system sets out to examine the application of multi-agent systems to algorithmic composition. Current reports focus, like Miranda and Gong et al., on investigating the emergence of social clusters, and are solely based on rhythmic repertoire. A-Rhythm has some similarities to the system in this article: the agents communicate and process one at a time serially, rather than in parallel, and their musical content grows longer. However A-Rhythm focuses on rhythm, i.e. is non-pitched. Also the similarity measures are more directly based on the music, rather than the affective content of the music. Finally A-Rhythm uses measures for the popularity of rhythms in an agent's repertoire, but not for the popularity/trust of agents. Agents in this system can transform their repertoires based on interaction – using certain rhythmic transformation rules, rather than the affective-based transformations used in this article. A number of experiments are done based on different interaction approaches, and the resulting population and repertoire dynamics are examined, showing the potential for the emergence of structured rhythmic repertoires.

Multi-agent Affective Social Composition System: MASC

Overview


3.1 The system in this article – the multi-agent affective social composition system (MASC) – is now presented in overview. Agents in MASC are initialized with a tune containing a single note, and over the interaction period each agent builds longer tunes through interaction. Figure 1 shows an overview representation of a collection of agents.

Figure 1. Six MASC agents in a variety of affective states with one agent performing.

3.2 The following are some of the key features of the system. MASC usually consists of a small to medium sized collection of agents – 2 to 16, though it can be more. Each agent can perform monophonic MIDI tunes and learn monophonic tunes from other agents. An agent has an affective state, an artificial emotional state which affects how it performs the music to other agents; e.g. a "happy" agent will perform its music more "happily". An agent's affective state is in turn affected by the affective content of the music performed to it; e.g. if "sad" music is performed to a happy agent, the agent will become a little "more sad". Agents can be made to only learn tunes performed to them if the affective content of the tune is similar enough to their current affective state; learned tunes are added to the end of their current tune. Agents develop opinions/trust of other agents that perform to them, depending on how much the other agents can help their tunes grow. These opinions affect who they interact with in the future.

Affective Models

3.3 Before going into the data structures within each agent in detail, the issue of affective models will be covered. There is a variety of approaches for affective representation which can be broadly divided into the Dimensional type and the Category type (Zentner et al. 2008). Category approaches range from basic emotion definitions – which assume that some emotions are more fundamental than others and attempt to list these – to the more everyday emotion label systems – which do not attempt to categorize based on an emotion hierarchy. A recent category-based approach for emotion is the Geneva Emotion Music Scales (GEMS) approach (Zentner et al. 2008) which attempts to provide categories optimal for musical emotion. This is done by first investigating through psychological tests which sorts of emotion are most commonly expressed to people by music. Although GEMS does get users to score the category with an integer from 1 to 5, the fact it has up to 45 categories puts it more in the realm of the categorical than the dimensional systems now discussed.

3.4 The Dimensional approach to specifying emotion utilizes an n-dimensional space made up of emotion "factors". Any emotion can be plotted as some combination of these factors. For example, in this paper, two dimensions are used: Valence and Arousal (Lang 1995). In this model, emotions are plotted on a graph with the first dimension being how positive or negative the emotion is (Valence), and the second dimension being how intense the physical arousal of the emotion is (Arousal). This is shown in Figure 2. Just as category approaches would not claim to list all possible emotions, so dimensional approaches do not claim to be complete. It is not known if emotions can be pinpointed based on unique independent dimensions. Other dimensional approaches include the three-dimensional valence/arousal/dominance system (Oehme et al. 2007). In this case Valence and Arousal have the same meaning as in the 2D version. However in the 2D approach Fear and Anger are both low valence, high arousal. In the 3D version, Dominance differentiates emotions such as anger (high dominance) and fear (low dominance); anger can be seen as more of an active emotion, fear as more of a re-active one. There are also examples such as Canazza et al. (2004) where a task-specific mood-space is constructed for expressive performance using experiments and principal component analysis. In that particular case the dimensions are not explicit.

Figure 2. The Valence/Arousal Model of Emotion

Agent Data Structures

3.5 Each agent contains three data structures. The first is an agent tune, a monophonic tune in MIDI format. The second is an agent affective state – a number pair [valence, arousal] representing the artificial affective state of the agent based on the valence/arousal model of affectivity. This is the most common dimensional affective representation in computer music. As has been mentioned, Valence refers to the positivity or negativity of an emotion – e.g. a high valence emotion is joy or contentment, a low valence one is sadness or anger. Arousal refers to the arousal level of the emotion – for example joy has a higher arousal than happiness, though both have high valence, and anger a higher arousal than sadness, though both have low valence. A linguistic element needs to be clarified. Affective labels such as happy and sad are used to assist clarity in introducing the reader to the concepts of MASC; they are not meant to be taken literally. For example happy refers to a region of high valence and arousal values, and sad refers to a region of low valence and arousal values. The same goes for any words which may seem to imply that agents have any kind of personification, or deeper intentional or biological model. Such language is merely a shorthand for clarifying functionality.

3.6 The third and final agent data structure is an interaction coefficient list, which is a list of interaction coefficients of all the other agents in the collection. These are non-negative floating point numbers which measure how popular the agent finds each of the other agents. The concept of interaction coefficient is used here to attempt to create emergent compositional hierarchies, as will be demonstrated. Another way of thinking of the interaction coefficient at this point is to consider an imagined motivation for an agent. The aim of MASC is – starting with each agent having a single note – to build actual melodies. So an agent should want notes. An agent A's interaction coefficient measure of another, say agent B, is based on the note count and number of performances it has added from B to its own tune.
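The three data structures above can be sketched as a simple container. This is a minimal illustration; the class and field names are assumptions, not taken from the original implementation:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Agent:
    """Sketch of the three MASC agent data structures (illustrative names)."""
    name: str
    tune: List[int] = field(default_factory=list)   # monophonic tune as MIDI pitches
    valence: float = 0.0                            # affective state pair in [-1, 1]
    arousal: float = 0.0
    # non-negative popularity measure of every other agent
    interaction_coeffs: Dict[str, float] = field(default_factory=dict)

    def update_interaction_coeff(self, other: str, notes_added: int) -> None:
        # One possible realisation: an agent's opinion of another grows with
        # the notes that agent has contributed to its own tune.
        self.interaction_coeffs[other] = (
            self.interaction_coeffs.get(other, 0.0) + notes_added
        )

a = Agent("A", tune=[60])            # each agent starts with a single note
a.update_interaction_coeff("B", 3)   # B helped A's tune grow by 3 notes
```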

3.7 An agent also has a number of internal processing functions. The performance output choice function involves an agent choosing who to perform to, based on the agent's interaction coefficient list of other agents. It will only perform to agents it finds useful enough. The performance output transform function involves the agent playing its single stored tune as a performance to another agent, with musical features based on its own current affective state. The performance input estimate function allows the agent to estimate the affective content of a tune performed to it by another agent, and adjust its own internal affective state based on the affective content. An agent's performance input choice function involves it deciding whether to store a performance from another agent, and is based on: (a) the affective content of that performance, and (b) how many notes are in the listening agent's current tune – an agent has a finite tune length memory which can fill up. The performance input interaction coefficient function lets the agent update its interaction coefficient measure of another agent based on that agent's performance. Finally the performance input add function lets the agent store a performance by concatenating it to the end of its current tune. An example interaction cycle is shown in Figure 3. This cycle is repeated until the desired compositional result is reached.

Figure 3. Example Interaction Cycle
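One interaction cycle can be sketched as follows, with stand-in implementations of the functions named in 3.7. All helper logic, thresholds, and the toy affective estimator here are illustrative assumptions, not the system's actual rules:

```python
import random

def interaction_cycle(agents, max_tune_len=64):
    """One sketched MASC interaction cycle over agents stored as dicts."""
    performer = random.choice(agents)
    listeners = [a for a in agents if a is not performer]
    # performance output choice: favour the agent found most useful so far
    listener = max(listeners, key=lambda a: performer["coeffs"].get(a["name"], 0.0))
    # performance output transform: colour the tune by affective state (toy rule)
    performance = [p + round(2 * performer["valence"]) for p in performer["tune"]]
    # performance input estimate: listener estimates affective content (toy rule)
    est_valence = (sum(performance) / len(performance) - 60) / 12
    # performance input choice: store only if affectively similar and memory not full
    if (abs(est_valence - listener["valence"]) < 0.5
            and len(listener["tune"]) + len(performance) <= max_tune_len):
        listener["tune"].extend(performance)           # performance input add
        # performance input interaction coefficient: credit the performer
        listener["coeffs"][performer["name"]] = (
            listener["coeffs"].get(performer["name"], 0.0) + len(performance)
        )
    # listener's affective state drifts toward the estimate
    listener["valence"] += 0.1 * (est_valence - listener["valence"])
    return listener

random.seed(0)
agents = [{"name": "A", "tune": [60], "valence": 0.0, "coeffs": {}},
          {"name": "B", "tune": [60], "valence": 0.0, "coeffs": {}}]
listener = interaction_cycle(agents)
```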

Performance Output Transform Function

4.1 The function for performance output transform will now be examined in more detail. Before performing its tune to another agent, an agent will transform its tune in a compositional way.

Compositional Transforms

4.2 Two types of compositional transformations are applied – linear feature transforms and a key mode transform into C major or C minor. The aim of this work was to investigate the effects of multi-agent emergent effects, rather than the transformations themselves – hence the transformations were kept as simple as possible, forgoing non-linearity and psychophysical accuracy. They can be compared to the simplistic linear rules used in swarm or flocking systems (Reynolds 1987). Clearly such linear rules are an over-simplification of bird/insect biology and psychology. However, what is of interest is that such simple rules can create such complex dynamics. The underlying elements being simplified here are the relationships between music and emotion. This area has been meta-surveyed in Livingstone et al. (2010), which then established a series of rules for transforming music to express emotions. These rules are shown in Table 2.

Table 2: Transformation Rules proposed in (Livingstone et al. 2010)

Emotion Label  Valence  Arousal  Tempo            Loudness      Pitch  Keymode
Happy          Higher   Higher   Increase 10 BPM  Increase 5db  +4     Major
Angry          Lower    Higher   Increase 10 BPM  Increase 7db  0      Minor
Sad            Lower    Lower    Decrease 15 BPM  Decrease 5db  -4     Minor
Tender         Higher   Lower    Decrease 20 BPM  Decrease 7db  +4     Major

4.3 Equations (1) to (4) show the result of transforming these into simplified linear rules for multi-agent system interaction. The process of this simplification is explained below, after their presentation. When an agent A is about to perform and has a particular level of valence, written valenceA, and arousal, written arousalA, it will first compositionally transform its stored tune based on the effects of equations (1) to (4). The primed values on the left hand side of the equations are the defining features of the compositionally transformed music, and are used to unambiguously generate a transformed MIDI file. The pre-transformation values IOIi(A), duri(A), loudi(A), and pitchi(A) are: the inter-onset interval between note i and the next note i+1, the note duration in seconds, the MIDI loudness, and the MIDI pitch of the i-th musical note of agent A's stored tune. The theta values – θIOI, θloud, and θpitch – define the affective sensitivity of the transformation – i.e. how much effect a change in agent A's valence or arousal will have on the transformation. They are the maximum variation percentage bars around the current feature value.

IOIi(A)' = IOIi(A)(1 − θIOI arousalA)    (1)

duri(A)' = duri(A)(1 − θIOI arousalA)    (2)

(3)

(4)

4.4 For example if θIOI is 0.25, then by Equation (1) the onset will vary from 25% below its current value to 25% above its current value as arousal varies from -1 to 1. If a transformation goes above the maximum MIDI value (127) then it is set to 127. Similarly if it goes below 1 it is set to 1. Note that θIOI is used both for onsets and duration so that as gaps between notes are increased or decreased, the duration of the same notes is increased and decreased by the same amount.
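As an illustration, equations (1) and (2) together with the MIDI clamping just described can be written as follows. This is a sketch; the function names are my own, not the system's:

```python
def transform_performance(ioi, dur, arousal, theta_ioi=0.25):
    """Equations (1) and (2): higher arousal shrinks inter-onset intervals
    (a faster tempo) and note durations by the same proportion. theta_ioi is
    the maximum variation fraction around the current value, so with 0.25 the
    features vary +/-25% as arousal spans -1 to 1."""
    scale = 1.0 - theta_ioi * arousal
    return [i * scale for i in ioi], [d * scale for d in dur]

def clamp_midi(value):
    """Clamp a transformed MIDI loudness/pitch value into the 1..127 range."""
    return max(1, min(127, round(value)))

# arousal = 1 with theta_ioi = 0.25 compresses IOIs and durations by 25%
ioi2, dur2 = transform_performance([0.5], [0.4], arousal=1.0)
```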

4.5 The mapping between Table 2 and Equations (1) to (4) is explained as follows. A smaller average inter-onset interval in a piece of music leads to a higher tempo (Dixon 2010), and equation (1) means that a higher arousal will create a smaller inter-onset interval, and thus a higher tempo. This approximately captures the linear dynamics of the tempo column in Table 2, where tempo changes in the same direction as arousal, but is not affected by valence. Duration in (2) is changed proportionally to inter-onset interval – so that when notes are closer together (due to a higher tempo) they will be proportionally shorter, as would be expected.

4.6 Equation (3) is a linear simplification of the fact that Table 2 shows how changing valence tends to cause loudness to increase in the same direction, and changing arousal also tends to cause loudness to increase in the same direction. Equation (4) is based on the fact that Table 2 shows pitch to be changed by changes in valence and arousal, but slightly more so by changes in valence.

4.7 Table 2 was not originally a prescription for a linear model, in the sense that it says nothing about what happens in between the four states. Even if the valence/arousal model of emotion were complete (which clearly it is not) there would certainly be non-linear behavior between the origin and the four points referenced in this table. This is one reason why no attempt was made in the equations to calculate precise linear correlation coefficients.

4.8 So the equations are not meant to be an accurate linear model of the behavior in Table 2, but the simplest possible linear model of music features and emotion informed by Table 2. The model does not claim that all high pitched music is higher valence, or that all low tempo music is low arousal. Contra-examples can be found for both of these in music: for example high pitched melancholy violins, or slow but grand and inspiring orchestral pieces. There are however no complete models for music and emotion, just as there are no complete models for bird and insect flocking. For example, you can keep adding or removing birds from a flocking model, and the flock will increase or decrease in size in an unlimited way – obviously a contra-example of the flocking model. Hence it is argued that the incompleteness of the above linear model is acceptable for the purposes of the work here presented.

4.9 One music feature which cannot be changed linearly is key mode, so a different – but still simple – approach is used. It is largely based on Table 2 but with one adjustment. For positive emotion a major key is utilized and for negative valence with negative arousal (e.g. sadness), a minor key is utilized. For negative valence and positive arousal – e.g. anger or fear – each note in the tune is transformed to C minor then moved alternately up or down a semitone; this is designed to inject an atonal element into the music. For example the sequence "C Eb D F Eb G C" would become "Db D Eb E E Gb Db". This is based on the idea that fear and anger can be represented by atonality (Chong et al. 2013). This will impact the effect of Equation (4) which also raises and lowers pitch due to valence. However the changes in pitch due to valence in (4) – in the experiments detailed later – are of a significantly greater order than one semi-tone. Thus the impact of the atonal transformation on (4) is minimal. Also equation (4) is a linear simplification, so there is no claim to it being accurate to within a semi-tone in terms of its valence representation.
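The alternating semitone move is simple to state precisely in MIDI pitch numbers (60 = C, 63 = Eb, and so on); a minimal sketch:

```python
def atonal_transform(pitches_in_c_minor):
    """For negative valence with positive arousal (4.9): take a tune already
    mapped to C minor and move alternate notes up and down a semitone,
    starting with up, to inject an atonal element."""
    return [p + 1 if i % 2 == 0 else p - 1
            for i, p in enumerate(pitches_in_c_minor)]

# The article's example: "C Eb D F Eb G C" -> "Db D Eb E E Gb Db"
result = atonal_transform([60, 63, 62, 65, 63, 67, 60])
print(result)  # [61, 62, 63, 64, 64, 66, 61]
```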

4.10 The transform is algorithmic and deterministic – it searches either side of the current notes for a note in the new mode which does not violate a MIDI boundary – i.e. is not out of the MIDI 128 parameter range. So suppose an agent A has stored a tune from a happy agent which is in a major key. If agent A then performs its tune while sad it will convert all of its tune, including the major part it received from another agent, into the minor mode. The current version in this article has no ability for actual key composition functionality, hence the reason for using only C major and C minor.

Tune Affective Estimation Function

5.1 A linear equation is used to model the listening agent's (say agent B's) affective estimate of a performance by agent A – this is shown in equations (5) and (6).

valenceEstB = xp mean(pitchA) + xl mean(loudA) + xk keyModeA + xIOI mean(IOIA) + x0    (5)

arousalEstB = yp mean(pitchA) + yl mean(loudA) + yIOI mean(IOIA) + y0    (6)


In these equations pitchA and loudA refer to the average MIDI pitch and MIDI loudness of an agent A's performance heard by B. keyModeA represents the key mode of A's tune estimated using a key profile-based algorithm (Krumhansl & Kessler 1982), defined as having value 2 for a minor key, and 1 for a major key. A key profile is a vector of 12 elements, one for each note in the scale. Each key has its own profile. Pitches which fit well into a scale have a higher weighting in its key profile vector, and those which fit less well have a lower weighting. These weightings were calculated for each key from perceptual experiments in Krumhansl and Kessler (1982). Thus they can be used to find whether a particular set of notes fits best into a major or a minor key.
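A minimal sketch of key mode estimation from the Krumhansl & Kessler profiles. The profile values below are the published ones; restricting the comparison to the untransposed C major/C minor profiles is an assumption that matches the system's use of only those two keys (the full algorithm would test all 24 transposed profiles):

```python
from statistics import mean

# Krumhansl & Kessler (1982) C major / C minor key profiles, pitch classes C..B
MAJOR = [6.35, 2.23, 3.48, 2.33, 4.38, 4.09, 2.52, 5.19, 2.39, 3.66, 2.29, 2.88]
MINOR = [6.33, 2.68, 3.52, 5.38, 2.60, 3.53, 2.54, 4.75, 3.98, 2.69, 3.34, 3.17]

def pearson(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) *
           sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def key_mode(midi_pitches):
    """Return 1 (major) or 2 (minor), the encoding used in equation (5), by
    correlating the tune's pitch-class histogram with each profile."""
    hist = [0] * 12
    for p in midi_pitches:
        hist[p % 12] += 1
    return 1 if pearson(hist, MAJOR) >= pearson(hist, MINOR) else 2
```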

5.2 The x and y coefficients in the equations are constants estimated by linear regression. These are estimated in a one-off process as follows. A set of 1920 random MIDI files was generated, of random lengths between 1 and 128 notes. Each MIDI file was transformed for 10 given and equally spaced valence and arousal values between −1 and 1 using transformation equations (1) to (4), and key mode transformations.

5.3 Then a linear regression was run on the resulting transformed MIDI files against the known arousal and valence values – based on equations (5) and (6). The resulting coefficients were tested on a separate set of 1920 transformed random files, and the average percentage errors were 10% for valence and 9% for arousal. These are considered to be sufficiently accurate given that actual human musical emotion recognition error rates can be as high as 23% and other far more complex artificial musical emotion detection systems have rates such as 81% (Legaspi et al. 2007). The actual coefficients for pitch, loudness, key mode and IOI in equations (5) and (6) were respectively x = [−0.00214, 0.012954, 1.1874, −0.6201] and y = [0.003025, 0.052129, −1.4301, 0.59736], with the additive constants for x and y respectively 0.61425 and −4.5185.
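The one-off coefficient estimation can be sketched at toy scale. Synthetic features stand in for the 1920 transformed MIDI files, and the linear-plus-noise feature model below is an illustrative assumption, not the actual transform pipeline:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
valence = rng.uniform(-1, 1, n)   # the known values used to transform each file
arousal = rng.uniform(-1, 1, n)

# Stand-in feature model: features of a transformed tune depend linearly on
# the known valence/arousal, plus noise (illustrative assumption).
mean_pitch = 60 + 4 * valence + 2 * arousal + rng.normal(0, 0.2, n)
mean_loud = 64 + 5 * valence + 7 * arousal + rng.normal(0, 0.2, n)
mean_ioi = 0.5 * (1 - 0.25 * arousal) + rng.normal(0, 0.01, n)
key_mode = np.where(valence >= 0, 1, 2)   # 1 = major, 2 = minor

# Regress the known valence on the features, following the shape of (5).
X = np.column_stack([mean_pitch, mean_loud, key_mode, mean_ioi, np.ones(n)])
coeffs, *_ = np.linalg.lstsq(X, valence, rcond=None)
pred = X @ coeffs
print("mean abs valence error:", np.mean(np.abs(pred - valence)))
```

The same regression, run against arousal with the key mode column dropped, would give the coefficients of equation (6).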

5.4 The linear estimator is used in two aspects of the agents: firstly for an agent to decide whether or not to add a performance to its own tune, and secondly for an agent to be influenced by the approximated emotional content of a performance it has heard. Equations (7) and (8) below are used to update the valence and arousal of agent B after a performance from agent A. The γ (gamma) constant – between 0 and 1 – defines how sensitive an agent is to affective state change, i.e. the amount of change to valence and arousal. If it is set to 1 then the new valence and arousal values will be totally changed to the estimated values of the performance the agent has just heard. A value of 0 will lead to no change. Values between 0 and 1 cause the estimate to have a proportionally greater effect the closer γ is to 1.

valence'B = (1 − γv)·valenceB + γv·valenceEstA (7)

arousal'B = (1 − γa)·arousalB + γa·arousalEstA (8)
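A minimal sketch of the update in equations (7) and (8), with illustrative names:

```python
# Sketch of the affective-state update of equations (7) and (8):
# the listener's state moves toward the estimated affect of the
# performance it has heard, with sensitivity gamma in [0, 1].
def update_affect(valence_b, arousal_b, valence_est, arousal_est,
                  gamma_v=0.1, gamma_a=0.1):
    valence_b = (1 - gamma_v) * valence_b + gamma_v * valence_est
    arousal_b = (1 - gamma_a) * arousal_b + gamma_a * arousal_est
    return valence_b, arousal_b
```

With gamma set to 1 the listener's state is replaced outright by the estimate; with gamma set to 0 it never changes, matching the description above.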

5.5 Once agent B has decided whether or not to append the performance from A – and if so, has done so – it will update its valence and arousal based on Equations (7) and (8). In future, when it next performs a tune, it will transform it based on its new valence and arousal state. The system is designed so that, through this series of updating affective states and agent tune communication, new musical structures will emerge.

5.6 It is worth taking a moment to discuss the above process, in particular the affective communication and estimation elements. An alternative would have been for the system to directly transmit the valence and arousal of agent A to agent B, rather than computing coefficients for Equations (5) and (6) and going through the process of estimating valence and arousal from the music features. Estimating the coefficients for (5) and (6) could be seen as quite a recursive process: first an approximation was made of how valence and arousal can be communicated through music features, and then an approximation was made of how music features communicate valence and arousal. Wouldn't it have been simpler to just communicate the valence and arousal directly; or – if we wanted to observe the musical effects – still perform the compositional transforms, while communicating the valence and arousal of the performing agent directly?

5.7 These simplifications would have removed the artificial life foundation from the formulation. The music would not have been part of the process, but merely influenced by the process. What makes the formalism of interest is that the music is part of the process, and thus any creative results are emergent from the process. Furthermore, had the agents communicated their valence and arousal directly, with say a random communication error, then it would become a less novel multi-agent system. It would have been an MAS in which each agent had two numeric parameters, and agents whose numeric parameters were close would then influence each other's parameters more strongly. This is a type of artificial life system that has been studied many times before, and leads to agents tending to cluster into groups of similar parameter values – i.e. similar valence and arousal values. Putting the music at the centre of the parameter communication not only creates a novel artificial life process but also makes the music emergent from the process. Thus something is learned about the dynamics of a new A-Life system, and also about how applicable that system might be to algorithmic composition.

5.8 At the heart of the experiments is the question: can musical communication in multi-agent systems be used as a form of algorithmic composition? Music, in communication terms, is most commonly considered a form of emotional communication (Juslin & Laukka 2004). Past research has most commonly linked musical features to communicated emotion. However this communication is imperfect: people detect emotion in music ambiguously, and communicate emotion ambiguously. So the system as designed captures many elements of musical communication in a more realistic way than simply communicating valence and arousal with an error. It generates musical features based on the emotion being communicated. The listening agent is then affected by the musical features. This effect is the simplest non-direct method possible based on musical features: a linear model based on the non-invertible equations (1) to (4). This – although significantly less simple – is the equivalent in the flocking example of agents' movement and their perception of each other's movement. However, unlike movement, the process here is not an immediately visible one; it concerns the agents' emotions, which are not so simply observable or communicable. In fact even when human beings attempt to communicate emotions directly, there are limitations; which is one of the reasons that the arts are often considered to be forms of emotional expression, able to communicate in ways that language is not (Juslin & Laukka 2004).

Performance Input Interaction Coefficient Function

6.1 Before an Agent A performs to an Agent B it compares its interaction coefficient measure of Agent B to the average of its interaction coefficients (IC) for other agents:

IC(A,B) > mean[IC(A, all agents)] (9)

where IC(A,B) is A's interaction coefficient measure of B. If this condition does not hold, then A does not perform to Agent B and moves on to the next agent. The increase in Interaction Coefficient is proportional to the length of the tune added. So the more notes in Agent B's past performances, the greater its interaction coefficient will be as viewed by Agent A. If agent A adds a tune from agent B of length N notes then:


IC(A,B) = IC(A,B) + d·N (10)

6.2 The parameter d is a constant called the interaction coefficient update rate. This can be visualised as an agent's basic resources being tunes – so the more notes in an agent's tune, the greater its potential interaction coefficient to other agents. However the actual reason for including the interaction coefficient functionality, and making the interaction coefficient proportional to the number of notes in a performing agent's tune, is primarily to generate an interaction/social hierarchy amongst the agents which influences the hierarchical structure of the composed music. Bearing in mind that an agent will only perform to other agents with a high enough interaction coefficient, it can be seen that agents which perform more than they listen will tend to have lower interaction coefficients. Furthermore, agents which mostly listen and store will have longer tunes and higher interaction coefficients; and agents with higher interaction coefficients will tend to be selected as listeners more often.
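The gating test of equation (9) and the update of equation (10) can be sketched as follows. Storing the coefficients in a nested dict is an illustrative choice, not something the paper specifies; the default d mirrors the value used in the hierarchy experiment later in the section.

```python
# Sketch of the interaction-coefficient logic: equation (9) gates
# whether A performs to B, and equation (10) raises IC(A, B) by d*N
# when A appends an N-note performance from B.
def should_perform(ic, a, b):
    """Equation (9): perform to b only if IC(a,b) beats a's mean IC."""
    values = list(ic[a].values())
    return ic[a][b] > sum(values) / len(values)

def update_ic(ic, a, b, n_notes, d=0.2):
    """Equation (10): increase IC(a,b) by d * length of appended tune."""
    ic[a][b] += d * n_notes

# Illustrative state: agent A's view of agents B and C.
ic = {"A": {"B": 0.5, "C": 0.9}}
```

Here A would skip performing to B (0.5 is below the mean of 0.7) but would perform to C; appending a 10-note tune from B then raises A's coefficient for B.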

6.3 So the system is designed to turn the agent population into a set of agents who tend to perform and have shorter tunes, and a set of agents who tend to listen and store. The aim is for lower interaction coefficient agents to be focused on providing lower – i.e. shorter – elements of the musical hierarchy.

An Example Cycle

6.4 An example cycle will now be shown. In this example a three agent system is examined. Agent 1 is the performer and starts by considering performing to Agent 2; Agent 1's measure of Agent 2's interaction coefficient is very low in this example; Agent 1's measure of Agent 3's interaction coefficient is very high; Agent 1's affective state is high valence and high arousal – i.e. happy; and Agent 3's affective state is low valence and low arousal – i.e. sad.

6.5 As the cycling starts, because Agent 1's interaction coefficient measure of Agent 2 is very low, Agent 1 does not even perform to Agent 2. It selects the next agent iteratively – Agent 3 is selected because agents are ordered by numerical label. Agent 1's view of Agent 3's interaction coefficient is very high, so Agent 1 performs its tune T1, adjusting it to make it happier because of its own high valence and arousal state, giving a performance P1.

6.6 Agent 3 estimates the affective content of Agent 1's performance P1 and gets a result of high valence and arousal – i.e. it estimates it is a happy performance. Because Agent 3's affective estimate of Agent 1's tune is high valence and arousal but Agent 3's state is low valence and arousal – i.e. very different to happy – Agent 3 discards Agent 1's tune. However Agent 3 still adjusts its own affective state towards its estimate of the affective content of performance P1, i.e. it becomes a little happier. Neither agent makes any adjustment to its interaction coefficient measures since no performances were stored. Next Agent 2 becomes the performer, and the first agent is iteratively chosen to listen – i.e. Agent 1.
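The listener's accept/discard decision in this example can be sketched as below. Using Euclidean distance in valence-arousal space against the similarity threshold is an assumption about how "very different" is judged, since the paper does not give the exact test at this point; the function name is illustrative.

```python
# Sketch of the listener's append decision from the example cycle:
# a performance is stored only if its estimated affect lies close
# enough to the listener's own affective state. Distance measure
# (Euclidean in valence-arousal space) is an assumption.
def accepts(listener_valence, listener_arousal,
            est_valence, est_arousal, threshold=1.0):
    dist = ((listener_valence - est_valence) ** 2
            + (listener_arousal - est_arousal) ** 2) ** 0.5
    return dist <= threshold
```

Under this sketch a sad listener at (−0.5, −0.5) discards a performance estimated at (0.5, 0.5), as Agent 3 does above, while a near-happy listener would store it.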

Experiments

7.1 The issue of how to evaluate an algorithmic composition is by no means agreed in the research community. Parametric/example-based investigations are common in investigating algorithmic composition systems (e.g., Beyls 2007; Fonseka 2000; Anders 2007). Such experiments were done analysing how MASC output responded to various parameter changes, giving objective information on MASC's behaviour for a potential user. Such experiments are important because they provide insight into the dynamics of the system. It should be noted that in this system there are a number of parameters which need to be set; it is beyond the scope of this article to describe all of them. Those not mentioned explicitly here were set to default values.

Melody Generation

7.2 A helpful way to indicate that this system can produce non-trivial melodies, in spite of its lack of melodic intelligence, is to explore the output space for some different initial affective states. Usually in an MAS one would want to perform a statistical analysis of behaviour, but it is an unsolved problem in algorithmic composition as to which statistics are musically relevant (Freitas et al. 2012). So instead specific musical results are presented. Figures 4 to 8 show agent 6's final tune, from an 8 agent system run for 10 cycles. The similarity threshold was 1, the interaction coefficient system was switched off, and the valence and arousal update rates – gamma in equations (7) and (8) – were 0.001. Agents were each initialised with the single note of middle C, of duration 0.5 seconds. Four initialising states were used with different levels for [Valence, Arousal] pairs. These were: Happy = [0.5, 0.5]; Sad = [−0.5, −0.5]; Angry = [−0.5, 0.5]; Tender = [0.5, −0.5]. The first letters of each state were used to indicate the agent initial states. For example TTTTTTHH is 6 Tender and 2 Happy.
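The four-letter shorthand can be sketched as a simple mapping (names illustrative):

```python
# Sketch of the initial-state shorthand used in the experiments:
# each letter maps to a [Valence, Arousal] pair, and a string such
# as "TTTTTTHH" initialises an 8-agent population.
STATES = {"H": (0.5, 0.5),    # Happy
          "S": (-0.5, -0.5),  # Sad
          "A": (-0.5, 0.5),   # Angry
          "T": (0.5, -0.5)}   # Tender

def init_states(spec):
    """Return the list of (valence, arousal) pairs for a spec string."""
    return [STATES[c] for c in spec]
```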

Figure 4. 8 Agents AAAAAAAA, 10 Cycles, and 8 Agents AAAAAASS, 10 Cycles


Figure 5. 8 Agents SSSSSSSS, 10 Cycles, and 8 Agents SSSSSSAA, 10 Cycles

Figure 6. 8 Agents SSSSSSHH, 10 Cycles, and 8 Agents SSSSSSTT, 10 Cycles

Figure 7. 8 Agents HHHHHHSS, 10 Cycles, and 8 Agents HHHHHHTT, 10 Cycles


Figure 8. 8 Agents TTTTTTSS, 10 Cycles, and 8 Agents TTTTTTHH, 10 Cycles

7.3 Each vertical tick is a semitone. Precise pitch values have been removed from the graphs to aid readability, and because absolute pitch values are not the key element here, but rather the relative interplay of structures which, it will now be argued, indicates non-triviality. To clarify why this broad range of tunes is considered non-trivial as melodies, a number of elements are now highlighted. Music has been generated up to and over 50 seconds long; if the tunes were only 3 to 6 notes or a few seconds long, they would be trivial. The melodies are not just simple directional pitch patterns, like single repeated notes, uniformly rising or falling patterns, or repeated "zig-zags". The melodies are not just groups of repeated notes, e.g. 5 notes at one pitch, then 4 notes at another, etc. The pitches vary much more than that. However they do not vary all the time – there are times when notes are repeated 2 or 3 times, as one would expect in melodies. Timing repetition is not too high or too low. Melodies have been generated where the tempo does not stay constant, but also the tempo does not simply seem to vary randomly. It would seem odd if notes only rose or fell by one pitch step in all tunes – in music there are sometimes larger jumps. However it would also seem odd if all the pitch changes were large. It has been shown that tunes can be generated which avoid these two extremes. Finally, the melodies contain recognizable note groupings which are repeated and transformed to different pitches and timings. This is expected by western listeners, who usually look to identify a motif structure.

7.4 Ideally it would be desirable to have some more scientific measure of non-triviality for the melodies; however there is no such agreed measure. There have been some attempts to use measures like entropy, but no conclusive results have been obtained (Kirke & Miranda 2007). Another approach might be to use measures of complexity, as it was stated at the start of this paper that such complexity was a major motivation in using artificial life systems for creativity. However there is no conclusive work on artistically informed measures of complexity. So although the above approach is non-scientific, it is fairly detailed and does capture many elements which a composer would consider to be key to the non-triviality of music.

7.5 In summary, the following example of a full composition can be heard at http://cmr.soc.plymouth.ac.uk/alexiskirke/mapc.html

7.6 This composition has been put through a computer system for expressive performance to make it more listenable through a sequencer (Kirke & Miranda 2009).

7.7 It was stated at the beginning of this paper that an aim was that the non-trivial melody pitch structures would be developed by agents without explicit melodic knowledge. Given we have now claimed the production of non-trivial melody structures, we will examine the claim of no explicit melodic knowledge. This does not mean the agents have no musical knowledge. In fact the agents have a significant amount of hand-crafted musical knowledge in the musical transformation and affective estimation formulae. However, this knowledge does not include how to sequentially construct melodies. It includes how to change key modes, timings and pitches, but no musical rules about which musical features should follow which. The basis of melodic structure is which pitches follow which and which note timings follow which. This is a significant gap in the agents' explicit knowledge about music, which can be reduced by careful hand-crafting of the rules, but not removed without including ordering constraints in the musical equations. Thus the ordering of notes in this system emerges as a result of the social interactions between the agents.

7.8 It would be informative to see how simple the musical rules could have been made before the tunes failed to be non-trivial (with non-triviality formulated based on the process described earlier in this section). This might bring the whole system closer in philosophy to the flocking simulations discussed. However this is beyond the scope of the current work.

Agent Affective Tune and State Adaptation

7.9 It helps to understand the MASC dynamics more clearly to look in a more general way at how music features are affected by agent initial affective states, as well as how agent affective states are changed, as is done in Figures 9 and 10. Note that these experiments involved switching off the similarity threshold as well, so as to focus purely on affective interaction dynamics. These figures examine the resulting music features after 10 cycles of interaction of MASC, for the system of 8 agents above, as well as for a smaller system of 2 agents. The pitch spread is the distance between the highest and lowest MIDI pitch in the final tune averaged across all agents, and similarly with the median pitch. The average key is found using a key profiling algorithm (Krumhansl & Kessler 1982). Figures 9 and 10 also have arrows to highlight the progression of features as initial affective states are changed. They furthermore have dashed ellipses to highlight how close the resulting features of the 2 agent system are to their "equivalent" 8 agent system.
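The two pitch features just defined can be sketched as follows (function name illustrative):

```python
from statistics import median

# Sketch of two of the features plotted in Figures 9 and 10:
# pitch spread (highest minus lowest MIDI pitch) and median pitch,
# each averaged across the agents' final tunes.
def pitch_features(tunes):
    """tunes: list of per-agent MIDI pitch lists -> (avg spread, avg median)."""
    spreads = [max(t) - min(t) for t in tunes]
    medians = [median(t) for t in tunes]
    n = len(tunes)
    return sum(spreads) / n, sum(medians) / n
```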

7.10 A key element of MASC which would be new for many composers utilizing it would be the assigning of initial affective states to the individual agents. Thus the more intuitive the results of such assignments, the more useful MASC could be. This may at first seem contradictory to the earlier caution to the reader on their interpretation of our use of the words "Happy" and "Sad" in relation to agents. However this caution was to prevent the interpretation that the agents themselves were somehow "Happy" or "Sad". The use of labels such as these as a shorthand for arousal and valence configurations is still valid, and helpful to the composer using this system. Figures 9 and 10 provide insight into the effects of this process. It can be seen that broadly there are understandable patterns/trajectories, in terms of music feature results, when the initial affective states are changed. For example increasing the number of happy agents increases median pitch and reduces pitch spread.


Figure 9. Effects of Initial Affective States in 8 Agent System on average IOI and Key, 8 Cycles Run

7.11 The tunes in Figures 8 and 9 generated by the 8 agent system, together with six more involving further combinations of H, S, A and T, were played to 10 listeners. It was found that when at least 6 of the 8 agents had the same initial valence and arousal, the listeners had a 71% chance of detecting that same valence in the final tune, and an 82% chance of detecting that same arousal in the tune. Although the small number of participants means that the results are not very significant, they are certainly indicative that – given the support of the remainder of the parametric evaluation – it is worth doing a substantial set of listening tests to evaluate this potential affective pattern. This would further highlight the ability of MASC to be used in a relatively intuitive way as a computer-aided composition tool.

Figure 10. Effects of Initial Affective States in 8 Agent System on Pitch Median/Spread, 8 Cycles Run

Interaction Coefficient and Musical Hierarchy

7.12 As has been mentioned, the interaction coefficient process is designed as an attempt to regulate tune growth in such a way that certain agents will become tune provider agents and some will become tune receiver agents, thus creating a hierarchy in the agents' interaction structure which is hopefully reflected in the hierarchical structure of the melodies. In this experiment, an 8 agent system was used with equally spread agent initial affective states, 300 note memory size, 32 cycles, affective similarity threshold of 1, pitch update rate of 0.1, and IOI and loudness update rates of 0.5; valence and arousal update rates were set to 0.1. The interaction coefficient update rate was set to 0.2, and the interaction coefficient threshold was set to 0.9. Figure 11 shows the evolution of an agent's interaction coefficient averaged across all other agents. So the top graph shows the average view/trust that agents 2 to 8 have of agent 1, found by averaging their coefficient values for agent 1.

7.13 After 32 cycles the number of notes that agents 1 to 8 have is respectively: 291, 102, 102, 102, 102, 102, 18, and 5. These tunes can be seen in Figure 12. Looking at Figure 11, it can be seen that this relates to the interaction coefficient – the higher an agent's final interaction coefficient, the higher its note count. Table 3 shows, in each cycle, which agents an agent receives tunes from. So for example in cycle 1, agents 2 to 6 receive tunes from agent 1. In cycle 2 agent 1 receives a tune from agent 2, but no other agents receive tunes.

7.14 In Table 3 it can be seen that the lower interaction coefficient agents tend to give out tunes, while the higher interaction coefficient agents tend to receive tunes. The lower numbered agents have higher interaction coefficients because of the ordering of agent interaction in each cycle. The lower numbered agents will be receivers first, and have a chance to build up the size of their tunes. Then when they become performers – givers – the other agents will receive large tunes from them, and the lower numbered agents' interaction coefficients will increase as a result.

7.15 To see how this creates the hierarchical structure, consider that by Table 3 Agent 1's final tune could be written as a concatenation of sub-tunes 1_0 2_1 3_2 4_3 5_4 6_5 7_6 2_9 3_10 4_11 5_12 6_13 2_17 3_18, where each number indicates the agent who performed, and the subscripts are the cycle numbers; an agent's tune varies over different cycles – e.g. 3_18 is not the same as 3_2. Because the MAS is a closed system, all tunes in this structure are the result of a transformation on another agent's tune.

7.16 So for example 2_1 = 2_0 1_0' and 3_2 = 3_0 1_0'' and 7_6 = 7_0 1_0'''. Here the primes represent transformations on Agent 1's tune due to Agent 1's affective state at the time. In the next round of tunes being given to Agent 1 this gives 2_9 = 2_1 7_7' 1_8 8' = (2_0 1_0') 7_7' 1_8 8'.

7.17 This expansion can be continued until there is a full description of Agent 1's tune based on the way in which the tune grows. This description will show the building structure of the tune. It will not necessarily show the perceptual structure of the tune – this is not claimed – but it will show how the tune was built from the motifs and phrases and so forth of other agents. This structure is clearly a function of the agent interaction hierarchy, and as has been seen this hierarchy is strongly influenced by the Interaction Coefficient functionality.

7.18 These diagrams highlight a characteristic of the system – its sequential update rule. Because agents are always updated in the same order, the agents with a lower index number tend to develop much larger tunes and have a high interaction coefficient. An asynchronous system simulation could have been utilized to avoid this – where the next agent to interact is randomly selected. However this would have moved away from the algorithmic – i.e. non-random – composition processes which this work was designed to build upon.


Figure 11. Change in mean Interaction Coefficient (x-axis = number of interactions)


Figure 12. Tunes after 32 cycles

Table 3: Pattern of Interaction

Conclusions and Future Work

8.1 A multi-agent system, MASC, has been presented which is a proof of concept that a multi-agent system can develop non-trivial melody pitch structures through affective interaction of agents without explicit melodic knowledge. Agents have no explicit knowledge of which notes should follow which, how they should repeat, and so forth. An agent's only compositional knowledge of music is its ability to extract affective data from the whole and impose affective features on the whole. MASC also demonstrates that multi-agent social structures can generate musical structure on thematic and sectional levels as well as on a note or phrase level. There were two demonstrations. It was demonstrated diagrammatically how the interaction structure would relate to the musical structure. Then an example was given showing the musical structure building up and how it related to the agents' social structure. A final contribution of MASC is the linear music-emotion analyzing model, which takes as input a monophonic MIDI file and estimates its affective content.

8.2 In terms of future work, the listening tests performed were fairly basic, using a small number of subjects, and thus only indicative. More extensive tests are needed to support the ability of MASC to be used in a relatively intuitive way as an affective computer-aided composition tool. Such tests could also be used to examine the difference between the generative structure in MASC tunes and the perceived structure for human listeners. This would help to clarify the effectiveness of the interaction coefficient approach to hierarchical structure generation.

8.3 On the subject of the Interaction Coefficient, there are other contexts that could be investigated to influence future agent interactions, besides their past interaction lists. For example an agent could have time-varying trajectories set by the composer, which could bias elements like their arousal and valence, or the extent to which their affective state transforms the music they perform. This would provide additional contexts for the composer to use in controlling the MAS output.

8.4 Another element of future work is the addition of indeterminacy. It was desired to examine the multi-agent system as a form of algorithmic composition – hence keeping the whole system deterministic. For example in previous work the authors have utilized semi-random communication errors between agents, contributing to changes in their tunes and transformations (Kirke & Miranda 2011). As has already been mentioned, the use of indeterminacy would also allow for the simulation of asynchronous communication – so that it is not always the same agents who begin the singing cycle.

Acknowledgements


This work was partially financially supported by the EPSRC funded project "Learning the Structure of Music," grant EPD063612-1.

References

ANDERS, T. (2007). Composing Music by Composing Rules: Design and Usage of a Generic Music Constraint System. Thesis/Dissertation, Queen's University, Belfast.

BEYLS, P. (2007). Interaction and Self-organisation in a Society of Musical Agents. Proceedings of ECAL 2007 Workshop on Music and Artificial Life (MusicAL 2007), Lisbon, Portugal.

BLACKWELL, T. & Bentley, P. (2002). Improvised Music with Swarms. Proceedings of the 2002 World Congress on Evolutionary Computation, Honolulu, Hawaii. [doi:10.1109/CEC.2002.1004458]

BISIG, D. & Neukom, N. (2008). Interactive Swarm Orchestra, an Artificial Life Approach to Computer Music. Proceedings of the 11th Generative Art Conference, Milan, Italy.

BROWN, A. (2002). Opportunities for Evolutionary Music Composition. Proceedings of 2002 ACMC, Melbourne, Australia.

CANAZZA, S., De Poli, G., Drioli, C., Rodà, A., & Vidolin, A. (2004). Modeling and control of expressiveness in music performance. The Proceedings of the IEEE 92, 686–701. [doi:10.1109/JPROC.2004.825889]

CHONG, H., Jeong, E. & Kim, S. (2013). Listeners' Perception of Intended Emotions in Music. International Journal of Contents 9(4), 78–85. [doi:10.5392/IJoC.2013.9.4.078]

CLAIR, R., Moncharce, N. & Slimane, M. (2008). Interactions between an artificial colony of musical ants and an impaired human composer: towards accessible generative arts. Proceedings of the 11th Generative Art Conference, Milan, Italy.

DAHLSTEDT, P. & McBurney, P. (2006). Musical agents: Toward Computer-Aided Music Composition Using Autonomous Software Agents. Leonardo 39, 469–470. [doi:10.1162/leon.2006.39.5.469]

DIXON, S. (2010). Automatic Extraction of Tempo and Beat From Expressive Performances. Journal of New Music Research 30(1), 39–58. [doi:10.1076/jnmr.30.1.39.7119]

EIGENFELDT, A. (2009). The Evolution of Evolutionary Software: Intelligent Rhythm Generation in Kinetic Engine. Applications of Evolutionary Computing. Springer, Heidelberg, 498–507. [doi:10.1007/978-3-642-01129-0_56]

FONSEKA, J. (2000). Musical Agents. Thesis/Dissertation, Monash University.

FREITAS, A., Guimarães, F., & Barbosa, R. (2012). Ideas in Automatic Evaluation Methods for Melodies in Algorithmic Composition. Proceedings of the 2012 Sound and Music Computing Conference, Copenhagen, Denmark.

GONG, T., Zhang, Q., & Wu, H. (2005). Music evolution in a complex system of interacting agents. Proceedings of the 2005 IEEE Congress on Evolutionary Computation, Edinburgh, UK. [doi:10.1109/CEC.2005.1554815]

JUSLIN, P. & Laukka, P. (2004). Expression, perception, and induction of musical emotion: a review and a questionnaire study of everyday listening. Journal of New Music Research 33, 216–237. [doi:10.1080/0929821042000317813]

KIRKE, A. & Miranda, E. R. (2007). Evaluating Mappings for Cellular Automata Music. In Proceedings of ECAL 2007 Workshop on Music and Artificial Life, Lisbon, Portugal.

KIRKE, A. & Miranda, E. R. (2009). A Survey of Computer Systems for Expressive Music Performance. ACM Computing Surveys 42(1). [doi:10.1145/1592451.1592454]

KIRKE, A. & Miranda, E. (2011). A Biophysically Constrained Multi-agent Systems Approach to Algorithmic Composition with Expressive Performance. In A-Life for Music, Wisconsin: A-R Editions, Inc., 165–195.

KRUMHANSL, C. & Kessler, E. (1982). Tracing the dynamic changes in perceived tonal organization in a spatial representation of musical keys. Psychological Review, 89, 334–368. [doi:10.1037/0033-295X.89.4.334]

LANG, P. (1995). The emotion probe: Studies of motivation and attention. The American Psychologist, 50, 372–385. [doi:10.1037/0003-066X.50.5.372]

LEGASPI, R., Hashimoto, Y., Moriyama, K., Kurihara, S. & Numao, M. (2007). Music Compositional Intelligence with an Affective Flavor. Proceedings of ICIUII, USA. [doi:10.1145/1216295.1216335]

LIVINGSTONE, S. R., Muhlberger, R., Brown, A. R. & Thompson, W. (2010). "Changing Musical Emotion: A Computational Rule System for Modifying Score and Performance." Computer Music Journal, 34(1), 41. [doi:10.1162/comj.2010.34.1.41]

MARTINS, J. M. & Miranda, E. R. (2007). Emergent Rhythmic Phrases in an A-Life Environment. Proceedings of ECAL 2007 Workshop on Music and Artificial Life (MusicAL 2007), Lisbon, Portugal.

MIRANDA, E. R. (2002). Emergent Sound Repertoires in Virtual Societies. Computer Music Journal, 26(2), 77–90. [doi:10.1162/014892602760137194]

MURRAY-RUST, A. & Smaill, A. (2005). Musical Acts and Musical Agents. Proceedings of the 5th Open Workshop of MUSICNETWORK, Vienna, Austria.

MURRAY-RUST, A., Smaill, A. & Maya, M. C. (2005). VirtuaLatin – Towards a Musical Multi-Agent System. Proceedings of the Sixth International Conference on Computational Intelligence and Multimedia Applications, Las Vegas, NV, USA.

OEHME, A., Herbon, A., Kupschick, S. & Zentsch, E. (2007). Physiological correlates of emotions. In Proceedings of the 2007 Conference on Artificial Intelligence and Simulation of Behaviour, Newcastle, UK.

PANZARASA, P. & Jennings, N. (2006). Collective cognition and emergence in multi-agent systems. In Cognition and Multi-Agent Interaction, Cambridge: Cambridge University Press, 401–408.

RAMCHURN, S., Huyunh, D., & Jennings, N. (2004). Trust in multi-agent systems. The Knowledge Engineering Review, 19, 1–25. [doi:10.1017/S0269888904000116]

REYNOLDS, C. (1987). Flocks, herds, and schools: A distributed behavioral model. Computer Graphics 21(4), 25–34. [doi:10.1145/37402.37406]

SAMPAIO, P. A., Ramalho, G., & Tedesco, P. (2008). CinBalada: a multiagent rhythm factory. Journal of the Brazilian Computer Society, 14, 31–49. [doi:10.1007/BF03192563]

SPICER, M., Tan, B., & Tan, C. (2003). A Learning Agent Based Interactive Performance System. Proceedings of the 2003 International Computer Music Conference, San Francisco, USA.

UEDA, L. K., & Kon, E. (2003). Andante: A Mobile Musical Agents Infrastructure. Proceedings of the 9th Brazilian Symposium on Computer Music, Campinas, Brazil.

WULFHOST, D., Flores, L. V., Nakayama, L., Flores, C. D., Alvares, O. C., & Vicari, R. M. (2003). An Open Architecture for a Musical Multi-Agent System. Proceedings of the Second International Joint Conference on Autonomous Agents and Multiagent Systems, Melbourne, Australia. [doi:10.1145/860575.860669]

ZENTNER, M., Grandjean, D. & Scherer, K. (2008). Emotions Evoked by the Sound of Music: Characterization, Classification, and Measurement. Emotion, 8(4), 494–521. [doi:10.1037/1528-3542.8.4.494]


