8/7/2019 Apollon - Framework for Evaluation and Impact Assessment including KPI definition and measurement
DELIVERABLE
Project Acronym: APOLLON
Grant Agreement number: 250516
Project Title: Advanced Pilots of Living Labs Operating in Networks
Deliverable 1.3
Framework for APOLLON Evaluation and Impact Assessment including KPI definition and measurement
Revision: Final Version
Authors:
Anna Ståhlbröst (LTU)
Petra Turkama (Aalto University)
Bram Lievens (IBBT)
Hendrik Heilkama (Aalto University)
Petra Hochstein (SAP)
Christian Merz (SAP)
Claudio Vandi (UP8)
Project co-funded by the European Commission within the ICT Policy Support Programme
Dissemination Level
P Public X
C Confidential, only for members of the consortium and the Commission Services
Apollon Deliverable 1.3
ICT PSP Project Reporting Template 2 Final Version
Revision History

Revision | Date | Author | Organisation | Description
V0.1 | 11.08.2010 | Ståhlbröst | LTU/CDT |
V0.2 | 12.08.2010 | Turkama | Aalto | Additions to chapters 2 and 5
V0.3 | 02.10.2010 | Ståhlbröst | LTU/CDT | Change in methodology assessment framework
V0.4 | 10.10.2010 | Ståhlbröst | LTU/CDT | Changes in impact assessment framework
V0.5 | 28.10.2010 | Ståhlbröst | LTU/CDT | Finalising the deliverable
Statement of originality:
This deliverable contains original unpublished work except where clearly indicated otherwise. Acknowledgement of previously published material and of the work of others has been made through appropriate citation, quotation or both.
The information in this document is provided "as is" and no guarantee or warranty is given that the information is fit for any particular purpose. The user thereof uses the information at its sole risk and liability.
Table of Contents

1. Summary
2. Introduction
2.1 Objective and Aim
2.2 Theoretical Framework for Evaluation
2.2.2 Evaluation Approaches
2.3 Evaluating Methodology
2.3.2 Four common deficiencies in methodologies
2.4 Key-Performance Indicators
3. APOLLON Criteria for the Evaluation Framework
3.1 Summary of the Base-Line Investigation among APOLLON Partners
3.2 Summary of Requirements from APOLLON Deliverable x.1
4. APOLLON Evaluation Processes
4.1 APOLLON Methodology Evaluation Process
4.1.2 Methodology Evaluation Process
4.2 Evaluation of the Cross-border Networking Process within the Thematic Experiments
5. Research Framework for Evaluation of Cross-Border Networking Process within the Thematic Experiments
5.1 Evaluation Process of the Thematic Experiments
6. Evaluation Template for the APOLLON Cross-Border Methodology
6.1 APOLLON Methodology Evaluation Framework Template
7. Template for Evaluating Cross-border Networking Experiments
8. Template for APOLLON Experiment-Specific KPIs
References
1. Summary

The aim of this deliverable is to provide an evaluation and impact assessment framework that makes it possible to assess the APOLLON methodology and toolset as well as to identify the added value of cross-border Living Lab networking with specified key performance indicators. In this deliverable, a presentation of the theoretical framework that has guided the development of the evaluation framework is given. In addition, the investigation that was performed among APOLLON partners at the beginning of the project is presented. This investigation served to identify relevant measures of performance among the involved project partners. We have also used the D x.1 deliverables from the other work packages as a means to identify relevant performance indicators of the experiments in the thematic areas. These sources of information, together with the theoretical framework, formed the basis of this deliverable. The framework has then been used as a basis for the design of the evaluation process as well as the evaluation framework itself. The process of evaluating the APOLLON methodology is described, in which the liaison person from WP1 collaboratively and iteratively evaluates the different stages of the methodology. The process of evaluating the experiments, which are carried out in the different work packages, is based on self-assessment, where the leaders of task x.4 in each work package apply the framework and make the necessary adjustments for their context. This report also contains the research framework, which is applied to the experiments in the work packages to help them design and assess their experiments in a considered and researchable manner. Finally, this deliverable contains two different evaluation framework templates: first, the framework for evaluating each phase of the APOLLON methodology, and second, the framework for evaluating the added value and impact of the experiments for relevant stakeholders.
2. Introduction

The evaluation and impact assessment framework developed in the APOLLON project aims to monitor, analyse and assess the APOLLON methodology as well as the added value of cross-border Living Lab networking. The aim of this deliverable is to provide an evaluation and impact assessment framework that makes it possible to assess the APOLLON methodology and tool set as well as to identify the added value of cross-border Living Lab networking. In this framework, key performance indicators are defined, which will be measured in the experiments in WP 2, 3, 4 & 5 in task x.4. This evaluation framework will therefore assess two different processes: (1) the APOLLON methodology supporting the cross-border networking, and (2) the added value of the cross-border Living Lab networking.

The developed APOLLON methodology will provide a framework for engaging, empowering and mobilizing self-organizing individuals within actor networks. The proposed cross-industry infrastructure provides new opportunities and insights for individuals in the relationships between the organization and its members, and among actors within and across organizations. The individual steps of the methodology will be continuously evaluated at three-month intervals during the project, in close collaboration with the other work packages.
The cross-border networking process will focus on technology and knowledge transfer activities, which will be evaluated in a formative manner during the project's life cycle. At the beginning of the project, the different work packages identify relevant key performance indicators and measures for their thematic experiments. This input will then form the basis of the evaluation framework for these experiments. During the project, the ongoing experiments in the thematic domains will be assessed, and the evaluation framework will be adjusted accordingly to ensure its relevance and usage in the project. The evaluation activities will be carried out continuously with the aim to interpret the cross-border networking activities from different perspectives. In this process, the applied methodology supporting cross-border networking will be evaluated accordingly. To support these assessment activities within each work package, the framework developed within this deliverable will be applied. This framework should be viewed as a work in progress, where experiences from the formative evaluations are incorporated into an updated version of the framework to ensure that the framework is useful to the project's activities, both in the vertical domains and in WP1.
2.1 Objective and Aim
The objective of this deliverable is twofold: one is to evaluate the APOLLON cross-border networking methodology and the other is to identify the added value of Living Lab cross-border networking experiments. That is:

1. To monitor, analyse and assess the vertical experiments in relation to the general objectives and the overall APOLLON methodology
2. To monitor, analyse and assess the cross-border networking impact on its relevant stakeholders
The cross-border networking process in the different experimental domains will be supported and assessed by means of the developed cross-border networking methodology. This assessment will be supported by continuous interaction between WP1 and the vertical domains regarding the implemented networking methodology.

The specific objectives of the evaluation and assessment activity can be stated as follows:

- Observe and understand the progress and impact of the APOLLON methodology among its stakeholders, and understand the determining factors, challenges and processes
- Observe and interpret the process of implementing the vertical domains' activities in the APOLLON methodology
- Evaluate the different patterns within the vertical domains and how these contribute to the creation of a validated cross-border methodology
- Assess the added value of cross-border networking among the relevant stakeholders, such as Living Labs, SMEs, local authorities, end-users and large enterprises
This will be performed by the liaison persons (explained in more detail in section 5) from WP1, who will collaborate closely with the vertical experiments. These persons will facilitate the usage of the methodology by explaining and suggesting
suitable tools or templates to be used from the methodology in accordance with the vertical experiment's phase. For example, if an experiment is planning to write agreements, the liaison person should inform the person responsible for the thematic domain experiment that they can find support for this in the methodology checklists. The relevance and usage of the checklist will then be assessed in later interactions.

To be able to formatively assess these processes and their impact, WP1 strives to build our understanding of the cross-border Living Lab networking in accordance with the following overarching themes:

- Interaction process (between the stakeholders in different countries)
- Stakeholder needs (which needs are related to cross-border networking among its stakeholders)
- Activities (which key activities are performed during the networking)
- Tools (what kind of tools are needed and used to support the process)
- Results and effects of the cross-border collaboration (what happened during the process and what was the impact of it)
- Critical Success Factors (factors ensuring the sustainability of the collaboration)
2.2 Theoretical Framework for Evaluation
To really grasp why the assessment framework is designed in a certain manner, and to be able to apply it in a useful way, it is important to understand the basics of evaluations.

Evaluation is a process aiming to investigate the significance, value, or quality of something, based on a careful study of both its good and bad features. In many everyday situations, we all make judgments about different things, actions, and events happening around us, without reflecting over whether it should be called evaluation. Usually, evaluations are related to something being valued in a systematic and well-considered manner. Hence, evaluations become a rational process, where methods are followed as a means to gain control over the different steps in the evaluation process. The main aim of an evaluation is to express a value judgement about the thing being evaluated, and the evaluation should critically scrutinize the particular object. Thus, the mission is not only to describe, map out, or measure an attitude. Instead, the endeavour should be to gain deeper insights and to question what is taken for granted (Lundahl & Öquist, 2002). An evaluation methodology must be chosen that reflects the views of all involved stakeholders.
2.2.2 Evaluation Approaches
Evaluations are performed in a number of areas, such as evaluations of educational programs, organizational changes, project performance or technology, and there are at least three different evaluation approaches: (1) objective and result evaluation, (2) process evaluation, and (3) interactive evaluation.

The first approach, objective and result evaluation, dominated in the 1950s and the 1960s. In these evaluations, the evaluator measured and described the results in quantitative terms, and it was not the evaluator's job to value differences (Guba & Lincoln, 1989; Karlsson, 1999).
The second approach, process evaluation, was common between the 1970s and the 1980s. Within this approach, the interest was aimed at issues about how the results had been reached. The evaluations were not only focused on describing something; the expectation was to make a qualitative judgment about the thing being evaluated.

The third approach, interactive evaluation, developed during the 1980s and the 1990s and had its focus on participation among those influenced by the evaluation. The basic thinking within this approach was that participation by different stakeholders increased the relevance of the evaluation questions and results; therefore, the stakeholders' influence was strengthened. In this project, we apply the interactive evaluation approach because it matches the process we have designed as well as our perspective on evaluations.

Evaluations can be used in many different ways: as instrumental, where the evaluation results are used to influence people's mind-sets or actions, as long-term or short-term effects from the evaluation, as guidance for choices, and so forth (Karlsson, 1999). Either way, the evaluation should elucidate wholeness and relations, and not focus on isolated issues, and it is the evaluator's duty to make sure that different interests are represented in a reasonable and balanced manner (Lundahl & Öquist, 2002).

Generally, in any evaluation, it is important to determine when the evaluation will be carried out and why; that is, to understand whether it is a formative or a summative evaluation. A formative evaluation is performed with the intention to change, or improve, something, such as a project (Benyon et al., 2005; Karlsson, 1999; Lewis, 2001). A formative approach to evaluation requires communication between stakeholders and the evaluator, because of its goal to change something and its aim to identify learning possibilities from the situation. In contrast, a summative evaluation is carried out in order to determine the impact of the evaluand (Benyon et al., 2005; Karlsson, 1999; Newman & Lamming, 1995). For example, a summative evaluation could be to study the impact of a project such as APOLLON at the end of the project.

In any case, when deciding the when and why of the evaluation, communication between stakeholders and the evaluator is vital. Furthermore, those working with evaluations should understand how things are related, and realize that how things are related to each other is influenced by circumstances occurring in the evaluation's context (Córdoba & Robson, 2003).
In order to carry out an evaluation, it is important to know the purpose of the evaluation. This might seem obvious, but it is not always apparent when an evaluation is being planned. When evaluation researchers discuss the question of why an evaluation is performed, they usually distinguish between the aim, purpose, and function of the evaluation. The aim of an evaluation is to produce a judgment that establishes the value of the object being evaluated, that is, the evaluand, which in this project is the APOLLON methodology for cross-border networking and the cross-border networking experiments in the vertical domains. These judgements arise from interpretations, descriptions, and valuations of the evaluand.

The purpose of an evaluation is the intended usage of the evaluation. One purpose can be to get the opportunity to control and judge the effectiveness and quality of an organisation. Another purpose could be to gain support for decision-making, and a third example could be to supply decision-makers with arguments for prioritizing. In this project, the purpose of the evaluation is to continuously refine the
APOLLON methodology and to transfer knowledge from one experiment to the other. The purpose of an evaluation might be separated from the actual usage or function the evaluation has in practice. The function does not have to be the same as the declared purpose (Karlsson, 1999).

Many traditional approaches to evaluation have failed to recognize the reactive nature of evaluation. Just as performance factors reward safe, short-term activities, evaluations based on mean scores, instead of on the recognition of a few but extraordinary accomplishments, work against innovation and those aiming to explore the unknown. Instead, these approaches reward mediocrity. Failures are usually viewed and treated negatively, with negative consequences for those who have failed, even if the attempted innovation was very ambitious. A project claiming to be innovative and have a high level of success should be viewed with scepticism, because this probably means that what is being attempted is not very ambitious (Perrin, 2002).
Hence, a methodological approach to evaluation of innovations should be able to:

- get at the exceptions, including unintended consequences, given that a quantitative research approach is not relevant and will hide true achievements;
- provide an understanding of the complex processes involved, as well as help identify learning and implications from successes and failures;
- be flexible enough to be open to chance and unexpected findings, which, especially regarding innovations, can represent the key outcomes (Perrin, 2002).

For these kinds of questions, qualitative methods are usually most suitable, possibly in combination with other approaches (Patton, 1987, 1990).
2.3 Evaluating Methodology

One part of this deliverable is the framework for the APOLLON methodology evaluation. A methodology is a simple set of statements or a formal specification that is appropriate for the applied context and culture, and is clearly documented and rigorously followed.

Users must be involved in the specification and in the design, development and implementation of the methodology, and feel that the process is controllable and predictable. In the APOLLON project, we aim to continuously involve all work packages in the process of developing the methodology.

A methodology must integrate all stakeholders' strategic goals with the practical realities of the available information technology and business environment. This means that a methodology cannot be a static document. Instead, it must provide an adaptable framework for planning, specifying, building, and implementing practical information systems. We strive to accomplish this by using the requirements from x.1 as well as the base-line questionnaire as a basis for the design of the APOLLON methodology.
2.3.2 Four common deficiencies in methodologies are:
1. Lack of structure: The material is so disorganized that readers can't find what they're looking for. To facilitate the usage of the developed methodology in the APOLLON project, we will structure the methodology in accordance with the project plan and the ongoing activities within the project.
2. Fragmentation: The material the project participants need is scattered among multiple manuals and other documents that have no clear relationship to one another. Fragmentation arises when an organization makes a commitment to some new methodology component without considering its impact on other, already established methodology components, or when responsibilities for methodology support are split among different parts of an organization. In the APOLLON project, this is handled by having one entrance point to the methodology and by having clear descriptions of where the information can be accessed.

3. Structural incompleteness: There is no natural or obvious place to put certain information needed by the professional staff. Consequently, some important information either never gets written down or is issued in separate memos that are soon forgotten. Structural incompleteness occurs not only as a by-product of a lack of structure (1 above), but also whenever the topics in the table of contents are based more on today's specific tools and techniques than on relatively stable concepts. In the APOLLON project, this is handled by continuously validating the methodology in collaboration between WP1 and the other work packages. The aim here is to re-design the methodology in accordance with the thematic domains' needs and requirements.

4. Obsolescence: Most of the methodology material was developed years earlier and no longer reflects important aspects of the hardware, the system software, or the methods and tools in actual use. This is particularly true of a project like APOLLON, in which the methodology is at the same time the prerequisite for collaboration (established by WP1) and one of the main results of the project (critical analysis of best-case methodology use in the verticals and redaction of the APOLLON methodology).

These four shortcomings severely impair the usability of methodology documentation, its acceptance by the users, and its value to the goal to be achieved. Hence, we strive to address these shortcomings at the beginning of the design of the APOLLON methodology to ensure its usability.
2.4 Key-Performance Indicators
Key performance indicators are quantifiable measures that mirror critical success factors in a project. These KPIs should be decided on beforehand, and they give a snapshot view of the status of the project. It is therefore important to relate the KPIs to the project goal. The KPIs as such function as a measure of the progress of the project towards the overarching goals, not of the fulfilment of the goals as such. In the APOLLON project, the goal is to share and harmonize Living Lab approaches and platforms between networks of exemplary European Living Labs. Hence, the KPIs of this project focus on measuring the impact of the cross-border collaboration experiments in relation to methods, approaches and tools, in order to illuminate and measure the degree of goal fulfilment.
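The idea of a KPI as a snapshot of progress toward an overarching goal, rather than a verdict on goal fulfilment, can be sketched in code. The following is an illustrative sketch only; the class, field names and example indicators are hypothetical and are not part of the APOLLON framework itself:

```python
from dataclasses import dataclass

@dataclass
class KPI:
    """A quantifiable measure tied to a project goal (illustrative, not from the deliverable)."""
    name: str       # what is measured
    goal: str       # the overarching project goal the KPI relates to
    target: float   # value decided on beforehand
    measured: float # latest snapshot value from an experiment

    def progress(self) -> float:
        """Progress toward the target as a fraction, capped at 1.0."""
        if self.target == 0:
            return 1.0
        return min(self.measured / self.target, 1.0)

# A snapshot view of project status: each KPI reports progress, not fulfilment.
kpis = [
    KPI("SMEs reached across borders", "cross-border networking", target=10, measured=4),
    KPI("Methods harmonised", "shared Living Lab approaches", target=5, measured=5),
]
for k in kpis:
    print(f"{k.name}: {k.progress():.0%}")
```

Capping progress at 1.0 reflects the point made above: exceeding a target does not mean the overarching goal is "more than fulfilled", only that this indicator no longer discriminates.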
3. APOLLON Criteria for the Evaluation Framework

In this section we present the data that form the background for the evaluation and assessment framework. This background data includes a summary of deliverable x.1, a base-line investigation, interviews with SMEs and Living Labs, as well as the description of work.

To ensure that the evaluation and assessment framework being developed within this task is of value for the vertical domains and the methodology development, the framework is built on the data collected so far. This means that we have used the expressed evaluation criteria from the base-line investigation, deliverable x.1 (Identification of Requirements) from the different work packages, results from the work packages' use of the research framework in the descriptions of their experiments, interviews with SMEs and Living Labs, and input from the project consortium as a basis for our framework. This material has been analysed with the aim to derive evaluation and impact assessment criteria for the evaluation framework.
3.1 Summary of the Base-Line Investigation among APOLLON Partners
The base-line investigation was a web-based questionnaire of 43 questions divided into five main categories with subcategories: General, Connect, Set Boundaries & Engage, Support & Govern, and Manage & Track. The questionnaire was pre-tested by one of the partners. After revision it was put online and an invitation was sent to the APOLLON partners. The questionnaire was answered by 16 of the partners, 6 of which are Living Labs, 7 SMEs and 3 other. The Living Labs that answered the questionnaire indicated that they mainly work locally (50%) or in their home country (50%), and that public and regional authorities are very significant for their operations. The Living Labs have SMEs and large corporations as their important customers. The SMEs are working in various fields, but software development stands out as one of the major areas of development.

The SMEs have the self-declared task of understanding the customers better so that their products are better understood. For the SMEs, the most important contributions to the networking activities within the projects they completed are the openness and knowledge they provide to the network. Openness of working is also mentioned as important in the list of essential competencies (questions 14 to 16). The most significant support expected in terms of networking activities is guidance and information delivery for one respondent, while the other SMEs did not indicate specific wishes. The Living Labs are more demanding in terms of methodology. They request, in different words, a description of the different methods and guidance on when they should be used (questions 17 & 18). This is reinforced by the answers to question 25, which also indicate the need for a unified approach.

All respondents of the questionnaire indicate that they are well able to work with technical tools. The level of importance differs little between the different groups, and the sample size is too small to find statistically significant differences. The tools that are considered very important are email, project portals, and video and phone conferencing. Other tools, like chat software, groupware such as Lotus Notes, or wiki pages, are important to a lesser extent.
The use of IPR in projects is of considerable importance for the participants. The answers indicate that there is a need for agreements before the project starts. This need is equally big for the SMEs as it is for the Living Labs. When looking at expectations, there is some difference between the Living Labs and the SMEs. The former show interest in the development of cooperation and partnership forming, whilst the latter focus more on new ideas for business and markets opening up abroad (question 38). These expectations are in line with what would be expected, and the evaluation of the projects should focus on these items.
3.2 Summary of Requirements from APOLLON Deliverable x.1
In order to facilitate the usage and implementation of the evaluation framework, we aimed to link it to the ongoing activities within the different experiments in the WPs. To accomplish this, we based our work on the deliverables x.1, Identification of Requirements; this section is a summary of those deliverables. In these deliverables, the different work packages have presented a list of requirements for the different experiments to be transferred between the different Living Labs. The deliverables have been analysed, and the relevant requirements are summarized in the table below.
Approach
- Health: Common eco-system model
- Energy: A common benchmark model
- eManufacturing: Common technology platform
- eParticipation: An integration framework

Research Focus
- Health: To what extent is a trans-national innovation system able to stimulate the adaptation of innovations successful in one country to another country
- Energy: Difficulties faced with product integration
- eManufacturing: Evaluate new ways of collaborating with partners
- eParticipation: What is needed for engaging users to participate; cultural differences

Research Question
- Health: How can we transfer a contextualised project into another cross-border project, and what issues are related to that
- Energy: Difficulties with users' culture and their surrounding environment
- eManufacturing: User co-innovation
- eParticipation: Which culturally specific issues are problematic when extending more innovative applications to a broader context

Method
- Health: Compare use of the platform in different LL contexts
- Energy: Results on the impact of regulatory environment, climate, culture and behaviour, compared between the different LLs
- eManufacturing: Networking between LLs
- eParticipation: Integration of different solutions' impact on market fragmentation

Expected Benefits
- Health: Market opportunities for SMEs by doing this transfer
- Energy: Potential of cross-border collaboration in terms of creating sustainable and feasible solutions for a broadly defined challenge
- eManufacturing: SMEs' transnational market opportunities
- eParticipation: The application's ability to answer to users'/citizens' needs

Data collection
- Health: Monitoring, interviews, questionnaires
- Energy: Study user behaviour change and mechanisms
- eManufacturing: Feedback on platform deployment and integration services
- eParticipation: Lingual and cultural misunderstandings

Result categories
- Health: Successful implementation; Finance; User experience; Domain/areas; Ways; Tools; Skills; Efforts; Dissemination activities
- Energy: Connecting with local stakeholders; Enhancement; Support
- eManufacturing: The interest of the partners of using LL networks
- eParticipation: Are results from one country coherent with results of others; Effectiveness of methods used; Can a shared set of tools and a common methodology extend the validity of national tests

Table 1. Expressed evaluation criteria from the different thematic domains. Summary of deliverables x.1.
4. APOLLON Evaluation Processes

As mentioned before, the aim of this deliverable is to provide evaluation frameworks both for the APOLLON methodology and for the cross-border networking process within the experiments in the thematic domains. Hence, this section presents the evaluation process for the respective evaluations. In accordance with the Living Lab approach as such, both processes build on user participation and a bottom-up approach. This means that the evaluations will be carried out in an interactive and iterative manner. Subsequently, a description of these two evaluation processes is given.
4.1 APOLLON Methodology Evaluation Process
In the APOLLON project, the cross-border networking process in the different experimental domains is supported by the developed APOLLON cross-border networking methodology. This methodology is developed continuously during the project; hence its different phases will be evaluated in an interactive and iterative manner. This assessment will be supported by continuous interactions between WP1
and the vertical domains regarding the implemented methodology for cross-border networking.
The specific objectives of the evaluation and assessment activity can be stated as follows:

- Observe and understand the progress and impacts of the APOLLON methodology among its stakeholders, and understand the determining factors, challenges and processes
- Observe and interpret the process of implementing the vertical domains' activities in the APOLLON methodology
- Evaluate the different patterns within the vertical domains and how these contribute to the creation of a validated cross-border methodology
- Assess the added value of cross-border networking among the relevant stakeholders: Living Labs, SMEs and large enterprises

This will be performed by the liaison persons (explained in more detail in section 6) from WP1, who collaborate closely with the vertical experiments. These persons will facilitate the usage of the methodology by explaining and suggesting suitable elements to be used from the methodology in accordance with the vertical experiment's phase. For example, if an experiment is planning to write agreements, the liaison person should inform the vertical that they can find support for this in the methodology checklists. The relevance and usage of the checklist will then be assessed in later interactions.
To be able to formatively assess these processes and their impact, WP1 strives to build its understanding of cross-border Living Lab networking in accordance with the following overarching themes:
- Interaction process (between Living Labs, SMEs and large enterprises in different countries)
- Stakeholder needs (which needs are related to cross-border networking among the relevant stakeholders)
- Activities (which key activities are performed during the networking)
- Tools (what kind of tools are needed and used to support the process)
- Results and effects of the cross-border collaboration (what happened during the process and what was the impact of it)
- Critical Success Factors (factors ensuring the sustainability of the collaboration)
This category is related to evaluating the methodology that is being developed in the APOLLON project. The aim of this methodology is to support the vertical experiments' cross-border networking activities. The evaluation is also related to identifying interaction processes, activities and tools, and will gather information concerning:
- Experienced benefits and challenges with cross-border activities from an overarching perspective
- Applied methodology/approach: which parts of the methodology have been used, the applicability of the methodology, and its support of the process of cross-border networking
- Effectiveness of the applied methodology and its support for the cross-border networking activities
- The efficiency of the usage of the methodology
- Accumulated process value and learning
4.1.2 Methodology Evaluation Process
The evaluation process of the methodology is designed as follows:
1. Based on the ongoing activities in work packages 2-5, the liaison person informs the work packages about the methodology and its support for their current tasks
2. The liaison person participates in the work-package meetings and listens to how they have used the methodology
3. The liaison person asks questions to make sure that the stages of the cross-border collaboration methodology are discussed and validated
4. The answers to the questions are gathered in the template suggested below
5. In accordance with the validation results, the methodology is adjusted
6. At the end of the project, the methodology as a whole is evaluated in discussions between the experiment leaders and WP1. This is a task for which the liaison person is responsible.
7. The results from the methodology evaluation are gathered in an evaluation report
The more specific template is found in section 7 below.
4.2 Evaluation of the Thematic Experiments
In task 1.3, the aim is not only to provide a framework for evaluating the methodology, but also to design an evaluation framework for the thematic domains' experiments, to grasp the activities and the results from these activities.
As mentioned previously, we have chosen a stakeholder-driven approach focusing on empowering the stakeholders involved in the cross-border collaboration processes. With this approach, we will form a methodology that stems from the learning and experiences from the experiments in the APOLLON project. Based on the description of work, the evaluations aim to assess the stakeholders' different perspectives on working in cross-border networking activities supported by Living Labs.
5. Research Framework for Evaluation of Cross-Border Networking Process within the Thematic Experiments
To facilitate the evaluation of the different experiments, a research framework has been developed to support the planning of the experiments. Therefore, in order to effectively apply the framework, we need to review the APOLLON research framework and align evaluation and data collection processes.
The APOLLON research framework is applied to the thematic experiments by answering the questions in each of the following classes; hence, this research framework is used as a support for the vertical work packages when they plan and design their experiments.
| Activities/Outputs | Build | Evaluate | Justify | Generalize |
|---|---|---|---|---|
| Constructs | What are the variables that you study? | What are the elements that you measure? | How do you decide best practices across the experiments? | How do you filter pilot-specific elements out? |
| Model | What are the basic assumptions, causalities and outcomes that you perceive? | What measures do you use to evaluate the validity of the assumptions? | What are the success criteria that you use? | How do you assess the wider applicability of the model? |
| Method | What is the process for validating the assumptions? | How do you evaluate and adjust the validation process? | How do you justify the use of selected methods? | How do you ensure the scalability and wider applicability of the methods? |
| Installation | Who are the stakeholders at your experiment? | How do you evaluate added value for each stakeholder? | How do you justify the selected collaboration model? | How do you compile recommendations for sustainability? |
Figure 2. Thematic experiments' focus and content communicated in categories of activities and outputs
By applying this framework, the work packages are supported in their process of defining the measures and key performance indicators of each experiment. This information will then be used as input to the evaluation framework of the thematic experiments. The answers will reflect the variables that are measured at project level by Tasks X.4 in the thematic experiments. This collected data will be fed back to the development of the APOLLON evaluation framework, and contribute to the creation of the final version of the document.
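To make the framework's use concrete, Figure 2 can be treated as a simple completeness checklist for an experiment plan. The sketch below is purely illustrative (the function and the plan structure are our own assumptions, not part of the APOLLON methodology); only two framework rows are shown, and Method and Installation follow the same pattern.

```python
# The research framework of Figure 2 as a checklist: each row maps each
# activity (Build/Evaluate/Justify/Generalize) to its guiding question.
# Only Constructs and Model are shown; Method and Installation are analogous.
FRAMEWORK = {
    "Constructs": {
        "Build": "What are the variables that you study?",
        "Evaluate": "What are the elements that you measure?",
        "Justify": "How do you decide best practices across the experiments?",
        "Generalize": "How do you filter pilot-specific elements out?",
    },
    "Model": {
        "Build": "What are the basic assumptions, causalities and outcomes?",
        "Evaluate": "What measures evaluate the validity of the assumptions?",
        "Justify": "What are the success criteria that you use?",
        "Generalize": "How do you assess the wider applicability of the model?",
    },
}

def missing_answers(plan):
    """List the (row, activity) cells that an experiment plan has not answered."""
    return [(row, activity)
            for row, cells in FRAMEWORK.items()
            for activity in cells
            if not plan.get(row, {}).get(activity)]

# A hypothetical plan that has only answered the Constructs/Build question:
plan = {"Constructs": {"Build": "User adoption and knowledge-transfer variables"}}
print(len(missing_answers(plan)))  # → 7
```

An experiment plan is complete when `missing_answers` returns an empty list; the vertical work packages could use such a check when designing their experiments.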
In this formative process we need to be in contact with the vertical experiments regularly. We need their contributions and experiences in order to provide them with usable advice on collaboration practices within their Living Lab network.
Initially, we propose the following practices:
1. Requirement collection from the thematic experiments
2. Dedicated Work Package 1 members as liaisons to the vertical experiments
3. Regular collaboration and formal meetings for iterative concept validation
4. A wiki as a platform to share insights and practices with the vertical experiments
The cross-border networking process in the different experimental domains will be supported and monitored by means of the developed cross-border networking methodology. This monitoring will be supported by continuous evaluation of the implemented networking methodology, based on the evaluation framework presented in forthcoming sections within this deliverable.
6. Evaluation Template for the APOLLON Cross-Border Methodology
This section provides the templates for data collection within each experiment in the thematic domains. For each vertical experiment, the evaluation framework will be applied by the liaison person in accordance with the experiment's ongoing activities. Hence, the activities will be matched to the phases of the methodology, which are: Connect, Set Boundaries & Engage, Support & Govern, and Manage & Track.

This template should be viewed as a self-assessment framework, where the question areas posed below are implemented as an evaluation carried out by the liaison person in the thematic experiments in the different work packages. The aim of this template is to facilitate knowledge sharing across the vertical domains and to support the development of the methodology from a bottom-up approach.
6.1 APOLLON Methodology Evaluation Framework Template
In this framework, people involved in the thematic experiments should contribute their experiences from their experiments in relation to the question areas suggested below. For example, if the experiment is focused on supporting and governing the cross-border process, the template for these activities should be filled in collaboratively by the experiment leader and the liaison person. The more specific questions within the parentheses should be considered as guidance to what kind of answer is sought; these do not have to be answered specifically. The answers to the questions are filled in continuously. The first part is more overarching and should be filled out in all the evaluation activities, to describe the context in which the evaluation is performed.
Methodology Evaluation

In this section, the aim is to evaluate the APOLLON cross-border methodology. This evaluation will be carried out continuously by the liaison person, in collaboration and dialogue between the different work packages and WP1.
Work-package number:
Experiment name (scope):
Introduction:
Describe the objective with the experiments: who was involved, and what kind of technology/knowledge etc. was transferred in the cross-border networking process.
Overarching activities and experiences:
Describe the process of the collaboration in the experiment: what has been done, how you communicated, who was in charge of the technology/knowledge transfer etc.:
- partners
- roles

What experiences were gained from involving partners from different countries and organisations? (Describe possible lessons learned from sharing knowledge and technology across borders. What kind of similarities, differences, problems, opportunities, strengths, weaknesses etc. have been experienced during the collaboration?)
8/7/2019 Apollon - Framework for Evaluation and Impact Assessment including KPI definition and measurement
19/38
Apollon Deliverable 1.3
ICT PSP Project Reporting Template 19 Final Version
Assessing the Connect Phase

In this section, the focus is to evaluate the Connect phase. In this phase, activities such as setting up the Living Lab network, identifying stakeholders, creating work plans and vision, determining the scope of the collaboration and the project owner, defining technical platforms, funding and contracts are carried out.

Note that not all questions are possible to answer in correlation to usage of the methodology; focus on the actual activities that were carried out during this collaboration phase, which then function as input to the final design of the APOLLON methodology.
Question Area | Lessons Learned
How did the partners involved in the cross-border experiment get in contact with each other? Consider activities such as:
- Setting up the Living Lab network
- Identifying stakeholders
- Creating work plans and visions
- Determining the scope of the collaboration and the project owner
- Defining technical platforms
- Funding and contracts
Have any parts of the APOLLON methodology to support the process of connecting between different stakeholders been used? (If not, why? If so, describe which parts of the methodology have been used and how they have been implemented.)
How do the suggested tools and templates support the process of cross-border collaboration and connecting between different stakeholders?
What kind of support is needed when different stakeholders want to get in contact with each other and to collaborate across borders?
Are the resources available to support the Connect phase as efficient as possible?
(Consider to what extent the resources have been used efficiently in the process. Is there anything that could have been carried out differently and more resource-efficiently? What is that? How could it be performed instead?)
Assessing the Set Boundaries and Engage Phase

In this section, the focus is to evaluate the Set Boundaries and Engage phase. In this phase, activities such as identifying partners, identifying risks and drivers in the cross-border project, creating a management plan and analyzing stakeholders, training cross-border partners, defining technical and IPR issues, and ensuring project team commitment are carried out.

Note that not all questions are possible to answer in correlation to usage of the methodology; focus on the actual activities that were carried out during this collaboration phase, which then function as input to the final design of the APOLLON methodology.
Question Area | Lessons Learned
How was the Set Boundaries and Engage phase carried out? Which activities are common when determining the scope of the project, as well as the processes for creating commitment among partners? Consider activities such as:
- Identifying partners
- Identifying risks and drivers in the cross-border project
- Creating a management plan / analyzing stakeholders
- Training cross-border partners
- Defining technical and IPR issues
- Ensuring project team commitment
Have any parts of the APOLLON methodology to support the process of setting boundaries and engaging between different stakeholders been used? If not, why? If so, describe which parts of the methodology have been used and how they have been implemented.
How do the suggested tools and templates in the APOLLON methodology support the process of setting boundaries and creating engagement between different stakeholders in cross-border networking?
What support for this phase is needed?
Are the resources available to support the Set Boundaries and Engage phase as efficient as possible?
(Consider to what extent the resources have been used efficiently in the process. Is there anything that could have been carried out differently and more resource-efficiently? What is that? How could it be performed instead?)
Assessing the Support and Govern Phase

In this section, the focus is to evaluate the Support and Govern phase. In this phase, activities such as managing stakeholders, selecting research methods, planning financial aspects, implementing technical infrastructure, supporting deployment, designing evaluation frameworks, and designing the operational model are carried out.

Note that not all questions are possible to answer in correlation to usage of the methodology; focus on the actual activities that were carried out during this collaboration phase, which then function as input to the final design of the APOLLON methodology.
Question Area | Lessons Learned
How was the Support and Govern phase carried out? Which activities are common to support and govern the cross-border collaboration process among partners? Consider activities such as:
- Managing stakeholders
- Selecting research methods
- Planning financial aspects
- Implementing technical infrastructure
- Supporting deployment
- Designing evaluation frameworks
- Designing the operational model
Have any parts of the APOLLON methodology to support the process of supporting and governing the cross-border collaboration between different stakeholders been used? If not, why? If so, describe which parts of the methodology have been used and how they have been implemented.
How do the suggested tools and templates in the APOLLON methodology support the process of supporting and governing cross-border collaboration between different stakeholders in the thematic domains?
What kind of support is needed for this phase?
Are the resources available to support the Support and Govern phase as efficient as possible?
(Consider to what extent the resources have been used efficiently in the process. Is there anything that could have been carried out differently and more resource-efficiently? What is that? How could it be performed instead?)
Assessing the Manage and Track Phase

In this section, the focus is to evaluate the Manage and Track phase. In this phase, activities such as assessing impact, revising the operational model, planning the business model, evaluating usage of technical platforms, and handover of responsibilities for a new pilot are carried out.

Note that not all questions are possible to answer in correlation to usage of the methodology; focus on the actual activities that were carried out during this collaboration phase, which then function as input to the final design of the APOLLON methodology.
Question Area | Lessons Learned

How was the Manage and Track phase carried out? Consider activities such as:
- Assessing impact
- Revising the operational model
- Planning the business model
- Evaluating usage of technical platforms
- Handover of responsibilities for a new pilot
Which activities are common when managing the cross-border collaboration process among partners? Which activities are common when tracking the results of a cross-border collaboration process among partners?
Have any parts of the APOLLON methodology to support the process of managing and tracking the cross-border collaboration between different stakeholders been used? If not, why? If so, describe which parts of the methodology have been used and how they have been implemented.
How do the suggested tools and templates in the APOLLON methodology support the process of managing and tracking cross-border collaboration between different stakeholders in the thematic domains?
What kind of support is needed for this phase?
Are the resources available to support the Manage and Track phase as efficient as possible?
(Consider to what extent the resources have been used efficiently in the process. Is there anything that could have been carried out differently and more resource-efficiently? What is that? How could it be performed instead?)
7. Template for Evaluating Cross-Border Networking Experiments
In this template, the aim is to support the evaluation of the impact of the cross-border networking activities carried out in the vertical experiments. Hence, this template strives to evaluate the added value, for different stakeholders, of being involved in cross-border networking activities in the APOLLON project.
The objective of this template is to support the process of evaluating the cross-border networking experiments carried out in the different work packages, to assess the value of Living Labs operating in networks for the involved stakeholders. This template should be used as an overarching framework that supports the evaluation of the vertical experiments, but it must be complemented with specific questions for each vertical experiment to deal with situational aspects. The template should be applied by the experiment leaders when assessing their experiments. The aim of the template is to ensure that the same areas are assessed in the thematic experiments. As a structure for the template, we have chosen to apply the components of Living Lab milieus, which are: Users and Partners, Management, Research, Innovation, ICT Tools and Infrastructure, and Approach (Bergvall-Kåreborn et al., 2009).

The key components of Living Labs are illustrated in Figure 3. Approach stands for the methods and techniques that emerge as best practice within the Living Lab environment. The Living Lab Partners & Users bring their own specific wealth of knowledge and expertise to the collective, helping to achieve boundary-spanning knowledge transfer. The ICT & Infrastructure component outlines the role that new and existing ICT technology can play to facilitate new ways of cooperating and co-creating new innovations among stakeholders. Research symbolizes the collective learning and reflection that take place in the Living Lab, and should result in contributions to both theory and practice. Technological research partners can also provide direct access to research which can benefit the outcome of a technological innovation. Finally, Management represents the ownership, organization, and policy aspects of a Living Lab; a Living Lab can be managed by e.g. consultants, companies or researchers (Bergvall-Kåreborn et al., 2009).
Figure 3: Living Lab Milieu Key Components
Evaluation Template of Cross-Border Networking Experiments

This evaluation framework aims to support the evaluation of the cross-border networking experiments. The template is divided into six different sections: Background, Approach, Partners & Users, ICT & Infrastructure, Research, and Management.
Related to each section, questions are asked where a value needs to be answered; each of these values then needs to be related to a source where it can be recaptured (a deliverable, a data collection etc.), and finally the impact of the results should be stated. The impact is estimated by the experiment leaders in relation to the ratio of the ordinary values. For instance, if a methodology has been transferred between partners, the output (effects marked in the grey area) of this methodology might be new processes that increased the number of successful technology implementations by 5% in relation to the ordinary implementation ratio.

There will also be a number of questions which demand answers of a more qualitative character; these are recognised by the large writing section.
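As a concrete illustration of the impact column, the calculation behind the "% ratio of ordinary values" can be sketched as follows. The record structure and field names below are hypothetical, introduced only to show the arithmetic; they are not prescribed by this template.

```python
from dataclasses import dataclass

@dataclass
class KpiEntry:
    """One row of the evaluation template (illustrative structure only)."""
    measure: str    # e.g. "No of successful technology implementations"
    value: float    # output observed in the cross-border experiment
    baseline: float # the "ordinary" value outside the experiment
    source: str     # measurement tool, e.g. a deliverable number or interview

    def impact_pct(self) -> float:
        """Impact expressed as a % ratio of the ordinary (baseline) value."""
        return (self.value - self.baseline) / self.baseline * 100.0

# Matching the example in the text: implementations rose from 20 to 21, i.e. +5%
entry = KpiEntry("successful technology implementations", 21, 20, "interviews")
print(f"{entry.impact_pct():.0f}%")  # → 5%
```

Questions of a qualitative character would of course bypass this calculation and be reported as free text instead.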
Background Information

In the subsequent rows, some background data is required to set the evaluation in the right context.
- WP number
- Experiment description
- Involved partners
- Number of countries involved in the experiment
- Type of cross-border activities that have been carried out in the experiments
- Purpose of the cross-border activities (expected outcome)
- Experienced strengths of working in cross-border collaboration experiments
- Experienced challenges of working in cross-border collaboration experiments
Approach

Approach refers to the methods and techniques that have been used to support the cross-border collaboration in the APOLLON project. Hence, it has a broader scope than what is usually assessed in Living Lab activities.

In the following table, some questions require a numerical value while others are of a more descriptive character. Thus, not all questions will have a numerically measurable impact, but if other impact has been observed it should be filled in.
| Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc.) | Impact (e.g. % ratio of ordinary values, or qualitative impacts) |
|---|---|---|---|---|
| Approach (the lines that only have one column to fill in aim at gathering qualitative data) | No of cross-border activities | | | |
| | No of intellectual products (methodologies, know-how etc.) transferred in the experiment | | | |
| | No of technology transfer activities | | | |
| | Which methods were used in the experiment? Please name and/or shortly describe the methods | | | |
Partners and Users

The section Partners & Users refers to those who have been involved and brought their own specific wealth of knowledge and expertise to the project, and thus helped to achieve cross-border networking experiments.

In the following table, some questions require a numerical value while others are of a more descriptive character. Thus, not all questions will have a numerically measurable impact, but if other impact has been observed it should be filled in.
| Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc.) | Impact (% ratio of ordinary values) |
|---|---|---|---|---|
| Partners: Users (the lines that only have one column to fill in aim at gathering qualitative data) | No of users that have been involved in the experiment | | | |
| | No of user involvement activities | | | |
| | No of new ideas that emerged from the cross-border collaboration with users | | | |
| | No of implementations of e.g. new functions as a result of the cross-border collaboration with users | | | |
| | No of redesigns of products/services as a result of the cross-border collaboration with users | | | |
| | User engagement activities in detail (e.g. usability evaluation, behaviour change studies, user experience evaluations etc.) | | | |
| | What was the users' role in the cross-border collaboration activities? | | | |
PARTNERS: SME

In this section, the aim is to evaluate the SME engagement and the added value of their participation for them as SMEs.

| Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc.) | Impact (% ratio of ordinary values) |
|---|---|---|---|---|
| Partners: SME (the lines that only have one column to fill in aim at gathering qualitative data) | No of SMEs involved in the experiment | | | |
| | No of SME engagement activities | | | |
| | No of new international partners | | | |
| | No of signed letters of intent between partners and/or customers | | | |
| | No of new businesses generated in other countries | | | |
| | No of new business proposals | | | |
| | No of new customers in other countries | | | |
| | Did the cross-border collaboration lead to increased turnover? (Yes / No / I do not know / Not relevant in this experiment) | | | |
| | Did the cross-border collaboration lead to increased customer retention? (Yes / No / I do not know / Not relevant in this experiment) | | | |
| | SME engagement activities in detail (e.g. developing technology, user tests, implementation of technology etc.) | | | |
| | What was the role of the SME in the cross-border collaboration? | | | |
PARTNERS: Large Enterprises

In this section, the aim is to evaluate the Large Enterprises' engagement and the added value of their participation for them as Large Enterprises.

| Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc.) | Impact (% ratio of ordinary values) |
|---|---|---|---|---|
| Partners: Large Enterprise (the lines that only have one column to fill in aim at gathering qualitative data) | No of LEs involved in the experiment | | | |
| | No of LE engagement activities | | | |
| | No of new international partners | | | |
| | No of signed letters of intent between partners and/or customers | | | |
| | No of new businesses generated in other countries | | | |
| | No of new business proposals | | | |
| | No of new customers in other countries | | | |
| | Did the cross-border collaboration lead to increased turnover? (Yes / No / I do not know / Not relevant in this experiment) | | | |
| | Did the cross-border collaboration lead to increased customer retention? (Yes / No / I do not know / Not relevant in this experiment) | | | |
| | LE engagement activities in detail (e.g. developing technology, implementation of experiments etc.) | | | |
| | What was the LE role in the cross-border experiment? | | | |
PARTNERS: Local Authorities

In this section, the aim is to evaluate the local authorities' engagement and the added value of their participation for them as local authorities.

| Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc.) | Impact (% ratio of ordinary values) |
|---|---|---|---|---|
| Partners: Local Authorities (the lines that only have one column to fill in aim at gathering qualitative data) | No of local authorities involved in the experiment | | | |
| | No of local authority engagement activities | | | |
| | No of new international partners | | | |
| | No of signed letters of intent between partners and/or customers | | | |
| | No of new businesses generated in other countries | | | |
| | No of new business proposals | | | |
| | No of new customers in other countries | | | |
| | Did the cross-border collaboration lead to increased turnover? (Yes / No / I do not know / Not relevant in this experiment) | | | |
| | Did the cross-border collaboration lead to increased customer retention? (Yes / No / I do not know / Not relevant in this experiment) | | | |
| | Local authority engagement activities in detail (e.g. implementation of experiments, experimental settings etc.) | | | |
| | What was the local authorities' role in the cross-border collaboration experiment? | | | |
Technology and Infrastructure

The ICT & Infrastructure component outlines the role that new and existing ICT technology can play to facilitate new ways of cooperating and co-creating new innovations among stakeholders.

In the following table, some questions require a numerical value while others are of a more descriptive character. Thus, not all questions will have a numerically measurable impact, but if other impact has been observed it should be filled in.

In the questions where answers of Yes and No character are asked for, please respond according to the experiences from the experiments.
| Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc.) | Impact (% ratio of ordinary values) |
|---|---|---|---|---|
| Technologies | No of products that have been transferred in the experiment | | | |
| | No of cross-border collaboration tools that have been used in the experiment | | | |
| | No of NEW (for the stakeholders) ICT tools that have been used in the experiment | | | |
| | No of distributed cross-border collaboration activities | | | |
| | Did the cross-border collaboration tools you used lead to increased access to relevant information? (Yes / No / I do not know / We didn't use any collaborative tools) | | | |
| | Did the cross-border collaboration tools you used lead to increased effectiveness in communication? (Yes / No / I do not know / We didn't use any collaborative tools) | | | |
| | Did the cross-border collaboration tools you used lead to increased co-creation of innovations among stakeholders? (Yes / No / I do not know / We didn't use any collaborative tools) | | | |
| | Which collaboration tools have been used to support the cross-border collaboration in the experiment? | | | |
Research

Research symbolizes the collective learning and reflection that take place in the Living Lab, and should result in contributions to both theory and practice.

In the following table, some questions require a numerical value while others are of a more descriptive character. Thus, not all questions will have a numerically measurable impact, but if other impact has been observed it should be filled in.

In the questions where answers of Yes and No character are asked for, please respond according to the experiences from the experiments. This is not an exact measure; it rather strives to gather the impressions of the impact.
| Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc.) | Impact (% ratio of ordinary values) |
|---|---|---|---|---|
| Research | No of research activities that have been performed during the experiment | | | |
| | No of authored journal articles | | | |
| | No of authored conference papers | | | |
| | No of research conference presentations | | | |
| | No of new research projects initiated | | | |
| | Did the cross-border collaboration lead to increased comparability of Living Lab research? (Yes / No / I do not know / Not relevant for our experiment / We didn't do any research) | | | |
Management

Management represents the ownership, organization, and policy aspects of Living Labs. In this project, the aim is also to define the role of the Living Lab in the cross-border collaboration, as well as the impact of the project on local Living Labs and on ENoLL.

In the following table, some questions require a numerical value while others are of a more descriptive character. Thus, not all questions will have a numerically measurable impact, but if other impact has been observed it should be filled in.

In the questions where answers of Yes and No character are asked for, please respond according to the experiences from the experiments. This is not an exact measure; it rather strives to gather the impressions of the impact.
| Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview, etc.) | Impact (% ratio of ordinary values) |
|---|---|---|---|---|
| MANAGEMENT: Living Lab Management Role | No of Living Labs that have been involved in the experiment | | | |
| | No of new collaboration initiatives between Living Lab network members (planned, prepared or submitted) | | | |
| | No of new Living Lab network members | | | |
| | Did the cross-border collaboration lead to increased access to user communities in other countries? | Yes / No / I do not know / Not relevant for our experiment | | |
| | Did the cross-border collaboration lead to an increased value proposition to the stakeholder community? | Yes / No / I do not know / Not relevant for our experiment | | |
| | Did the cross-border collaboration lead to increased learning about Living Lab collaboration in networks? | Yes / No / I do not know / Not relevant for our experiment | | |
| | Did the cross-border collaboration lead to increased maturity of Living Lab management? | Yes / No / I do not know / Not relevant for our experiment | | |
| | The Living Lab's role in the cross-border collaboration activities (what have been the responsibilities of the Living Lab) | | | |
| | The experiment's impact on local policies (describe the actual and expected impact of the experiment on local policies) | | | |
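To illustrate how the Yes/No responses above might be aggregated across experiments into an overall impression, the following sketch tallies answers per question. All names and data here are hypothetical; the deliverable prescribes no particular tooling.

```python
from collections import Counter

# Allowed answers for the management impact questions (hypothetical encoding).
ANSWERS = {"Yes", "No", "I do not know", "Not relevant for our experiment"}

def tally(responses):
    """Count answers per question across experiments.

    `responses` is a list of {question: answer} dicts, one per experiment.
    Returns a dict mapping each question to a Counter of its answers.
    """
    totals = {}
    for response in responses:
        for question, answer in response.items():
            if answer not in ANSWERS:
                raise ValueError(f"unexpected answer: {answer!r}")
            totals.setdefault(question, Counter())[answer] += 1
    return totals

# Example: two experiments answering the same question.
data = [
    {"Increased access to user communities?": "Yes"},
    {"Increased access to user communities?": "I do not know"},
]
print(tally(data)["Increased access to user communities?"]["Yes"])  # prints 1
```

Because the answers are impressions rather than exact measures, a simple per-question tally like this is usually enough to report the spread of experiences across experiments.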
8. Template for APOLLON Experiment-Specific KPIs

In this section we want the experiment leaders to fill in the relevant and context-dependent key performance indicators for the specific cross-border collaboration experiment. These KPIs should be consistent with the description of the experiments in del x.2 & x.3.
Experiment-Specific Key Performance Indicators
In this section, the experiment leader and the task leader of x.4 should fill in the key performance indicators that are relevant and specific to each individual experiment.
| KPI (An overarching description of the Key Performance Indicator) | Measures (Define the measure you use to measure the KPI) | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interviews, etc.) | Impact (e.g. % ratio of ordinary values, or qualitative impacts) |
|---|---|---|---|---|
| | | | | |
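For the quantitative KPIs, the "% ratio of ordinary values" impact column can be read as the measured value expressed as a percentage of a baseline ("ordinary") value. A minimal sketch of one KPI row, with hypothetical field names and example data:

```python
from dataclasses import dataclass

@dataclass
class KPI:
    """One experiment-specific KPI row (field names are illustrative)."""
    name: str        # overarching description of the indicator
    measure: str     # how the KPI is measured
    value: float     # measured output
    baseline: float  # ordinary (pre-experiment) value
    source: str      # measurement tool, e.g. deliverable number or interview

    def impact_ratio(self) -> float:
        """Measured value as a percentage of the ordinary value."""
        return 100.0 * self.value / self.baseline

kpi = KPI(
    name="New Living Lab network members",
    measure="Count of members joining during the experiment",
    value=6,
    baseline=4,
    source="interview",
)
print(f"{kpi.impact_ratio():.0f}%")  # prints 150%
```

Qualitative impacts, of course, have no baseline to divide by; for those the Impact column holds a short description instead of a ratio.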