Measuring the Innovation Return on S&T Investments
Janet E. Halliwell
Presentation overview
- The big picture: measuring the return on S&T investments (ROI)
- Context, challenges and issues, variables, methodologies, trends
- Innovation, and what our understanding of it means for assessing ROI
- The data challenge
- Wrap-up
The ROI context
- Increasing pressure to measure the impacts of public S&T investments (nothing new here!)
- What do researchers tend to think about (especially for P&T)?
  - Inputs (e.g., funding raised)
  - Research productivity
  - Numbers of HQP
  - Perhaps commercialization
- What are institutions most interested in?
  - Competitiveness for students, prestige and funds
  - Costs: impacts on their bottom line
- What is the larger public interest?
  - Quality of the PSE system
  - Larger social and economic impacts from R&D and service
ROI measurement challenges
- Very diverse language and expectations about what is meant by ROI
- No universal framework or universally applied methodologies
- Measurement of ROI needs to encompass diverse dimensions of impact:
  - Economic (e.g., jobs, new products and services, spin-off companies, business process change)
  - Social and health (e.g., changes in policy and practice, improved outcomes, costs avoided)
  - Environmental (e.g., reduced footprint and environmental impact, branding Canada green)
- Beyond the practical stumbling blocks of measuring these, the interpretation of any measure is non-trivial
- And all of the above does not necessarily capture the full impact on innovation or the innovation system
Some issues
- ROI measurement requires us to think about what happens down the value chain as a result of research and research-related activities, beyond the quality, volume and influence of research on other research: what difference did this S&T investment make in the real world?
- ROI measurement is NOT a classical economic I/O study (which measures the flow of monies resulting from an activity, regardless of what that activity is)
- A theoretically sound ROI method is poor if key stakeholders are not consulted or don't understand it
Variables to think about
- Your audience/target
- Scope and level of aggregation
- Distance down the value chain of outputs, outcomes and impacts
- Time scale (how far back)
- Methodologies
- Desired detail; how to communicate (e.g., visualize)
- Balancing accuracy, longevity, comparability and ease of collection of metrics
Downstream measurement …
Categories relevant to innovation include (at least):
- Direct: using research findings for better ideas, products, processes, policies, practice, etc.
- Indirect: from participation in the research, including HQP training, KT, tacit knowledge, better management, etc.
- Spin-off: using findings in unexpected ways and fields
- Knock-on: arising long after the research is done
Also very important: outcomes that foster an environment in which innovation flourishes
Example methodologies
Quantitative:
- Surveys
- Bibliometrics, including publication counts, citation analysis, data mining, international collaboration analysis, social network analysis
- Technometrics, including hot-spot patent-publication linkages
- Economic rate of return, at micro and macro levels
- Sociometrics
- Benchmarking
Qualitative:
- Peer review, merit/expert review
- Case study method: exploratory, descriptive, explanatory
- Retrospective documentary analysis
- Interviews
Mixed models (e.g., Payback, OMS, CAHS)
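As a toy illustration of the bibliometric end of this list, a short script can compute a publication count, total citations, and an h-index from per-paper citation counts. The data below is invented purely for illustration; a real analysis would draw on a citation database.

```python
# Toy bibliometric sketch: publication count, total citations and
# h-index from per-paper citation counts. The numbers are invented
# for illustration only.

def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

papers = [12, 7, 5, 5, 3, 1, 0]  # hypothetical citation counts
print("publications:", len(papers))
print("total citations:", sum(papers))
print("h-index:", h_index(papers))  # prints 4
```

Even this trivial example shows why bibliometrics alone is insufficient for ROI: it captures influence on other research, not the downstream difference the investment made.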
Trends
- Mixed methods
- Increasing attention to networks, linkages, collaborations
- Global frame of reference
- Involvement of stakeholders inside and external to the R&D unit
- External focus: e.g., short-term external impacts for industry or government, rather than immediate outcomes for the R&D organization
What is innovation?
“Doing new things in new ways” (Tom Jenkins)
“Innovation” is (or should be) a very broad term, BUT …
- Many studies focus only on the metrics that are easiest to measure, not on innovation-relevant issues
- Or they focus exclusively on industrial impacts, such as sales of new products
So …
- Encourage other routes and types of impacts
- Attempt to measure them, including cost savings
- Encourage “end-goal” thinking
And remember: innovation comes in many different guises, e.g.:
- Incremental innovation
- Integrated innovation
- Open innovation
- Informal innovation
- Social innovation
- Design as innovation
Consider measurement in the innovation context
The macro-level messages
- Measurement is important, but NOT just any measurement
- Measurement needs to be grounded in a strong conceptual framework (a logic model) connecting activities, ultimate goals, intended uses, and targeted users
- Then look at the relationships of outcomes and impacts with innovation in your sector or sphere of activity
- Measurement is BOTH qualitative and quantitative
- Proper measurement often takes deliberate effort and time
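A logic model of the kind described above can be sketched as a simple chain of stages, so that any metric can be traced from activity through to intended impact. All entries below are invented placeholders, not from the presentation:

```python
# Minimal logic-model sketch: each stage feeds the next, so a metric
# collected at any stage can be traced back to an activity and forward
# to an intended impact. Entries are invented placeholders.
logic_model = {
    "activities": ["fund collaborative projects", "train HQP"],
    "outputs":    ["publications", "trained graduates"],
    "outcomes":   ["industry uptake of findings", "graduates hired in sector"],
    "impacts":    ["new products and services", "stronger innovation capacity"],
}

for stage, items in logic_model.items():
    print(f"{stage}: {', '.join(items)}")
```

The point of writing the chain down explicitly is that a metric with no place in it (however easy to collect) is probably not worth collecting.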
Why quantitative and qualitative?
- Quantitative: for understanding the reach, scope and importance of impacts
- Qualitative: for the how and why of impacts, barriers and solutions, incrementality and attribution, governmental, societal and environmental effects, etc.
- There is no such thing as a purely quantitative system that measures the full impacts of S&T
Reporting SE impacts
Impact investigated and described in:
- Narrative fashion (qualitative impacts)
- Quantitative terms (quantitative impacts)
- Dollar terms (economic impacts)
And …
- Consider “outcome mapping” a la IDRC, where the focus is on people and organizations
- Go beyond simply assessing the products of a project/program to focus on changes in the behaviours, relationships, actions and/or activities of the people and organizations with whom a program works
- This is a learning-based and use-driven approach
- It recognizes the importance of active engagement of players in adaptation and innovation: “productive interactions”
The data challenge (1)
- Need a good MIS at the levels of researcher and project/activity, one that connects with your Network/Centre goals
- Need to integrate into the MIS your reporting requirements, accountability plans, and Centre/Network self-monitoring/self-learning
- Tie the MIS to the performance measurement system by having automatic reports produced, red flags raised, etc.
- Need a foundation of data standards
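As a sketch of the “automatic reports and red flags” idea, a performance-measurement layer on top of the MIS might compare each project's recorded outputs against its targets and flag shortfalls. The field names, project names and numbers here are all invented for illustration, not drawn from any real MIS:

```python
# Hypothetical red-flag check: flag any (project, metric) pair where
# the recorded actual falls below target. All names and numbers are
# invented placeholders for illustration.

def red_flags(projects):
    """Return (project, metric) pairs where actual < target."""
    flags = []
    for p in projects:
        for metric, target in p["targets"].items():
            if p["actuals"].get(metric, 0) < target:
                flags.append((p["name"], metric))
    return flags

projects = [
    {"name": "Project A",
     "targets": {"publications": 5, "hqp_trained": 3},
     "actuals": {"publications": 6, "hqp_trained": 1}},
    {"name": "Project B",
     "targets": {"publications": 2},
     "actuals": {"publications": 2}},
]

for name, metric in red_flags(projects):
    print(f"RED FLAG: {name} is below target on {metric}")
```

In practice the value of such a check depends entirely on the data standards beneath it: if projects record "publications" in incompatible ways, the red flags are noise.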
The data challenge (2)
Standards can help:
- Reduce the burden on researchers
- Facilitate the interface with funders
- Access cost-effective software solutions
- Enable comparisons with yourself over time
- Enable comparisons with other institutions
- Support international benchmarking
CASRAI is a large part of the standards picture
Customized and flexible methodologies
- There are plenty of metrics and methods available; no need to invent more (although you will likely need to intensify your data collection)
- It's how metrics and narrative are used and combined that makes the difference
- No “one size fits all”: no method or metric works for all types of S&T, for all types of organizations, or for all uses and users
- All methods have substantial strengths and substantial weaknesses
- Involve key stakeholders, and remember that innovation requires many players, not just the R&D team
Measurement can make a difference
- Accountability and advocacy: making the case on the basis of outcomes and impacts:
  - For overall program funding
  - For the nature and dynamics of the staff complement
- Self-awareness and understanding:
  - Internal: strengths, weaknesses, gaps
  - External: threats, opportunities
- Forward directions/areas needing attention
- Fine-tuning the strategic vision
- Fostering sustainable relationships
Finally …
To achieve these objectives, you need:
- Good (and visionary) governance
- Good management and a capable staff complement
- A robust database with in-house expertise
- Active engagement of players in using the outcome measures for adaptation and innovation

Measurement is “quantum”: it changes the system, and you tend to get what you ask people to measure