
PRODUCTIVE EVALUATION OF

SOCIALLY ROBUST RESEARCH

re-thinking evaluation

TD-conference Bern, 16-09-2011
Jack Spaapen


FOUR SECTIONS

1. What kind of research are we talking about?

2. What are the problems when evaluating this kind of research?

3. Answers offered by SIAMPI (and ERiC) [productive interactions]

4. What is needed for a ‘productive’ evaluation of transdisciplinary, socially robust research?



1. WHAT KIND OF RESEARCH ARE WE TALKING ABOUT?

Mode 2 research (not purely academic; context of application; interaction with stakeholders; iterative process; interdisciplinary input: technical expertise, content, use/behavior, etc.)

“Knowledge” that emerges in TD networks is scientifically reliable, but also ‘socially robust’. Researchers are aware of social context.

Socially robust = relative to the social context, and subject to testing and validation by a variety of stakeholders.

Robustness is produced when research has been infiltrated and improved by social knowledge.


EXAMPLES OF TD/SR RESEARCH

Power plants by the Great Lakes - USA

Primary Health care in the Netherlands [SIAMPI]

‘Top sectors’ in the Netherlands [knowledge – skill – cash!]

Mining industry in Argentina [SIAMPI]

17th century theatre studies in Spain [SIAMPI]

Green cars [EU Framework Programs]

Creative industry, new media, (serious) games [top sector in NL]

Green meat production in 2020 in NL

Healthy aging [everywhere]

Water management (New Orleans)


TOP SECTORS, NEW DUTCH INNOVATION POLICY

10 top sectors: health, chemistry, high tech, agro-food, horticulture, water, energy, creative industry, logistics and head offices

10 top teams, dominated by industry, but also SMEs, government and research

Innovation contracts before the end of 2011

Reorientation of 500 M€, no ‘new’ money

Philosophy: strong in global competition, local/regional strength (Wageningen food valley), knowledge – skills – cash

Main policy measures: fiscal advantages, export missions



MAIN CHARACTERISTICS OF TD/SR RESEARCH

Triple helix, golden triangle: research, industry, society (government, NGOs, general public), new collaborative arrangements: PPP

Input from different disciplines (science/technical fields, social sciences, humanities) and from a variety of stakeholders (technical knowledge, content, use/behavior)

Consensus about long-term goals (“clean energy”), but in the meantime shifting coalitions, different partners, different intermediate goals, different interests

Co-creation of ‘knowledge’ or results (= something reliable that you need to gain insight and solve problems), research by design, iterative process (non-linear), using open source, open innovation


DIFFICULTIES WITH TD RESEARCH

Attuning different goals, expectations, interests of partners (big industry, SMEs, local/national/supranational politics, public debate)

Different academic disciplines have different "cultural" habits in publishing, organization etc., not independent of the surrounding society (publish or perish, ‘top sector’ policy, ethics, expectations)

Partners are not used to working together in these transepistemic arenas (academics and industry not knowing what to formulate as research questions); institutional problems, political and cultural problems….

“In conflict-intense policy contexts, research results are hard to establish” (Cozzens and Snoek 2010)


2. PROBLEMS WITH EVALUATION

Traditional ways don’t work (peer review, bibliometrics, patent analysis) – based on linear thinking and measuring the wrong thing (journal- and ranking-oriented)

New ways are underdeveloped or too elaborate:

- case studies (labor intensive, comparability)
- (social) impact analysis (attribution, temporality)
- gap between theory and practice

What to ‘measure’? (complex process, moving target)

Gap between theory and practice of evaluation


RIFT BETWEEN THEORY AND PRACTICE

The current state of the art in evaluation practice for measuring policy impacts does not match the concepts that are most common in the research literature to describe the connections between knowledge and policy.

The dominant concept in evaluation practice is linear, framed by logic models and the terminology of inputs, activities, outputs, and outcomes, sometimes with a loop back to planning. The policy process itself is a black box.

In contrast, the dominant concept in the research literature is the network or system, which is made up of many small conversations, interactions, and adjustments among a diverse set of actors; and complex concepts of the ebbs and flows of the policy process itself are incorporated. [Cozzens and Snoek, 2010]


INTERACTIVE PROCESS IN NANO RESEARCH (EE), SIAMPI © TILO PROPP [figure]



KNOWLEDGE CIRCULATION IN ARCHITECTURE, ERiC PROJECT © PETER VD BESSELAAR [figure]


Indicators to capture the societal quality of architecture and building research, proposed by a mixed panel of researchers and stakeholders:

1. Dissemination of knowledge into society: professional publications, non-academic publications, exhibitions, etc.
2. Spread of technology, artefacts, standards
3. Advisory and consultancy activities
4. Popularisation, education and contribution to public debate
5. Professional training, mobility of graduates
6. Master’s dissertations and graduation projects that address questions from practitioners
7. Interest of stakeholders
8. Number of researchers with relevant practical experience in the sector(s) that the research programme targets
9. Public funding related to societal issues
10. Funds from contract research commissioned by potential users
11. Collaboration with societal stakeholders on research, tests and evaluations
12. Consortiums with non-academic organisations
13. Impact and use of results; income from use of results
14. Visibility in public debate/public media rankings


MANY ATTEMPTS TO BRIDGE THE GAP…..

Review: Cozzens and Snoek (2010)

Dutch Academy initiated studies (2005, 2008, 2010, 2011); national developments in the Netherlands: Standard Evaluation Protocol (SEP), Evaluating Research in Context (www.eric-project.nl)

Other European countries: Denmark (Hanne Foss Hansen), UK research councils; also in the USA (broader impact criteria)

EU SIAMPI project on social impact measurement (2009-2011)

....BUT NOT VERY SUCCESSFUL YET

Lack of insight into the issues of interaction and collaboration in complex constellations

Hesitant policy makers and research boards who love hard metrics, but at the same time face pressure for more ‘socially relevant’ research (valorisation)

Pressure from increasing global competition: politicians want excellent research and profitable results (top sector policy)


METHODOLOGICAL PROBLEMS WITH ASSESSING TD/SR

- Attribution (who is responsible for which result in such TD clusters)

- Temporality (processes tend to continue over time, players trade places)

- What is a good result for one partner is not necessarily good for another

- Different accountability and reward systems

- Lack of robust data



3. ERiC and SIAMPI projects

ERiC = Evaluating Research in Context, a Dutch initiative to develop methods for social impact evaluation

SIAMPI = FP7 project to develop methods for social impact assessment, http://www.siampi.eu/

• Focus on what goes on between research and its context: what kind of interaction, what role for stakeholders, which indicators?
• Can we find ways to evaluate TD research and the social impact of research?
• Feasibility


PRODUCTIVE INTERACTIONS

Productive interactions are exchanges of knowledge and expertise between science and society that result in behavioral change → socially robust research

Change can be broad: from a different way of thinking, following a different method, extending or intensifying collaboration, to actual use by stakeholders (including other researchers)

‘productive’ = producing change in social relations
not in an economic sense but in a social sense


THREE INDICATOR CATEGORIES OF PI

• Direct (personal) interactions: joint projects, advisory, consultancy, double functions, mobility

• Indirect interactions:
  - Texts: articles, books, catalogues, protocols
  - Artifacts: instruments, exhibitions, models, designs

• Money: contracts, subsidies, patenting, licensing
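
As an illustration of how these three categories could be operationalized when coding interactions in a case study, a minimal sketch follows; the data model, category labels and example entries are assumptions for illustration, not part of the SIAMPI toolset:

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical record of one exchange between researchers and a stakeholder.
@dataclass
class Interaction:
    description: str
    category: str      # "direct", "indirect", or "financial"
    stakeholder: str

# Invented entries for a single case study, for illustration only.
interactions = [
    Interaction("joint project with patient organisation", "direct", "patient group"),
    Interaction("clinical guideline text", "indirect", "GP association"),
    Interaction("contract research for ministry", "financial", "government"),
    Interaction("advisory board membership", "direct", "industry"),
]

# Tally interactions per category; such counts could serve as proxy indicators.
counts = Counter(i.category for i in interactions)
for category in ("direct", "indirect", "financial"):
    print(f"{category:10s}: {counts.get(category, 0)}")
```

The point of such a tally is not the numbers themselves but making the pattern of interactions visible and discussable with stakeholders.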


STEPS IN SIAMPI CASE STUDIES

1. Identify productive interactions and see what role they play in collaboration and achieving social impact

2. Construct different research ‘profiles’, based on productive interactions, recognition of different missions, policy contexts, etc.

3. Experimental development of indicators that represent the different profiles; do this together with researchers and stakeholders (focus groups), and try to find some common and some specific indicators

4. Broad discussion of mission and strategy; a learning type of evaluation


RESULTS PER CASE STUDY

We paid attention to differences in:

• National research cultures and policies
• Research mode
• Research type
• Productive interactions
• Social impact

Research domain: Nano
- Country: NL, France
- Research type: frontier, basic, strategic
- Research mode: academic, in collaboration with industry
- Productive interactions: public understanding, ethical debates, policy making, products
- Social impact: health, safety, public acceptance of nano tech

Research domain: ICT
- Country: UK, NL, EU
- Research type: basic, applied, TD
- Research mode: open to partnership of knowledge producers and users
- Productive interactions: transport use, security, interaction between citizens and government
- Social impact: transport use, security, interaction between citizens and government

Research domain: Health care
- Country: NL
- Research type: strategic, applied, policy
- Research mode: academic, open to collaboration with industry, government, patient groups, professionals
- Productive interactions: consultation, collaboration, regulations, protocols, commercial exchanges, PPPs, post-academic training, patient organizations
- Social impact: diagnostics, treatments, safety, general health, policy advice

Research domain: SSH
- Country: Esp, UK
- Research type: basic, strategic, applied
- Research mode: academic, open to collaboration with policy, institutions, wider public, industry
- Productive interactions: informal links and advice, formal research contracts and collaborative projects, consultancy, cultural events
- Social impact: policy tools and techniques, management methods, cultural goods and services


HEALTH CASE: HIGHLY ORGANISED FORM OF SOCIALLY ROBUST RESEARCH

NIVEL mission includes applied research for policy makers

Stakeholder contacts actively organised by the board in a strategic consultancy process (top down)

Discussion of research results with stakeholders is required:
- to safeguard financial support
- to enhance the chance of implementation of results


SOCIAL SCIENCES CASE: SERENDIPITOUS PATTERN, BUT GOAL ORIENTED

BRASS, an ESRC funded research centre

General goal to achieve social impact, but researchers were also motivated on a personal level

Multiple roles, part researcher, part political activist

Chance pattern, one project leads to another, almost by chance encounter

Political implications, not typical for SSH


NANO CASE: INTERMEDIARY CHECK POINTS

Promises and expectations galore in this new field

No direct links with end-users

Long networks of circulation

Intermediary check points (proxy indicators) towards social relevance


ICT CASE: CONTRIBUTIONS INSTEAD OF ATTRIBUTION

Insight into who is contributing what is more effective than trying to reconstruct attribution over a long period

It took a decade from the start of the spin-off (1997) to develop semantic technology that was bought by the market and used successfully

The research system is not patient enough: what do you say to a researcher who, after five years without concrete ‘results’, asks for more money?


SOME CONCLUSIONS FROM THE CASES

Productive Interactions help to identify steps in producing socially robust research

Indicators can be developed; there is a data problem, but options are growing thanks to greater awareness, the internet and new databases

Great variety of interaction patterns (from very incidental, personal and informal relations to highly organized and professionalized networks)

Policy conflicts and power differences are important in evaluation

Research organisations have to make serious efforts to gather more robust data on social impact and on research output and outcomes for wider audiences.


4. WHAT IS NECESSARY FOR THE EVALUATION OF SRR?

o A different perspective, non-linear, learning, new ‘peers’, administrators who dare to care

o Involvement of all stakeholders in development of criteria and indicators, acknowledging TD partners, raising commitment

o New type of self evaluation, focus on project as a whole, all participate and debate SWOT

o Serious investment in robust data collection

o Use / acceptance of these methods at all levels of research (institute, national, EU)


SIAMPI APPROACH TO EVALUATION OF TD/SR RESEARCH

1. Mission oriented: establish the research profile and (intermediate) research goals

2. Contextual: analyse the stakeholder context, and engage stakeholders, also in the (self) evaluation process

3. Focus on productive interactions for (proxy) indicators

4. Use indicators not to judge, but to inform

5. TD group writes the self evaluation, incl. a SWOT analysis

6. Build in feedback to all involved

7. The goal of the evaluation is learning and improving instead of accounting


EXAMPLE OF SOCIAL IMPACT OF THREE MEDICAL GUIDELINES FOR GPS [CONTEXTUAL RESPONSE ANALYSIS] © AD PRINS

[Chart: Google searches for three guidelines of the Leiden Dep. of Public Health and General Practice (Dementia, Epicondylitis, Thyroid Function); number of hits (0–40) per source type: medical journals, professional information sites, public or patient information sites, health care providers, professional organizations, libraries, university & research organizations, chats/blogs]
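
A minimal sketch of how such a contextual response analysis could be tallied is shown below; the domain-to-category mapping and the example hits are invented for illustration and do not reproduce Ad Prins’s actual method or data:

```python
from collections import defaultdict
from urllib.parse import urlparse

# Hypothetical mapping from web domains to source categories (assumption).
CATEGORIES = {
    "ntvg.nl": "Medical journals",
    "nhg.org": "Professional organizations",
    "thuisarts.nl": "Public or patient information sites",
    "wikipedia.org": "Public or patient information sites",
}

def categorize(url: str) -> str:
    """Map a search hit to a source category, defaulting to 'Other'."""
    host = urlparse(url).netloc.lower().removeprefix("www.")
    return CATEGORIES.get(host, "Other")

# Invented search hits for one guideline ("Dementia").
hits = [
    "https://www.ntvg.nl/artikel/dementie-richtlijn",
    "https://www.thuisarts.nl/dementie",
    "https://nhg.org/standaarden/dementie",
    "https://example-blog.com/over-dementie",
]

# Count responses per source category, as in the chart above.
counts = defaultdict(int)
for url in hits:
    counts[categorize(url)] += 1

for category, n in sorted(counts.items()):
    print(f"{category}: {n}")
```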

