
How to measure the impact of research?

Anne-Catherine Rota, Research Intelligence, Elsevier

7 December 2015

Elsevier, Partner for Research

Access to the full text of 2,500 journals and 11,000 eBooks

The largest bibliographic database: 22,000 active journals


Analyze research: what for?

Monitor the trends in my own research area

Position myself / my team / my institution

Evaluate

Get more funding

Be more visible


The French research landscape: EU funding

H2020: Funding allocated per country (4,320 contracts signed as of July 15, 2015)

FP7: Funding allocated per country (2007-2013)

Contribution to the EU budget (2011)


Scopus, the output database

22,000 journals from 5,000 publishers

o A well-balanced coverage of research fields

o A well-balanced geographic coverage

o An independent Content Selection & Advisory Board

o 56 million articles

o 6.9 million conference papers

o Over 90,000 eBooks

o "Articles in Press" from 5,000 titles

o More than 3,473 Open Access journals

Elsevier journals account for only 10% of the journals indexed in Scopus.

[Chart: output of the top 100 institutions, Scopus vs. non-Scopus coverage (88% / 12%)]


Among Scopus users

2015 Best Chinese University Ranking report (evaluated by the Shanghai Ranking Consultancy)

International rankings

Public policies, research assessments and funding agencies

Canadian Institutes of Health Research

U.S. National Science Foundation (NSF) Selects Elsevier Data to Support its Science and Engineering Indicators http://www.elsevier.com/about/press-releases/science-and-technology/u.s.-national-science-foundation-nsf-selects-elsevier-data-to-support-its-science-and-engineering-indicators

Research metrics: best practice in their use in merit systems


Research metrics can be used inappropriately

- One metric used in isolation often leads to undesirable outcomes

Research metrics can be manipulated

- Normal research practice can be abused if you really try

This is also true of peer review. How can research metrics be used with similar trust and benefit?

All merit systems should be based on multiple types of measurement:

• Peer review

• Expert opinion

• Narrative

• Research Metrics

Research Metrics input should always rely on at least 2 metrics

- Reduces opportunities to game and can drive desirable outcomes

Research metrics indicate many different ways of being good


Research Metrics input should always rely on at least 2 metrics that ideally reflect distinct ways of performing excellently

Within the research area of "sitting", output, citations, and views show different ways of being excellent

Most productive / Most cited / Most viewed

Viewing activity is complementary to being cited:

• A recent publication is viewed even though it is not yet cited

• An erratum is not (expected to be) cited but is actively looked at

• A publication of global interest is viewed around the world
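To make the three "ways of being excellent" concrete, here is a minimal sketch in Python with entirely hypothetical authors and counts, showing how output, citation, and view counts can each crown a different leader:

    # Minimal sketch: three metrics, three different leaders.
    # All names and counts below are hypothetical.
    authors = [
        {"name": "Author A", "outputs": 40, "citations": 120, "views": 900},
        {"name": "Author B", "outputs": 12, "citations": 310, "views": 400},
        {"name": "Author C", "outputs": 8, "citations": 45, "views": 2100},
    ]

    for metric in ("outputs", "citations", "views"):
        leader = max(authors, key=lambda a: a[metric])
        print(f"Most {metric}: {leader['name']}")
    # Most outputs: Author A (most productive)
    # Most citations: Author B (most cited)
    # Most views: Author C (e.g. a recent paper, viewed but not yet cited)

Relying on any single one of these rankings would reward only one profile of excellence, which is exactly why at least two metrics are recommended.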

Snowball Metrics in a nutshell


• Bottom-up initiative: universities endorse metrics valuable to them to generate a strategic dashboard

• Aim: enable benchmarking by driving quality and efficiency across higher education's research and enterprise activities

• Draw on all available data: university, commercial, public

• Build on existing definitions and standards where sensible

• Output: free recipes (methods) that can be used by anyone; see the sketch below for an example

• www.snowballmetrics.com for more info and the Recipe Book
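As an illustration of what a "recipe" looks like in practice, one Snowball-style indicator is Academic-Corporate Collaboration, usually defined as the share of publications co-authored by academic and corporate affiliations. A minimal sketch, where the publication records and their "sectors" field are hypothetical:

    # Sketch of one recipe-style metric: Academic-Corporate Collaboration,
    # the share of publications with both academic and corporate affiliations.
    # The records and the "sectors" field are hypothetical.
    publications = [
        {"title": "Paper 1", "sectors": {"academic"}},
        {"title": "Paper 2", "sectors": {"academic", "corporate"}},
        {"title": "Paper 3", "sectors": {"academic", "corporate"}},
        {"title": "Paper 4", "sectors": {"academic"}},
    ]

    collaborative = [p for p in publications
                     if {"academic", "corporate"} <= p["sectors"]]
    share = len(collaborative) / len(publications)
    print(f"Academic-Corporate Collaboration: {share:.0%}")  # 50%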

UK group

Societal-Economic Impact


Recipes: partner with businesses

• Contract Research

• Spin-Off-Related Finances

• IP Volume / Licenses

• Academic-Corporate Collaboration

Altmetrics: esteem and impact


Altmetrics: Esteem

• Scholarly Activity: posts in scholarly tools

• Scholarly Commentary: comments in scholarly tools

Altmetrics: Impact

• Social Activity: social media posts

• Mass Media: references from newspapers etc.
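A minimal sketch of this two-group classification, with the grouping taken from the slide and the event stream invented for illustration:

    # Sketch: bucket altmetric events into the Esteem and Impact groups above.
    # The mapping follows the slide; the events themselves are hypothetical.
    CATEGORY_GROUP = {
        "scholarly_activity": "esteem",    # posts in scholarly tools
        "scholarly_commentary": "esteem",  # comments in scholarly tools
        "social_activity": "impact",       # social media posts
        "mass_media": "impact",            # references from newspapers etc.
    }

    events = ["social_activity", "scholarly_activity", "mass_media",
              "social_activity", "scholarly_commentary"]

    counts = {"esteem": 0, "impact": 0}
    for event in events:
        counts[CATEGORY_GROUP[event]] += 1
    print(counts)  # {'esteem': 2, 'impact': 3}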

Altmetrics in Scopus



Societal-Economic Impact in SciVal: Patent Article Citations


Societal-economic impact


Partner with businesses

• Academic-Corporate Collaboration

• Business Consultancy Activities

• Volume & financial benefits of IP &

spin-outs

Esteem

• Scholarly Activity

• Scholarly Commentary

Impact

• Social Activity

• Mass Media

• Public Engagement

Brain and Neuroscience research is highly collaborative


[Chart: International collaboration (% of total output)]
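The quantity on that axis is conventionally computed as the share of publications whose author affiliations span more than one country. A minimal sketch, with hypothetical records:

    # Sketch: international collaboration as a share of total output,
    # counting publications whose affiliations span 2+ countries.
    # The records below are hypothetical.
    papers = [
        {"countries": {"FR"}},
        {"countries": {"FR", "DE"}},
        {"countries": {"FR", "US", "GB"}},
        {"countries": {"FR"}},
    ]

    international = sum(1 for p in papers if len(p["countries"]) > 1)
    print(f"International collaboration: {international / len(papers):.0%}")  # 50%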

Brain and Neuroscience Researchers: cross-disciplinary mobility


Measure interdisciplinarity
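The slide does not name a specific indicator; one widely cited option is the Rao-Stirling diversity index, D = sum over pairs i != j of p_i * p_j * d_ij, where p_i is the share of a unit's output in discipline i and d_ij is a dissimilarity between disciplines. A minimal sketch under that assumption, with invented shares and distances:

    # Sketch: Rao-Stirling diversity, one common interdisciplinarity measure
    # (not necessarily the one behind this slide). Shares and distances
    # below are hypothetical.
    shares = {"neuroscience": 0.5, "psychology": 0.3, "computer_science": 0.2}
    distance = {  # dissimilarity between discipline pairs, in [0, 1]
        ("neuroscience", "psychology"): 0.3,
        ("neuroscience", "computer_science"): 0.7,
        ("psychology", "computer_science"): 0.8,
    }

    def rao_stirling(p, d):
        # Sum over unordered pairs, doubled to cover both orders (i != j).
        return 2 * sum(p[i] * p[j] * d[(i, j)] for (i, j) in d)

    print(f"Rao-Stirling diversity: {rao_stirling(shares, distance):.3f}")  # 0.326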


Four ingredients to gather in order to analyze research:

1. Data

2. IT system

3. Metrics

4. Tools

Key requirements: quality & reliability, transparency, granularity, ad hoc services & APIs, immediate access (an API example follows below)
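On the "ad hoc services & APIs" point: Scopus data can be pulled programmatically through the Elsevier Developer Portal. A minimal sketch against the Scopus Search API, assuming you have registered for an API key; the query string is only an example:

    # Sketch: querying the Scopus Search API (key from https://dev.elsevier.com).
    import requests

    API_KEY = "YOUR_API_KEY"  # placeholder: obtain from the Elsevier Developer Portal
    resp = requests.get(
        "https://api.elsevier.com/content/search/scopus",
        params={"query": "AFFIL(ponts) AND PUBYEAR > 2014", "count": 5},
        headers={"X-ELS-APIKey": API_KEY, "Accept": "application/json"},
    )
    resp.raise_for_status()

    for entry in resp.json()["search-results"]["entry"]:
        print(entry.get("dc:title"), "|", entry.get("citedby-count"), "citations")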


Documents under the initial ENPC profile: 2,007

Scopus-optimized identification of French affiliations


Research metrics: best practice in their supply

• Offer a basket of metrics that can be generated in an automated and scalable way, so they are available for all peers

• Do not make decisions on behalf of users: provide options and enable users to decide their own approach

• Absolute transparency, so you can make your own judgments about whether something is appropriate or has been manipulated

• No methodological black boxes, no exceptions

• Transparency about underlying data contents and structure

• Constructed in a reasoned way

Absolute transparency from your metrics supplier is essential


http://bit.ly/scivalmetricsguidebook

http://bit.ly/usageguidebook

http://bit.ly/hefceresponse


Thank you for your time and attention