Slide 1
The basket of metrics in research performance
Dr Lisa Colledge, Director of Research Metrics
21 January 2017
Slide 2
Spoiler
Are research metrics alone enough to fully describe performance? No
Is expert opinion alone enough to fully describe performance? No
Can performance be fully described? Yes
Slide 3
Two Golden Rules for using research metrics
1. Always use both qualitative and quantitative input into your decisions.
2. Always use more than one research metric as the quantitative input.
When used with common sense, research metrics together with qualitative input give a complete, balanced, multi-dimensional view of performance.
Slide 4
The basket of metrics is diverse…

Metric theme           Metric sub-theme
A. Funding             Awards
B. Outputs             Productivity of research outputs; Visibility of communication channels
C. Research Impact     Research influence; Knowledge transfer
D. Engagement          Academic network; Non-academic network; Expertise transfer
E. Societal Impact     Societal Impact
F. Qualitative input

Number of metrics shown is indicative only. “S” denotes Snowball Metrics (www.snowballmetrics.com).
Slide 5
… and the diverse metrics are available for all entities

Entities:
- Outputs, e.g. article, research data, blog, monograph
- Custom set of outputs, e.g. funders’ output, articles I’ve reviewed
- Researcher or group
- Institution or group
- Subject area
- Serial, e.g. journal, proceedings
- Portfolio, e.g. publisher’s title list
- Country or group

Metric themes: A. Funding; B. Outputs; C. Research Impact; D. Engagement; E. Societal Impact; F. Qualitative input
Slide 6
Users in different countries select different metrics

Metric                                    World  Australia  Canada  China  Germany  Japan  United Kingdom  United States
Field-Weighted Citation Impact              1        1         1      3       2       4          3               1
Outputs in Top Percentiles                  2        2         3      1       4       1          1               6
Publications in Top Journal Percentiles     3        4         2      2       6       2          2               5
Collaboration                               4        6         6      5       1       3          5               7
Citations per Publication                   5        3         7      6       3       5          4               3
Citation Count                              6        5         5      4       8       6          6               2
h-indices                                   7        7         4      8       7       7          7               4

Usage of metrics available in SciVal’s Benchmarking module from 11 March 2014 to 28 June 2015. A partial list of the metrics available at that time is shown, focusing on the most frequently used. Scholarly Output is excluded since this is the default. Note that recently added metrics based on e.g. media mentions and awards data were not available at this time and so are not represented in this analysis.
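Some of the metrics ranked above are simple to compute from raw citation counts. As a minimal sketch, the h-index is the largest h such that h of a researcher's publications have at least h citations each (the citation counts below are made up for illustration):

```python
def h_index(citations):
    """Largest h such that h publications each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this publication still supports a larger h
        else:
            break
    return h

# Five publications with 10, 8, 5, 4, 3 citations: four of them have >= 4 citations.
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

Metrics such as Field-Weighted Citation Impact are more involved, since they normalize citations against the expected count for publications of the same field, age, and type.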
Slide 7
Journals remain an important entity in the basket

CiteScore (new from Scopus) vs Impact Factor:
- A (numerator): citations received in 2015. CiteScore counts citations to documents from the previous 3 years (2012–2014); the Impact Factor counts citations to documents from the previous 2 (or 5) years.
- B (denominator): CiteScore uses all documents indexed in Scopus, the same document set as A; the Impact Factor uses only citable items (articles and reviews), a different set from A (not editorials or letters-to-the-editor).

CiteScore 2015 = A / B = (citations received in 2015 to documents from 2012–2014) / (documents from 2012–2014)
Impact Factor 2015 = A / B = (citations received in 2015 to documents from 2013–2014) / (citable items from 2013–2014)
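The two ratios can be sketched numerically. All counts below are invented for illustration; this is not a Scopus or Clarivate API, just the arithmetic of the two definitions:

```python
# Hypothetical counts for one journal, for the 2015 metric year.
citations_2015_to_2012_2014 = 1200  # citations received in 2015 to docs from 2012-2014
docs_2012_2014 = 400                # all documents indexed in that 3-year window

citations_2015_to_2013_2014 = 900   # citations received in 2015 to docs from 2013-2014
citable_items_2013_2014 = 250       # articles and reviews only, in that 2-year window

citescore_2015 = citations_2015_to_2012_2014 / docs_2012_2014
impact_factor_2015 = citations_2015_to_2013_2014 / citable_items_2013_2014

print(citescore_2015)      # -> 3.0
print(impact_factor_2015)  # -> 3.6
```

Note how the same journal can score quite differently on the two metrics: the Impact Factor's smaller denominator (citable items only) tends to raise the ratio relative to CiteScore's all-documents denominator.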
Slide 8
CiteScore metrics – free via journalmetrics.scopus.com
Slide 9
Users in different countries select different metrics (recap of Slide 6)
Slide 10
How should the basket of metrics be used?
1. Define the question clearly, so that you can
2. Select appropriate metrics for the particular situation, and
3. Calculate metrics for the entities you are investigating, and
4. Calculate them for suitable peers, so you can benchmark performance.
Slide 11
Evaluation and profiling both draw from the same basket

Evaluation (top-down):
- A core set of metrics (KPIs), determined per evaluator
- Supplementary sets per discipline and entity-type
- Should ideally be openly communicated

Profiling (bottom-up):
- Draw from the core set, as relevant to the evaluator
- Supplement with tailored metrics, customized per individual entity
Slide 12
The basket of metrics: summary
- Golden Rules: both expert opinion and research metrics are needed to fully describe research performance
- The basket of metrics is a “menu” that contains diverse metrics for all entities
- Metrics relevant to your question should be selected from the basket, for both the entities you are investigating and suitable peers
- The basket of metrics enables both evaluation and profiling use cases
Slide 13
www.elsevier.com/research-intelligence
Thank you