SciVal & Snowball Metrics
Presented by
Jen Eidelman & Cyrill Walters
2014
Agenda
• What are Snowball Metrics?
• What is SciVal?
• How to access SciVal
• Overview module
• Benchmarking module
• Collaboration module
What are Snowball Metrics?
Snowball Metrics are a set of standard metrics to measure research performance.
For more information: http://www.snowballmetrics.com/
http://www.snowballmetrics.com/wp-content/uploads/flyer-2014.PDF
What is SciVal?
SciVal is a set of integrated modules that enables an institution to make evidence-based strategic decisions.
SciVal consists of three modules:
Overview - Get an overview of the research performance of your institution and others based on output, impact, and collaborations.
Benchmarking – Determine your strengths and weaknesses. Compare your research institution and teams to others based on performance metrics. Model different test scenarios.
Collaboration – Identify and analyze existing and potential collaboration opportunities. Identify suitable collaboration partners. See who others are collaborating with.
SciVal offers quick, easy access to the research performance of 220 nations and 4,600 research institutions worldwide, as well as groups of countries and institutions.
Overview – ready-made, at-a-glance snapshots of any selected entity
Benchmarking – flexibility to create and compare any research groups
Collaboration – identify and analyze existing and potential collaboration opportunities
How to access
Access: www.scival.com
The same login credentials as for ScienceDirect/Scopus can be used. The login page also has links to reset a forgotten password or to create a new login.
Overview module
Summary tab – shows an institution’s disciplinary focus. Also answers the question: How can my institution evaluate the impact of its research portfolio?
Here you can see that UCT has averaged 9.1 citations per publication over a five-year time period.
Publications tab – shows how productive the institution is by using the scholarly output metric.
Answers the question: How can my institution demonstrate research excellence? Look at Outputs in Top Percentiles and at Publications in Top Journal Percentiles.
As you can see in the chart on the left, some 19.6% of UCT’s publications were in the top 10 percentiles of the most cited publications worldwide.
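The "top percentiles" idea amounts to counting an institution's publications against a world citation cutoff. A minimal sketch in Python, with made-up citation counts and a made-up cutoff (not real SciVal data or code):

```python
# Hypothetical sketch (not SciVal's implementation): the share of an
# institution's publications at or above the world's top-10% citation cutoff.

def share_in_top_decile(inst_citations, world_top10_cutoff):
    """Percentage of publications with citations >= the top-10% cutoff."""
    in_top = sum(1 for c in inst_citations if c >= world_top10_cutoff)
    return 100 * in_top / len(inst_citations)

# Illustrative numbers only: assume the world top-10% cutoff is 11 citations.
print(share_in_top_decile([12, 3, 0, 25, 7, 9, 1, 14], 11))  # 37.5
```

In practice SciVal derives the cutoff per field, year, and publication type; the sketch only shows the counting step.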
In the chart below, 21.6% of UCT’s publications were published in the top 10 journal percentiles worldwide, as measured by SNIP (Source Normalized Impact per Paper), a metric of a journal’s citation impact.
Answers the question: Where does the institution have the highest impact? Select Publications, then select by journal category.
Citations tab – highlights the institution’s citation impact.
‘Field weighted citation impact’ adjusts for the differences in citation behaviour across disciplines and puts everything on the same level.
The Field-Weighted Citation Impact is the number of total citations received divided by the total citations expected, based on the global average for the field.
• More than 1.00 means more citations than expected.
• Less than 1.00 means fewer citations than expected.
The world citation impact is 1 by definition. So this shows that over a 5-year period UCT’s citation impact was 1.74: citations were 74% more than expected based on the global average.
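The FWCI ratio described above is a one-line calculation. A minimal sketch in Python; the 870/500 counts are invented purely to reproduce the 1.74 figure, not taken from SciVal:

```python
def fwci(citations_received, citations_expected):
    """Field-Weighted Citation Impact: actual vs. expected citations.
    1.0 equals the world average for the same field, year, and
    publication type; the expected count comes from that global average."""
    return citations_received / citations_expected

# Illustrative counts only: 870 actual citations against 500 expected.
print(fwci(870, 500))  # 1.74 -> 74% more citations than expected
```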
Collaboration tab – gives a high level overview of how an entity collaborates geographically.
Competencies tab – another way to show research strengths (competencies). Answers the question: How can my institution identify its research strengths?
Competencies tab – competencies map provides a view of worldwide research strengths. Bubbles represent research areas in which UCT is a global leader.
A competency shows where an institution has a leading position compared to other institutions in terms of:
• number of publications
• number of highly cited publications
• innovation
• recentness of cited publications
Benchmarking module
Select any desired combination of research entities you wish to benchmark.
Select a year range between 1996 and the current year.
Filter by subject area using 27 top-level and 334 lower-level subject areas based on the Scopus ASJC classification.
SciVal metric guidebook
• http://www.elsevier.com/_data/assets/pdf_file/0006/184749/scival-metrics-guidebook-v1_01-february2014.pdf
Selection of metrics at your disposal
Productivity metrics: Scholarly Output; h-indices (h, g, m)
Citation Impact metrics: Citation Count; Citations per Publication; Cited Publications; h-indices (h, g, m); Field-Weighted Citation Impact; Publications in Top Percentiles; Publications in Top Journal Percentiles; Collaboration Impact (geographical); Academic-Corporate Collaboration Impact
Collaboration metrics: Authorship Count; Number of Citing Countries; Collaboration (geographical); Academic-Corporate Collaboration
Disciplinarity metrics: Journal Count; Journal Category Count
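Of the metrics listed, the h-index is simple enough to illustrate directly: it is the largest h such that at least h publications have at least h citations each. A minimal sketch with illustrative citation counts (not SciVal code):

```python
def h_index(citations):
    """Largest h such that at least h publications have >= h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank  # the rank-th paper still has at least `rank` citations
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers with >= 4 citations each
```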
Combine any set of metrics
The only metrics rule: Triangulate!
• No single metric is perfect. Always give at least 2 metrics to provide insight into your question.
• No data set is going to give you a perfect answer.
• Reinforce your evidence-based conclusions with at least one of, and ideally both, peer review and expert opinion.
Selection of appropriate metrics
• Know your Question!
1. Performance evaluation of an institution, faculty, researcher etc.
2. Demonstration of excellence
3. Scenario modelling – e.g. if modelling recruitment in biochemistry, you might not need to worry about different citation rates between fields, because you are only looking at one field.
“Non-performance variables”?
• Size – If you’re doing an evaluation, this is important! Everything is not equal; use size-normalised metrics and percentages.
• Discipline – Some disciplines show higher values because of their researchers’ behaviour.
• Publication type
• Database coverage
• Manipulation
• Time
Question 1
How does UCT perform relative to the University of Michigan?
Question 2
How does UCT perform relative to Stellenbosch University in Medicine?
Taking the disciplinary focuses into account by using the filter gives me other options for metrics to use, but why would I want to do this?
Field-Weighted Citation Impact
• Suitable for:
• Fair comparison of entities regardless of size, disciplinary profile and publication type
• Avoiding the apparent crash in citation counts in recent years
• Immediate understanding of the extent of brilliance (1 = world average)
• Selection in case of uncertainty
• However…
• You lose the sense of the magnitude of your impact
• People might not like to see low numbers
• Always using FWCI as the default severely restricts the richness of the information/data
• The calculation is complicated, and users will not be able to validate it themselves
Defining a Research Area
SciVal offers the flexibility to create your own Research Areas, representing a field of research defined by you.
Research Areas can represent a strategic priority, an emerging area of science, or any other topic of interest, using the following as building blocks:
• Search terms – search for publication sets using keyword(s)
• Entities – select and combine any of the following:
o Institutions (+ groups)
o Countries (+ groups)
o Journals and journal categories
• Competencies – select and combine competencies of any desired institutions or countries
Defining a Research Area:
1. Select “Define a new Research Area”.
2. Search by keywords.
3. Apply filters to refine the search results.
4. Name it and save.
SciVal Collaboration Module
How can my institution find collaboration partners? International collaborations can increase your impact and visibility, which could lead to more funding opportunities. How can you identify suitable international collaboration partners? Which countries should we focus on? And which institutions are active in which disciplines?
There are 207 institutions in Germany that are active in biotechnology but are not yet collaborating with UCT in this field. Click on the number.
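A "not yet collaborating" count like the 207 above boils down to a set difference: institutions active in the field, minus those already co-publishing with you. A toy sketch with invented institution names:

```python
# Toy illustration of the set logic behind "potential collaborators".
# Institution names are made up; SciVal computes this from Scopus data.
active_in_biotech = {"Inst A", "Inst B", "Inst C", "Inst D"}
current_collaborators = {"Inst B"}

potential_partners = active_in_biotech - current_collaborators
print(len(potential_partners))     # 3
print(sorted(potential_partners))  # ['Inst A', 'Inst C', 'Inst D']
```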
Thank you!