Scopus as a bibliometrics tool:
CiteScore metrics, more metrics
& the importance of ranking
May 2017
Genevieve Musasa
Customer Consultant
for Africa
for ScienceDirect, Scopus
& Mendeley
Mounir El Bedraoui
Account Manager
French-Speaking Countries
in Africa
Sherif Ghazy
Account Manager
Sub-Sahara Africa
Karen Metcalf
Account Manager
South Africa
Agenda
I. Introduction
1. Elsevier is empowering knowledge
2. How to use research metrics appropriately
and why Scopus offers a basket of metrics
II. Journal metrics including CiteScore™ metrics
1. Why add CiteScore metrics to the basket?
2. What are CiteScore metrics?
3. How will CiteScore metrics be offered?
4. How does CiteScore compare to the Impact Factor?
5. Other journal metrics: SJR & SNIP
III. Article level metrics
Citation count, Field-Weighted Citation Impact, Scholarly
activity online, Scholarly commentary online, Social activity
online, Media mentions
IV. Author profile & its algorithm
1. Scopus – ORCID integration
2. Author level metrics: Document count, h-index, Monitoring
your article via the article metrics
V. The importance of rankings
Dear valued scientific and academic community,
here are a few words about myself:
I am one of Elsevier's Customer Consultants for Africa. I am based in the
Netherlands, at Elsevier's headquarters. I am your dedicated expert for three
Elsevier solutions: ScienceDirect, Scopus and Mendeley.
My aim is to help you create more value from these research solutions and
much more. Among other things, I am responsible for customer engagement across
nearly all of Africa and for conducting author workshops and trainings on these
three solutions.
I have been working for Elsevier for more than six years, nearly three of them in
this role. Always dedicated and passionate, I won one of the Elsevier Worldwide
Customer Consultant Awards in 2015 and the Elsevier Customer Focus Value
Award for emerging markets in 2014.
Have a fruitful experience in reading this presentation!
I. Introduction
1. Elsevier is empowering knowledge
Our mission: Lead the way in science, technology and health
Galileo’s last and greatest work,
published in 1638 by Elzevir,
Discorsi e Dimostrazioni
Matematiche
Louis Pasteur
(Chemistry)
Alexander Fleming
(Medicine)
Albert Einstein (Physics)
Craig C Mello
(Medicine)
John C. Mather
(Physics)
Francoise Barre-Sinoussi
(Medicine)
Shinya Yamanaka (Medicine)
Marie Curie (Physics,
Chemistry)
Some of the Nobel Prize winners published with Elsevier:
Since the year 2000, 154 of the
155 Nobel laureates in science
and economics have published
in Elsevier journals. That’s
more than 99%
TRADITION | EXCELLENCE
437 Years | 137 Years
We commemorate the founding
of the House of Elzevier in
1580 and celebrate the
establishment of the Elsevier
company in 1880.
YEARS OF PUBLISHING
Elsevier, the modern publishing company, was founded in 1880. It has evolved from a
small Dutch publishing house devoted to classical scholarship into an international
multimedia publishing company.
Today, Elsevier is a world-leading provider of information solutions that enhance the
performance of science, health, and technology professionals, empowering them to make
better decisions, deliver better care, and sometimes make groundbreaking discoveries that
advance the boundaries of knowledge and human progress.
Elsevier provides workflow solutions and digital tools in the areas of strategic
research management, R&D performance, clinical decision support and professional
education. Elsevier publishes over 2,500 digitized journals, including The Lancet and Cell,
more than 35,000 book titles, and many iconic reference works, including Gray's Anatomy.
Learn more about our mission: “Leading the way in advancing science, technology and health”
Who is Elsevier?
Novel solutions that enhance research:
Physical Library → Paperless Office (Databases) → Integration (Workflow Tools) → Content, Technology and Analytics → Improved Outcomes
This is Elsevier
Professionals in science, technology, engineering and health have more information at their
disposal today than any time in history; yet understanding, discovery and knowledge are often
beyond reach. At Elsevier, we create the tools that make sense of information, to help make
better decisions, deliver better healthcare, save lives and make breakthrough discoveries
that advance science and society.
That means sorting through the overflow of information and choices to reveal knowledge that
helps to make critical decisions. We do this by applying smart technology to complex
problems, drawing from our unique foundation of authoritative information and structured data.
We apply advanced technology and analytics to filter, extract and learn from vast data sets,
social networks and collaboration platforms. We provide insight into global research
productivity, helping researchers find funding and collaborate with colleagues. We provide
the right clinical answers to physicians and nurses, shorten the path to actionable data for R&D
professionals, and build adaptive learning technologies to help students learn more effectively.
Quite simply, Elsevier is Empowering Knowledge.
Source: Read “This is Elsevier” brochure on www.elsevier.com
Watch the video "This is Elsevier"
KNOWLEDGE:
Facts, information, and skills acquired through experience
or education; the theoretical or practical understanding of
a subject
The sum of what is known
= comprehension, mastery, command,…
INFORMATION:
Facts provided or learned about something or someone
What is conveyed or represented by a particular
arrangement or sequence of things
= details, figures, statistics, data
https://en.oxforddictionaries.com
Elsevier is Empowering Knowledge
Information → Elsevier research productivity tools → Knowledge
This is Elsevier
Digital | Data | Trusted | Global | Change | Leader
2. How to use research metrics
appropriately and why Scopus offers
a basket of metrics
When used correctly, research metrics together with qualitative input
give a balanced, multi-dimensional view for decision-making
Two Golden Rules for using research metrics
Always use both qualitative
and quantitative input into
your decisions
Always use more than one
research metric as the
quantitative input
A basket of metrics for research excellence
Theme | Sub-theme
A. Funding | Awards
B. Outputs | Productivity of research outputs; Visibility of communication channels
C. Research Impact | Research influence; Knowledge transfer
D. Engagement | Academic network; Non-academic network; Expertise transfer
E. Societal Impact | Societal Impact
F. | Qualitative input
Available for articles, researchers, journals, institutions, subject fields…
Example: the importance of using multiple metrics from
the basket to compensate for weaknesses

Field-Weighted Citation Impact = 2.53
+ Compensates for differences in field, type and age
+ Meaningful benchmark is “built in” – 1 is average for a subject area
× People may not like small numbers
× Complicated; difficult to validate
× No idea of magnitude: how many citations does it represent?

paired with

Citations per Publication = 27.8
+ Large number
+ Simple, easy to validate
+ Communicates magnitude of activity
× Affected by differences in field, type and age
× Meaningless without additional benchmarking
A basket of metrics to facilitate the use of metrics
Entities: Articles; Custom document set; Journals, Conferences, Books; Portfolio; Author, Editor, Reviewer; Institution or group; Subject area
A basket of metrics to facilitate the use of metrics
Entities: Articles; Custom document set; Journals, Conferences, Books; Portfolio; Author, Editor, Reviewer; Institution or group; Subject area
Types of metrics: Community – Contributions – Consumption – Scholarly Impact – Social Impact, covering Outputs, Funding awards, Editor, Board, Authors, Usage, Citations, Audience, Patents, Scholarly Activity, Academic Opinion, Social Activity and Media Activity
New metric to be added to the basket: CiteScore
Entities: Articles; Custom document set; Journals, Conferences, Books; Portfolio; Author, Editor, Reviewer; Institution or group; Subject area
Types of metrics: Community – Contributions – Consumption – Scholarly Impact – Social Impact
Individual metrics include: Scholarly Output; Research data output; Conference output; Citation counts; Usage counts; SNIP, SJR, IF; CiteScore; h-, g-, m- indices; Audience; Scholarly Discussion; Peer review metrics; Prizes and awards; Social media mentions; Media mentions; Medical guidelines; Influence on policies; Mendeley counts; Funding sources; Patent metrics; Geographical spread; Collaboration network; Sector distribution
II. Journal metrics including
CiteScore™ metrics
1. Why add CiteScore metrics
to the basket?
Journal metrics – still an important part of the basket
• Journal metrics are still important complements to new and alternative metrics
• Many titles are missing transparent and replicable metrics that are easy to access
Q: How often do you use journal metrics?
A. Daily/Weekly: 25% | B. Monthly: 21% | C. Quarterly: 25% | D. Yearly: 7% | E. Not at all: 21%
[Bar chart of survey responses]
CiteScore is a simple metric for all Scopus journals

CiteScore: A = citations to 3 years of documents; B = all documents indexed in Scopus, the same set as A
Impact Factor: A = citations to 2 or 5 years of documents; B = only citable items (articles and reviews), a different set from A

CiteScore 2015 value = A ÷ B, where A = citations received in 2015 to documents published in 2012–2014, and B = the number of documents published in 2012–2014.

Note: at launch, all titles in the May 2016 title list that have some documents indexed in 2016 will have CiteScore metrics.
CITESCORE
https://libraryconnect.elsevier.com/metrics
citations in a year to documents published in previous 3 years ÷ # of documents in previous 3 years
This comprehensive, current and open metric for journal
citation impact (introduced in December 2016) is available
in a free layer of Scopus.com. It includes a yearly release
and monthly CiteScore Tracker updates. Find CiteScore
metrics for journals, conference proceedings, book series
and trade journals at https://www.scopus.com/sources
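The CiteScore definition above reduces to a single ratio; a minimal sketch in Python (the journal, year window and counts below are hypothetical illustrations, not real Scopus data):

```python
def citescore(year, citations_in_year, docs_by_year, window=3):
    """CiteScore for `year`: citations received in `year` to documents
    published in the previous `window` years, divided by the number of
    documents published in those years."""
    docs = sum(docs_by_year.get(y, 0) for y in range(year - window, year))
    return citations_in_year / docs if docs else 0.0

# Hypothetical journal: 100 documents per year in 2012-2014,
# receiving 750 citations during 2015 -> CiteScore 2015 = 2.5
docs = {2012: 100, 2013: 100, 2014: 100}
print(citescore(2015, 750, docs))  # 2.5
```

Because both the numerator and the denominator are counted over the same set of Scopus-indexed documents, anyone with Scopus access can reproduce the value.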
average # of weighted citations received in a year ÷ # of documents published in previous 3 years
SCIMAGO JOURNAL RANK (SJR)
Citations are weighted – worth more or less – depending
on the source they come from. The subject field, quality
and reputation of the journal have a direct effect on the
value of a citation. Can be applied to journals, book
series and conference proceedings.
https://libraryconnect.elsevier.com/metrics
Calculated by SCImago Lab (http://www.scimagojr.com)
based on Scopus data.
journal’s citation count per paper ÷ citation potential in its subject field
SOURCE NORMALIZED IMPACT PER PAPER (SNIP)
The impact of a single citation will have a higher
value in subject areas where citations are less likely,
and vice versa. Stability intervals indicate the reliability
of the score. Smaller journals tend to have wider
stability intervals than larger journals.
https://libraryconnect.elsevier.com/metrics
Calculated by CWTS (http://www.journalindicators.com)
based on Scopus data.
Filling the gap in the Scopus basket of journal metrics: SNIP and SJR paired with CiteScore and associated metrics

SNIP and SJR:
+ Compensate for differences in field, type and age
+ Meaningful benchmark is “built in” – 1 is average for a subject area
× People may not like small numbers
× Complicated; difficult to validate
× No idea of magnitude: how many citations does it represent?

CiteScore and associated metrics:
+ Large number
+ Simple, easy to validate
+ Communicates magnitude of activity
× Affected by differences in field, type and age
× Meaningless without additional benchmarking
2. What are CiteScore metrics?
CiteScore is one of a family of related metrics
Each metric provides a complementary measure of performance

Metric | Measures | Validation in Scopus? | Size-normalized? | Subject field-normalized? | Communicates magnitude? | Update frequency
CiteScore | Citations per document | Yes | Yes | No | Yes | Annually, plus monthly CiteScore Tracker metrics
CiteScore Percentile | Relative position within subject field based on CiteScore | Yes | Yes | Yes | No | Annually, plus monthly CiteScore Tracker metrics
Citation Count | Raw impact of a journal on the research community | Yes | No | No | Yes | Annually, plus monthly CiteScore Tracker metrics
Document Count | Raw scale of a title within the research community | Yes | No | No | Yes | Annually, plus monthly CiteScore Tracker metrics
% Cited | Consistency with which a title's contents are reliably cited | Yes | Yes | No | No | Annually, plus monthly CiteScore Tracker metrics
SNIP | Relative citations per document | No | Yes | Yes | No | Annually
SJR | Prestige of citing sources | No | Yes | Yes | No | Annually
3. How will the CiteScore
metrics be offered?
Journalmetrics.scopus.com website
Static values 2011-2015 for reporting, showcasing and exporting
Scopus.com: transparency, trends, and tracking current performance
1. CiteScore tab
Scopus.com: transparency, trends, and tracking current performance
2. CiteScore rank and trend tab
Scopus.com: transparency, trends, and tracking current performance
3. Scopus content coverage tab
Institutions monitor their researchers’ overall output
CiteScore metrics on Elsevier.com
4. How does CiteScore compare
to the Impact Factor?
JOURNAL IMPACT FACTOR
citations in a year to documents published in previous 2 years ÷ # of citable items in previous 2 years
Based on Web of Science data, this metric is updated once a year and traditionally released in June following the year of coverage as part of the Journal Citation Reports®. JCR also includes a Five-Year Impact Factor.
https://libraryconnect.elsevier.com/metrics
Calculate the IF in Scopus
British Journal of Nutrition: IF 3.302
1. Go to advanced search in Scopus: SRCTITLE(xxx)
2. Limit your search to 2010 + 2011; the number of documents published in 2010–2011 = B
3. Select ALL titles and "view citation overview"
4. Look up the total number of citations received in 2012: A
5. Divide A/B to obtain the Impact Factor
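The five steps above reduce to a single division. A minimal sketch in Python (the counts below are hypothetical, chosen only to reproduce an IF of 3.302 in the style of the example):

```python
def impact_factor(citations_in_year, citable_items_prev_2_years):
    """2-year Impact Factor: A / B, where A = citations received in year Y
    to items published in years Y-1 and Y-2, and B = the number of citable
    items published in those two years."""
    return citations_in_year / citable_items_prev_2_years

# Hypothetical figures following the steps above:
# B = 1000 documents published in 2010+2011, A = 3302 citations to them in 2012.
print(impact_factor(3302, 1000))  # 3.302
```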
Advantages of CiteScore metrics
Comprehensive:
• Based on Scopus, the world's broadest abstract and citation database
• CiteScore metrics will be available for all serial titles, not just journals
• CiteScore metrics could be calculated for portfolios
Transparent:
• CiteScore metrics will be available for free
• CiteScore metrics are easy to calculate for yourself
• The underlying database is available for you to interrogate
Current:
• CiteScore Tracker is updated monthly
• New titles will have CiteScore metrics the year after they are indexed in Scopus
CiteScore 2015 correlates 75% with the Impact Factor
[Scatter plot: 2015 Impact Factor vs. 2015 CiteScore, R² = 0.7524]
Journals with CiteScore cover all levels of performance
[Bar chart: number of journals at each CiteScore percentile (1–99, high to low), comparing all CiteScore percentiles, JIF percentiles and unique CiteScore percentiles]
• 22,256 titles have CiteScore 2015
• 22,620 titles have CiteScore Tracker 2016
Comparison of CiteScore™, CiteScore™ Tracker and Impact Factor
Desirable characteristics (the slide marks which of CiteScore, CiteScore Tracker and the Impact Factor satisfy each):
Replicates strong characteristics:
• Metric measures citations per document
• Simple method
• Annual snapshot for reporting purposes
Improved methodology:
• Document type consistency (numerator and denominator)
• Fair compromise for all fields – 3-year citation window
• Derivative metric addresses disciplinary differences
• Ongoing inclusion of error corrections
Comprehensive:
• Available for all serials indexed (not only journals)
Current:
• New titles have the metric the next calendar year
• Tracking view for verification and decision making
• Metric is current – updated monthly
Transparent:
• It's calculated from the same database I use
• Metric and derivative metrics are free
• I can use a free widget on my webpage
• Journal-level evaluation functionality is free
• Underlying database available to verify calculation
5. Other journal metrics:
SJR & SNIP
On top of CiteScore, two other metrics to compare journals, SJR & SNIP:
SCImago Journal Rank
SJR measures the prestige or influence of a scientific journal.
SJR considers not only the raw number of citations received by a journal, but also the importance or influence of the source of those citations.
SJR is a combination of the quantity & quality of the citations received.
Source Normalized Impact per Paper
SNIP measures the contextual citation impact of a journal by normalizing citation values.
SNIP takes a research field's citation frequency and the database's coverage of that field into account.
It avoids delimitation and counters subject differences to balance the scales.
SNIP shows differences due to journal quality, not citation behavior.
On top of CiteScore metrics: SJR & SNIP to compare journals
• Elsevier adopted these 2 metrics, which counter some of the limitations of the Impact Factor:
• Source Normalized Impact per Paper (SNIP)
• SCImago Journal Rank (SJR)
• The corrections entail
1. Normalising across subjects (SNIP)
2. Weighting according to the citing journal (SJR)
Both impact metrics are based on methodologies developed by external bibliometricians and use Scopus as the data source.
More information is available at http://www.journalmetrics.com/
A. Journal metrics
Via the Scopus sources list
Journal metrics in Scopus Sources: transparency, trends, and tracking current performance
B. Journal metrics
via the “compare journal” feature
The "Compare journals" feature: what is it?
It gives users a comparative overview of the journal landscape, showing how titles in a given field are performing relative to each other.
The objective data is presented in an easy, comprehensive graphical format comparing citations of up to 10 journals, chosen from over 21,000 peer-reviewed journals, from today all the way back to 1996.
Data is updated bi-monthly to ensure currency.
Its value for Administrators/Librarians:
Identify journals and view their details and performance over time, ensuring you are investing in the most influential and relevant journals.
SNIP and SJR can also support your advisory role with your faculty, helping them identify the most impactful journals even in niche areas.
Its value for Researchers:
Search for journals in a specific field, identify influential journals and who publishes them.
Decide where to publish and get the best visibility for your work.
Journal metrics via the “compare journal” feature
Journal metrics in Scopus via the "compare journals" feature
CiteScore measures average citations received per document in the serial.
SJR is a prestige metric and weights citations according to the status of the citing journal.
SNIP measures normalized impact per paper across subject fields.
“Using the Impact Factor alone to judge a journal is like using weight alone to judge
a person’s health.”
Source: The Joint Committee on Quantitative Assessment of Research: “Citation
Statistics”, a report from the International Mathematical Union
III. Article level metrics
Article level metrics in Scopus
• Citations
• Scholarly Activity (Mendeley readers + CiteULike saves)
• Scholarly Commentary
- Provided by Altmetric.com
- Measures scholarly conversation in blogs, post-publication peer-review sites, Wikipedia
• Mass Media
- Provided by Altmetric.com, based on coverage in Reuters and National Public Radio
• Social Activity
- Provided by Altmetric.com
- Number of times an article has stimulated social media comment
- Current sources covered: Twitter, Facebook, Google+, Reddit, Pinterest
# of citations accrued since publication
CITATION COUNT
A simple measure of attention for a particular article,
journal or researcher. As with all citation-based measures,
it is important to be aware of citation practices. The paper
“Effective Strategies for Increasing Citation Frequency”
(http://papers.ssrn.com/sol3/papers.cfm?abstract_id
=2344585) lists 33 different ways to increase citations.
https://libraryconnect.elsevier.com/metrics
# of citations received by a document ÷ expected # of citations for similar documents
FIELD-WEIGHTED CITATION IMPACT (FWCI)
Similar documents are ones in the same discipline,
of the same type (e.g., article, letter, review) and of the
same age. An FWCI of 1 means that the output performs
just as expected against the global average. More than
1 means that the output is more cited than expected
according to the global average; for example,
1.48 means 48% more cited than expected.
https://libraryconnect.elsevier.com/metrics
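The FWCI ratio above can be computed directly; a minimal sketch (the citation counts are hypothetical, chosen to reproduce the 1.48 example):

```python
def fwci(citations, expected_citations):
    """Field-Weighted Citation Impact: citations received by a document
    divided by the citations expected for similar documents (same
    discipline, document type and age)."""
    return citations / expected_citations

# A paper cited 37 times where similar papers average 25 citations:
# FWCI = 1.48, i.e. 48% more cited than the global average.
print(fwci(37, 25))  # 1.48
```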
# of users who added an article into their personal scholarly collaboration network library
SCHOLARLY ACTIVITY ONLINE
The website How Can I Share It? links to publisher
sharing policies, voluntary principles for article sharing
on scholarly collaboration networks, and places to share
that endorse these principles, including Mendeley, figshare,
SSRN and others. http://www.howcanishareit.com
https://libraryconnect.elsevier.com/metrics
# of mentions in scientific blogs and/or academic websites
SCHOLARLY COMMENTARY ONLINE
Investigating beyond the count to actual mentions by
scholars could uncover possible future research
collaborators or opportunities to add to the promotion
and tenure portfolio. These mentions can be found in
the Scopus Article Metrics module and within free and
subscription altmetric tools and services.
https://libraryconnect.elsevier.com/metrics
SOCIAL ACTIVITY ONLINE
https://libraryconnect.elsevier.com/metrics
# of mentions on micro-blogging sites
Micro-blogging sites may include Twitter, Facebook,
Google+ and others. Reporting on this attention is
becoming more common in academic CVs as a way
to supplement traditional citation-based metrics,
which may take years to accumulate. They may also
be open to gaming (http://www.altmetric.com/blog
gaming-altmetrics).
# of mentions in mass or popular media
MEDIA MENTIONS
https://libraryconnect.elsevier.com/metrics
Media mentions are valued indicators of social impact
as they often highlight the potential impact of the
research on society. Sources could include an
institution’s press clipping service or an altmetric
provider. Mendeley, Scopus (Article Metrics module),
Pure and SciVal also report on mass media.
compares items of same age, subject area & document type over an 18-month window
PERCENTILE BENCHMARK (ARTICLES)
https://libraryconnect.elsevier.com/metrics
The higher the percentile benchmark, the better. This
is available in Scopus for citations, and also for
Mendeley readership and tweets. Particularly useful
for authors as a way to contextualize citation counts
for journal articles as an indicator of academic impact.
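One common way to compute such a percentile is the share of comparable items an article outperforms; a sketch under that assumption (the peer citation counts are invented, and Scopus's exact method may differ in its tie handling):

```python
def citation_percentile(article_citations, peer_citations):
    """Percentile benchmark: percentage of comparable items (same age,
    subject area and document type) with fewer citations than this article."""
    below = sum(1 for c in peer_citations if c < article_citations)
    return 100.0 * below / len(peer_citations)

# An article with 12 citations against 20 hypothetical peer items:
peers = [0, 1, 1, 2, 3, 3, 4, 5, 5, 6, 7, 8, 9, 10, 11, 13, 15, 18, 22, 30]
print(citation_percentile(12, peers))  # 75.0
```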
Article level metrics in Scopus
• Scholarly Activity (e.g. Mendeley readers)
• Scholarly Commentary
- Altmetric.com
- Blogs, Wikipedia
• Mass Media
- Google+
Article level metrics:
Social Activity
Scholarly Activity
Scholarly Commentary
Mass Media
IV. Author profile
Conclusions
Scopus is the only database provider to use a combination of algorithmic processing and manual corrections on such a large A&I database.
As a result, users benefit from some of the best author profiles in the industry.
We realize that author profiles are not perfect, but we strive for the best possible quality.
Through constant algorithm enhancements, profile re-evaluations, manual feedback and professional profiling services (i.e. SciVal Experts), our profiles are increasing in both accuracy and completeness.
Scopus Profiles are…
• Comprehensive (~12 million author profiles and 8.5 million affiliation profiles)
• Easy to integrate (via RSS or the Scopus APIs)
• Widely used (interoperable with ORCID, VIVO)
• Algorithmically created, and can be manually updated and corrected (unlike the competition)
Two ways profiles are used…
For Research (qualitative):
• I search for a paper and then use author names to look for more content so I can learn more.
• I look for an author so that I can see their work or collaborate with them.
For Metrics (quantitative):
• I want to know how many papers a particular author puts out so I can measure them (or myself!).
Scopus Author Profiles
The most powerful ALGORITHMIC data processing in the industry: groups papers into a profile with a high degree of accuracy, based on matching of name, email, affiliation, subject area, citations, co-authors, …
MANUAL feedback via the Author Feedback Wizard: the starting point from the algorithmic profiles is combined with manual feedback to create the most accurate profiles with the least effort.
An algorithm processes every new article…
INPUT: an incoming article; a ‘temporary’ profile is created.
STEP 1: Search existing profiles & find candidates (e.g. finds 1,000 matches).
STEP 2: FILTER on name, publication year gap and co-authors (e.g. now 300 matches remaining); unmatched profiles are removed (e.g. 700).
STEP 3: MERGE ① on the strongest criterion, affiliation; MERGE ② on normalized keywords, title and abstract; MERGE ③ on references, source title, subject area and co-authors.
STEP 4 / OUTPUT: for an uncommon name, the ‘temporary’ profile becomes a new single-author profile; for a common name, the ‘temporary’ profile gets merged with an existing profile.
New algorithm enhancement over the current process: the profile is searchable while a new article is being processed.
How do we look at author profile quality?
• Precision: the average percentage (%) of articles in a profile that belong to the same author
• Recall: the average percentage (%) of an author's publications that are in the author's largest single profile
Current levels for the entire dataset:
Precision – 97.24 (+/- 1.6)%, Recall – 92.59 (+/- 1.6)%
Expected improvements:
Precision – 99%, Recall – 95%
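The two quality measures can be illustrated on a toy profile (the document IDs and counts below are invented; Scopus reports these as averages over the whole dataset):

```python
def profile_precision(profile_docs, author_docs):
    """Precision: % of documents in the profile that truly belong to the author."""
    correct = len(set(profile_docs) & set(author_docs))
    return 100.0 * correct / len(profile_docs)

def profile_recall(profile_docs, author_docs):
    """Recall: % of the author's documents captured by the profile."""
    correct = len(set(profile_docs) & set(author_docs))
    return 100.0 * correct / len(author_docs)

# Hypothetical author with 10 papers; the profile holds 9 of them
# plus 1 paper by a namesake.
profile = ["p1", "p2", "p3", "p4", "p5", "p6", "p7", "p8", "p9", "x1"]
truth = ["p1", "p2", "p3", "p4", "p5", "p6", "p7", "p8", "p9", "p10"]
print(profile_precision(profile, truth))  # 90.0
print(profile_recall(profile, truth))     # 90.0
```

High precision means few wrongly attached papers; high recall means few of the author's papers scattered into other profiles.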
Why do we need feedback (in some cases) to reach 100% profile accuracy?
• Variations in input metadata (received from publishers) make it impossible to profile with 100% accuracy
• Authors with high-frequency names publishing in the same fields, and in some cases within the same affiliation and department
• Authors changing the field in which they publish
• Authors moving to a different affiliation
• Women changing their last name upon marriage
• Authors publishing with a new name variant
• Mixed name styles: Hong Kong, Singapore, Chinese authors in Western countries
• Variation in the use of local or English language for affiliations in input articles
- e.g. Free Universities of Brussels (English) and Université Libre de Bruxelles (French)
Request author details corrections
For simple merges and splits:
www.scopusfeedback.com
Turnaround time: 36 hours
Request author details corrections
1. Scopus – ORCID integration
Many researchers too closely resemble one another.
Dr. Smith Dr. Smith Dr. Smith
Researchers publish
under name variations.
Dr. Smith
Dr. J. Smith
Dr. James Smith
The Challenge: Scholarly Name Ambiguity
Dr. James Smith
46533489
ORCID Mission:
ORCID aims to solve the name
ambiguity problem in research
and scholarly communications by
creating a central registry of
unique identifiers for individual
researchers
The Solution: The ORCID Registry (Open Researcher and Contributor ID)
Dr. Smith
Dr. J. Smith
Dr. James Smith
Enter via Scopus2ORCID Wizard or from ORCID
ORCID.org
Scopus2ORCID: Easy ORCID Set Up
orcid.scopusfeedback.com
2. Author level metrics
# of items published by an individual or group of individuals
DOCUMENT COUNT
A researcher using document count should also provide
a list of document titles with links. If authors use an
ORCID iD – a persistent scholarly identifier – they can
draw on numerous sources for document count
including Scopus, ResearcherID, CrossRef and PubMed.
Register for an ORCID iD at http://orcid.org.
https://libraryconnect.elsevier.com/metrics
the h-index indicates both the number of publications and
the number of citations per publication
H-index
1. h-index : Measures the productivity and impact of a scientist’s published work
The h-index: Hirsch index or Hirsch number
In other words: an author has an index of 18 if he has published at least 18 papers, each of which has been cited at least 18 times. (The h-index was proposed by Jorge E. Hirsch in August 2005.)
# of articles in the collection (h) that have received at least (h) citations over the whole period
h-INDEX
For example, an h-index of 8 means that 8 of the
collection’s articles have each received at least 8 citations.
h-index is not skewed by a single highly cited paper,
nor by a large number of poorly cited documents.
This flexible measure can be applied to any collection
of citable documents. Related h-type indices emphasize
other factors, such as newness or citing outputs’ own
citation counts (http://www.harzing.com/pop_hindex.htm).
https://libraryconnect.elsevier.com/metrics
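The h-index definition above translates directly into code; a minimal sketch with invented citation counts:

```python
def h_index(citation_counts):
    """h-index: the largest h such that h documents in the collection
    have received at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
        if cites >= rank:
            h = rank  # still h papers with >= h citations each
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4 (four papers cited at least 4 times)
print(h_index([100, 2, 1]))       # 2
```

The second example shows the property noted above: one highly cited paper does not inflate the index.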
How to track the impact of your publications?
2. Citations per year: the total number of citations received per year for an author's published work
How to track the impact of your publications?
3. By monitoring your article:
Social Activity
Scholarly Activity
Scholarly Commentary
Mass Media
extent to which a research entity’s documents are present in the most-cited percentiles of a data universe
OUTPUTS IN TOP PERCENTILES
https://libraryconnect.elsevier.com/metrics
Found within SciVal, Outputs in Top Percentiles can
be field weighted. It indicates how many articles are
in the top 1%, 5%, 10% or 25% of the most cited
documents. It is a quick way to benchmark groups of researchers.
V. The importance of rankings
Do rankings matter?
Students & Parents:
85% of students find university rankings important in their selection of a university.
33% of students rate university ranking as the most important factor, followed by employer recognition at 21%.
(Source: University selection by students, IDP Research)
University Management:
The Russian 5/100 program aims to have at least 5 Russian universities in the top 100 universities by 2020.
Policy Makers:
David Willetts (former UK Universities & Science minister): “We broadly accept the criteria used by the THE, which is why our policies are focused on the same areas.”
All rankings have their strengths and potential
disadvantages and we do not rank the rankings!
• We believe in working on fundamentals with a “basket of
metrics”, always as a complement to peer opinion
• Informed decisions are better decisions
• Metrics should complement, not replace human judgment
• Well-selected metrics drive positive behaviors
• Metrics do not only mean bibliometrics
• Metrics can help monitor and eliminate biases
• Assessments are costly, but availability of new tools help bring
cost down
• Data sources to cover humanities are becoming more
complete
A quick recap of Elsevier’s overall position
on university rankings and metrics in general
Scopus data underpins
a portfolio of research solutions and tools
SCOPUS DATA
Scopus.com SciVal.com
Analytical
Services
APIs
Scopus
Custom Data
Pure
Mendeley
…
METRICS
RESEARCH OUTCOMES
*Analytical Services refers to the use of Scopus Custom Data (and other data) in
reports, assessment exercises, rankings and other commercial Custom Data projects.
Bibliometric data providers and ranking agencies
• ~30% of ranking agencies use bibliometric data, and of those, 90% use citations in some form.
The 70% that do not rely on bibliometric data draw, among other sources, on (student) surveys,
research funding, admissions and entrance data
• Three global ranking agencies dominate: ARWU, QS and THE. QS and THE use Scopus as their
exclusive source of bibliometric data; ARWU uses TR data for its world ranking but Scopus for its
China rankings
• Of the 15 global ranking agencies that use bibliometric data, 8 use Scopus, 5 of them
exclusively.
Scopus provides bibliometric data for these rankings:
(Legend: Scopus unique / Shared with TR)
A closer look at…
• Uses Scopus:
- 2007-2009: Under QS
partnership
- Since Oct. 2014: Directly from
Elsevier
• Existing rankings: World University,
Reputation, Asia, Top 100 Under 50
(Young Universities), Africa
Universities, BRICS & Emerging
Economies, Rankings by Subject
• Data provided: citation score,
number of papers per faculty, number
of internationally co-authored papers
• Affiliations: SciVal institutions
• Support:
- Reputation data: Elsevier runs
the reputation survey using
Elsevier author list for THE
- Affiliation handling: Affiliation
corrections, mergers, splits, etc.
handled with THE for the
respective universities
A closer look at…
• Uses Scopus since 2007
• Rankings: World University, World
University Rankings by Subject, Asia,
Arab Region, BRICS, EECA, Latin
America, Top 50 Under 50
• Data provided:
- Scopus raw data
- Sharing SciVal institution profiles
- (Re)classification of
multidisciplinary articles
• Affiliations: Scopus profiles
• Support is provided at the level of
Scopus data corrections incl. those
related to affiliations
Appendix
Ongoing content curation of the Scopus journal base to ensure
continued high-quality content
Review:
• Identification of poor-performing journals using metrics and benchmarks
• “Radar” to predict journals with outlier performance
• Direct feedback from users and stakeholders on poor-performing journals
Curate:
• Re-evaluation by the Content Selection & Advisory Board (CSAB)
Content Curation
Curation of the full Scopus journal base is essential and expected by
our customers and users.
Evolution of research metrics and analytical tools to
track your impact in Scopus
2016 – Journal metrics module
2015 – Article-level metrics module
2014 – SciVal launches
2013 – Increased Export limits
2012 – Modified SNIP & SJR, Altmetric, Analyze results
2011 – Export refine
2010 – SNIP & SJR Journal Metrics
2009 – Author Evaluator
2008 – Journal Analyzer
2007 – h-index graph
2006 – Citation Overview (Citation Tracker)
2004 – Scopus launches
Get Involved
https://blog.scopus.com/get-involved
Thank you!
[…] the intertwined tree and vine represent
a fruitful relationship […]
the logo represents […] the symbiotic
relationship between publisher and
scholar. The addition of the Non Solus
inscription reinforces the message that
publishers, like the elm tree, are needed to
provide sturdy support for scholars, just as
surely as scholars, the vine, are needed to
produce fruit.
Publishers and scholars cannot do it
alone. They need each other. This
remains an apt representation of the
relationship between Elsevier and its
authors today: neither dependent nor
independent, but interdependent.
Non Solus – Not Alone