Presented to members of the Psychology department as part of the New Tricks Seminar series (February 2016)
• journal metrics using WoS and Scopus
• article-level metrics in WoS, Scopus and Google Scholar, and from publishers, and the differences in each. Touch on altmetrics.
• author metrics in the above. Touch on Publish or Perish
Tanya Williamson, Academic Liaison Librarian
2. Overview
Learn about the most common bibliometrics, how to find them,
the data they are based on, and their limitations.
1. What is citation analysis?
2. Journal metrics
3. Author metrics
4. Article level metrics and altmetrics
3. • Citation analysis is the quantitative analysis of research
publications and citations
• Based on the assumption that citations = academic impact
• Bibliometrics are the measures produced by citation analysis
What is citation analysis?
Citation counts in Web of Science
Graphic from Eigenfactor.org
Citation counts in Scopus
Article-level metrics in Scopus
4. SciVal and InCites are research intelligence tools based on citation data +
Key tools for finding bibliometric data
Product: Web of Science (incl. Journal Citation Reports)
Pros:
• Citation data back to 1945
• Produces the Journal Impact Factor
• Can compare journals within a subject category
Cons:
• Less intuitive user interface
• English language bias
• Excludes some new OA journals
• Journal Citation Reports does not include journals which are solely concerned with Arts & Humanities

Product: Scopus
Pros:
• Broader coverage than WoS
• Produces IPP and SNIP
• Source data for the REF and World University Rankings
• Can compare journals across subjects
Cons:
• Citation data back to 1996 (backfilling to 1972)
• Journal comparisons limited to 10 titles
• No subject category rank
• English language bias

Product: Google Scholar
Pros:
• Free and easy to use
• Very broad coverage, i.e. yields many results
• Good coverage of grey literature and books
• ‘Top journals’ lists by language and h-index
Cons:
• No clear coverage policy
• Author metrics rely on authors creating profiles or using additional software, e.g. Publish or Perish
5. An attempt to compare the influence and impact of journal titles in a particular
discipline based on citations received.
Can be useful to inform decisions on strategic publishing alongside judgements
about reaching the best audience for the research.
Commonly used metrics
• Journal Impact Factor available through Journal Citation Reports, based on
Web of Science data
• SJR (SCImago Journal rank) based on Scopus data
• SNIP (Source Normalised Impact per Paper) based on Scopus data
Journal metrics
7. The Journal Impact Factor is the average number of times articles
from the Journal published in the past two years have been cited
in the JCR year.
Also available: 5 year Journal Impact Factor
Journal Impact Factor
An example: Cognition
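The calculation described above can be sketched with made-up figures (every number below is invented purely for illustration):

```python
# Worked Journal Impact Factor calculation with invented figures.
# JIF for year Y = citations received in Y to items published in Y-1 and Y-2,
# divided by the number of citable items published in Y-1 and Y-2.

citations_2015_to_2013_14_items = 1200  # hypothetical citation count
citable_items_2013 = 150                # hypothetical article counts
citable_items_2014 = 170

jif_2015 = citations_2015_to_2013_14_items / (citable_items_2013 + citable_items_2014)
print(round(jif_2015, 2))  # 3.75
```

The 5-year Journal Impact Factor uses the same division, just with a five-year publication window in the denominator.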
8. • Based on source data from Web Of
Science
• “The theory behind the eigenfactor
metrics is that a single citation from
a high-quality journal may hold
more value than multiple citations
from more peripheral publications.”
Eigenfactor and Article Influence
An example: Cognition
Demo of Journal Citation Reports
9. • Source Normalised Impact per Paper (SNIP) – enables comparison between disciplines with different citation conventions
• Impact per Paper (IPP) – same method as the JIF but with different source data, and based on 3 years of citations
Scopus - SNIP and IPP
10. Scopus – Compare Journals
Select and compare up to
10 journals
Demo of Scopus
11. SCImago journal rank
An example: Developmental and Educational Psychology
• SCImago Journal Rank uses source data from Scopus
• Based on the Google PageRank algorithm
• Weights citations from high prestige journals and attempts to
balance the influence of the size of a journal
Developmental and Educational Psychology category rank
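A rough sketch of the PageRank-style idea behind SJR, where a citation from a prestigious journal counts for more than one from a peripheral journal. All citation counts here are invented, and real SJR adds further normalisation (journal size, a three-year citation window) that this toy omits:

```python
# Toy power-iteration sketch of a PageRank-style prestige score.
# cites[i][j] = citations from journal i to journal j (invented numbers).
cites = [[0, 5, 1],
         [2, 0, 3],
         [4, 1, 0]]
n = len(cites)

# Row-normalise: the share of journal i's outgoing citations that go to j.
totals = [sum(row) for row in cites]
share = [[c / t for c in row] for row, t in zip(cites, totals)]

d = 0.85                  # damping factor, as in Google PageRank
prestige = [1 / n] * n    # start with equal prestige
for _ in range(100):
    prestige = [(1 - d) / n
                + d * sum(share[i][j] * prestige[i] for i in range(n))
                for j in range(n)]
print([round(p, 3) for p in prestige])
```

The scores always sum to 1; a journal's share grows when journals that are themselves well cited send citations its way.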
12. • The Journal Impact Factor works poorly, or not at all, for some disciplines due to different publishing patterns and coverage of different publication types
• The Journal Impact Factor must not be used to compare different disciplines
• Focusing only on journal metrics could lead to you overlooking
smaller, specialist or emerging publications
• Dependent on the coverage and biases of the source data i.e.
will include only citations from other items in the database
• The overuse of the JIF has encouraged gaming and false
precision
• Can we infer individual impact from journal impact?
Limitations of journal metrics
13. Key metrics
• Citation counts - uses a particular dataset to count how many
documents an author has had published, and how many
citations those documents have received over time
• h-index, devised by J. E. Hirsch in 2005
• There are several ‘improvements’ on the h-index, e.g. the g-index, the hc-index, the hI-index, the ACW index, but none has gained the same popularity as the h-index
Author metrics
Easy explanation:
If a scholar has 20 articles which have each
been cited 20 times, s/he has an h-index of 20
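The h-index can be computed by sorting an author's citation counts; a minimal sketch:

```python
def h_index(citations):
    """h-index: the largest h such that the author has h papers
    each cited at least h times."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# The 'easy explanation' case: 20 articles each cited 20 times.
print(h_index([20] * 20))          # 20
# A more typical record (invented counts): only 4 papers have >= 4 citations.
print(h_index([10, 8, 5, 4, 3]))   # 4
```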
16. Google Scholar
• A way to find your own h-index
• The author needs to have a
Google Scholar profile
Author metrics from
Google Scholar
Publish or Perish
• Uses Google Scholar data
• Many nuanced author
metrics
• Needs careful checking of
results for duplicates and
false results
Demo of Google Scholar
17. • Name ambiguity – getting a comprehensive, accurate list is not
easy, even in Web of Science
• No one metric can capture all citations or ‘impacts’
• May not be enough publications to be valid
• Early career researchers will be at a disadvantage
• Results will vary based on the source data
• Citations ≠ endorsements of quality!
Limitations of author metrics
Resistance against the
h-index, ImpactStory
18. • Citation data in Web of Science, Scopus, Google Scholar
• Publishers’ interfaces often include citing articles
Metrics
• Total cites
• Cites per year
• Average cites per paper, or per year
• Field Weighted Citation Impact
• Benchmark percentile (based on age of paper and subject area)
• Usage: downloads and views
• Emerging ‘altmetrics’
Article level metrics
24. “Article-Level Metrics (ALMs) leverage
the acceleration of research
communication made possible by the
networked landscape of researcher
tools and services. Also by
incorporating the manifold ways in
which research is disseminated, these
article impact indicators are made
available rapidly after publication and
are continually updated.”
PLOS One Article Level Metrics Information
Frontiers: Article level metrics for an OA
article published in Frontiers in Psychology
Article level metrics
25. Alternative metrics or
altmetrics
ImpactStory: like a living publications CV.
Adding metrics all the time
https://impactstory.org/metrics
• Emerging alternative metrics which capture the
attention that articles/works receive online
• Shows the broader, societal attention which may or
may not translate into citations
Loop: metrics presented on an author’s profile
26. Altmetrics
Altmetric.com: watch social media sites,
newspapers, government policy
documents and other sources for
mentions of scholarly articles to
compile article level metrics.
27. Example: A highly cited clinical article from 1995?
Limitations of article level
and altmetrics
Source of data – Citations to article
• Web of Science: 2214
• Google Scholar: 3184
• Scopus: 2500
• Publisher’s website: 1626
Example:
Martinez, F. D., Wright, A. L., Taussig, L. M., Holberg, C. J., Halonen, M., &
Morgan, W. J. (1995). Asthma and wheezing in the first six years of life. New
England Journal of Medicine, 332(3), 133-138.
Citation data from 10/2/2016
28. • There are multiple sources of citation data to consider
• No single source is perfect, all have different coverage
• Citation counts/metrics alone only tell part of the story
• Publishing and citing patterns are different in different
disciplines (and even within disciplines like Psychology)
• ‘Outputs’ other than journal articles and conference
proceedings may not be well-represented
• Altmetrics give a view of the online attention surrounding a
work
Summary
29. • Journal metrics (Elsevier)
• The State of Journal Evaluation (Thomson Reuters)
• Publish or Perish (Harzing)
• Measure Your Research Impact toolkit (Irish academic libraries)
• Leiden manifesto as published in Hicks, Diana, et al. "The Leiden Manifesto
for research metrics." Nature 520 (2015): 429-431.
• PlumX from Plum Analytics: another altmetrics tool not discussed in this
presentation
The University has a current subscription to SciVal and InCites, which are
research intelligence tools which include benchmarking based on citation
data from Scopus and WoS respectively.
Follow up resources
Editor’s notes
Learn about the most common bibliometrics used to measure research impact and influence. Have a go at finding a journal's impact factor and calculating the h-index.
This is a complex area, I do not aim to teach you how to be bibliometricians – I’m not one myself!
Ask: I’d like to gauge where your priorities lie for the session today. Is there something specific you hope to get out of the session?
This is a complex area, I do not aim to teach you how to be bibliometricians – I’m not one myself!
There are metrics that you can access, with source data, which can help you to gain a picture of research impact. Ideally bibliometrics should be used alongside other ‘measures’ of impact or quality, which include:
-- successful funding applications
-- influence on policy/society
-- peer review
-- analysis of public engagement
We will look into a range of metrics and their limitations
Two versions of JCR – basic and InCites (graphical). Not much difference; basic is probably easier, though InCites allows you to use BOTH the Science Citation Index and the Social Sciences Citation Index.
Journal impact factors are in the Journal Citation Reports database, which ranks journals using citation data from the ISI Citation Indexes in Web of Knowledge, based on the previous 2 years.
The 5-year impact factor extends the citation data over 5 years.
The Eigenfactor score aims to measure a journal’s total influence using a PageRank-style algorithm that weights citations by the influence of the citing journal.
SCImago (see mah go)
The Impact Factor is calculated by dividing the number of citations in the JCR year by the total number of articles published in the two previous years.
An Impact Factor of 1.0 means that, on average, the articles published one or two years ago have been cited one time. An Impact Factor of 2.5 means that, on average, the articles published one or two years ago have been cited two and a half times.
The citing works may be articles published in the same journal. However, most citing works are from different journals, proceedings, or books indexed by Web of Science.
Eigenfactor scores are scaled so that the sum of the Eigenfactor scores of all journals listed in Thomson's Journal Citation Reports (JCR) is 100. In 2006, the journal Nature has the highest Eigenfactor score, with a score of 1.992. The top thousand journals, as ranked by Eigenfactor score, all have Eigenfactor scores above 0.01.
A journal's Article Influence score is a measure of the average influence of each of its articles over the first five years after publication.
Article Influence score measures the average influence, per article, of the papers in a journal. As such, it is comparable to Thomson Scientific's widely-used Impact Factor. Article Influence scores are normalized so that the mean article in the entire Thomson Journal Citation Reports (JCR) database has an article influence of 1.00.
In 2006, the top journal by Article Influence score is Annual Reviews of Immunology, with an article influence of 27.454. This means that the average article in that journal has twenty seven times the influence of the mean journal in the JCR.
SNIP is the ratio of a source's average citation count per paper and the citation potential of its subject field.
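As a toy illustration of that ratio (both figures below are invented):

```python
# SNIP sketch: a journal's raw citations per paper divided by the
# citation potential of its subject field.
raw_impact = 3.2          # journal's average citations per paper (invented)
citation_potential = 1.6  # relative citation potential of the field (invented)
snip = raw_impact / citation_potential
print(snip)  # 2.0
```

Dividing by the field's citation potential is what lets SNIP be compared across disciplines with different citing conventions.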
Si-MAhG-o
Global data – country ranks
A common one is the h-index – a peculiar measure. Anybody heard of it? What does it mean? What value is placed on it?
Not an average (though you can get average citations per article from WoS).
Demo WoS
Topic: wheezing in children
Sort: most highly cited
Choose author: Martinez
Create Citation Report
Mention that you would actually have to work a little harder to ensure that you have everything in the database by this author.
It is possible to create a Scholar profile which enables you (and others, if the profile is public) to view citations to your work and citation analysis.
The i10-index indicates the number of academic publications an author has written that have at least ten citations from others. It was introduced in July 2011 by Google as part of their work on Google Scholar, a search engine dedicated to academic and related papers.
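For example, the i10-index on a hypothetical citation record:

```python
# i10-index: number of publications with at least ten citations (invented data).
citations = [102, 56, 12, 10, 9, 3, 0]
i10 = sum(1 for c in citations if c >= 10)
print(i10)  # 4
```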
Growing area of development in bibliometrics, that goes hand in hand with the Open Access movement. Rather than using the journal metrics as a proxy for article quality, it’s now possible to split that link and focus on the actual article – Have been able to do this in Web of Science for a long time.
Highlight differences in the count.
From OA journal PLOS One (Public Library of Science). No citations from other
Altmetric.com focuses, like traditional metrics, on journal articles.
ImpactStory looks at datasets, slides, videos, patents, etc., not just articles.
(>2000 cites) Older articles won’t necessarily get fresh attention.
Which source do you trust?