One of the basic functions of the SciELO Publishing Model is to monitor the performance of journals, national collections, the network and the overall program. In the context of national collections, which in most cases are financed by public resources and are highly selective regarding indexing, journals are expected to perform well, in line with the specific objectives of SciELO to contribute to a sustainable increase in their editorial quality, visibility, use and impact. In addition to the specific objectives that apply to the entire network, national collections are governed by priorities determined by national policies and conditions.
The performance of journals and SciELO collections is evaluated by the following criteria:
• Institutionality, which refers to the institutions responsible for the journals and their respective research communities as indicators of the credibility and operational sustainability of journals;
• Good practices of editing and scholarly communication, which refers to adherence to the SciELO indexing criteria, which in turn implies adherence to good practices of scholarly communication and the adoption of innovations;
• Visibility, use and impact, which refer to the following contexts:
◦ Access and download indicators for articles’ full-text files in HTML and PDF formats;
◦ Citation indicators or metrics across different journal indexes;
◦ Web presence indicators or altmetrics.
The scope proposed for this working group encompasses the analysis of journal performance in accordance with the above criteria, taking into account the specificities of different thematic areas and different countries. The analysis and discussion of these three dimensions will be conducted by scholarly communication and bibliometrics experts with the support of representatives of national collections, journal editors and specialists.
Stephanie Faulkner - Understanding the performance of SciELO journals based on CiteScore, citation index comparisons and PlumX metrics
www.plumanalytics.com
Understanding the performance of SciELO journals based on CiteScore, citation index comparisons and PlumX metrics
24 September 2018
Stephanie Faulkner | Product Manager
S.Faulkner@Elsevier.com | +1.732.216.5104
Supporting and developing metrics for the full lifecycle of research
Research Metrics at Elsevier
Our Approach
We strive for broader use of open metrics that can be used by everyone responsibly.
Elsevier Research Metrics Guidebook: https://www.elsevier.com/research-intelligence/resource-library/research-metrics-guidebook
CiteScore
• CiteScore is a simple way of measuring the citation impact of serial titles such as journals. Serial titles are defined as titles which publish on a regular basis (i.e. one or more volumes per year).
https://journalmetrics.scopus.com/index.php/Faqs
CiteScore Tracker
• CiteScore Tracker is calculated in the same way as CiteScore, but for the current year rather than previous, complete years. The CiteScore Tracker calculation is updated every month, as a current indication of a title's performance.
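The monthly-update behaviour can be sketched as a running ratio. This is an illustrative simplification with made-up counts, not Elsevier's actual implementation:

```python
# Illustrative sketch of a CiteScore-Tracker-style running value (assumed
# simplification): the same ratio as CiteScore, but the numerator is the
# number of citations accumulated so far in the current year.

def citescore_tracker(monthly_citations, docs_prior_3_years):
    """Tracker value after each monthly update (hypothetical counts)."""
    running, tracker = 0, []
    for count in monthly_citations:
        running += count
        tracker.append(running / docs_prior_3_years)
    return tracker

# Made-up data: citations received in each of the first three months of the
# current year, against 200 documents published in the prior three years.
print(citescore_tracker([10, 20, 30], 200))  # → [0.05, 0.15, 0.3]
```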
CiteScore highlights for SciELO
• 9 journals are in the 75th-89th percentile:
- Clinics (88%)
- Revista Brasileira de Politica Internacional (86%)
- Journal of Venomous Animals and Toxins Including Tropical Diseases (85%)
- Revista Brasileira de Parasitologia Veterinaria (82%)
- Journal of Applied Oral Science (79%)
- Anais da Academia Brasileira de Ciencias (78%)
- Brazilian Archives of Biology and Technology (76%)
- Rodriguesia (75%)
- Scientia Agricola (75%)
CiteScore highlights for SciELO
• 49 journals rank in the second quartile (74%-50%)
• 136 journals rank in the third quartile (49%-25%)
• 109 journals rank in the fourth quartile (24%-0%)
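Under the percentile bands quoted on these slides, quartile membership comes down to a simple threshold check. This is an illustrative helper, not an official Scopus calculation:

```python
# Assign a journal to a CiteScore quartile from its percentile rank in its
# subject category, using the bands quoted on the slides (illustrative only).

def quartile(percentile):
    """1 = top quartile (75-100%), 4 = bottom quartile (0-24%)."""
    if percentile >= 75:
        return 1
    if percentile >= 50:
        return 2
    if percentile >= 25:
        return 3
    return 4

print(quartile(88))  # → 1 (e.g. Clinics at the 88th percentile)
print(quartile(30))  # → 3
```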
Defining altmetrics
• Altmetrics = alternative ways of measuring the use and impact of scholarship
• Altmetrics combine traditional impact measures (citation counts) with non-traditional measures
• Altmetrics = ALL METRICS
“Altmetrics are measures of scholarly impact mined from activity in online tools and environments” – Jason Priem, Co-Founder, ImpactStory
Using Altmetrics to Illustrate the Impact of Open Access on Graduate Student Research (Barnett, Collister, Chan 2014)
Metrics Categories
• USAGE (clicks, downloads, views, library holdings, video plays)
• CAPTURES (bookmarks, code forks, favorites, readers, watchers)
• MENTIONS (blog/news posts, comments, reviews, Wikipedia links)
• SOCIAL MEDIA (tweets, likes, shares, comments)
• CITATIONS (citation indexes, patent citations, clinical citations)
USAGE: clicks, downloads, views, library holdings, video plays
• Is anyone reading our work?
• Did anyone watch our videos?
• Usage is the #1 stat researchers want to know after citation counts
• PlumX is the only product that includes usage
MENTIONS: blog posts, news posts, book reviews, Wikipedia references
• This category measures people truly engaging with your research
• Automatically uncover the conversations about your research
• Discover feedback, opinions, etc.
SOCIAL MEDIA: shares, likes, comments, tweets
• Social media measures how well a publisher or researcher is promoting their work
• Track the buzz and attention around your research output
• This is especially important for early career researchers to measure and understand
Summary
• CiteScore metrics help you measure citation impact in a comprehensive and transparent way.
• CiteScore Tracker provides a way to see how your journal is performing in the current year.
• CiteScore Rank & Trends lets you see how your journal ranks against other journals in the same category.
• PlumX Metrics and the SciELO PlumX Dashboard give you article-level metrics, organized by country/journal and subject area/journal.
• The Subgroup Overview in the PlumX Dashboard allows you to benchmark by country, by journals in a country and by journals in a subject area, helping identify high and low performers and find insights about what is driving high performance.
Our Metrics Manifesto:
• Use different metrics and common sense.
• Decisions should be based on both quantitative and qualitative input.
• Always use at least two metrics (there is more than one way to ‘excellence’).
• The methodologies should be open, transparent, valid and replicable.
The Elsevier Research Metrics Guidebook is a great resource as well. https://www.elsevier.com/research-intelligence/resource-library/research-metrics-guidebook
Today, I’m going to talk about CiteScore for journal impact and PlumX Metrics for article and journal impact.
Impact metrics have been around since the 1920s. Eugene Garfield came up with the Impact Factor in the 1960s. A commercially available version was produced in 1975. It remained the standard measure of journal impact and performance for about three decades, until Eigenfactor, SNIP and SJR came along in 2005.
In 2015, Elsevier wanted to provide an alternative impact metric that complements the SNIP and SJR metrics to give more robust indicators of a serial title’s performance on Scopus.
CiteScore metrics are a new standard to help you measure citation impact for journals, book series, conference proceedings and trade journals. They are comprehensive, transparent, current and free, and are calculated using data from Scopus.
CiteScore does not discriminate: if a title can be cited, CiteScore will count it.
CiteScore metrics incorporate eight different metrics:
• CiteScore
• CiteScore Tracker
• CiteScore Percentile
• CiteScore Quartiles
• CiteScore Rank
• Citation Counts
• Document Counts
• Percentage Cited
The method of calculation for CiteScore 2017 is illustrated here.
CiteScore calculates the average number of citations received in a calendar year (the numerator) by all items published in that journal in the preceding three years (the denominator). The calendar year to which a title's issues are assigned is determined by their cover dates, and not the dates that the issues were made available online.
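The ratio described above can be sketched as follows; the counts are made up for illustration and are not taken from any real journal:

```python
# CiteScore 2017 = citations received in 2017 to items published in
# 2014-2016, divided by the number of items published in 2014-2016.

def citescore(citations_in_year, docs_in_prior_3_years):
    """Average citations per document over the three-year window."""
    if docs_in_prior_3_years == 0:
        return 0.0
    return citations_in_year / docs_in_prior_3_years

# Hypothetical example: 450 citations in 2017 to 300 documents
# published in 2014-2016.
print(citescore(450, 300))  # → 1.5
```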
On this public source page in Scopus, you can see the CiteScore for 2017 and how it was calculated. Everything is linked so you can see the citations and documents used to calculate the score.
So on the same Scopus Source page, below the CiteScore, you see the 2018 tracker. The tracker allows you to monitor, in real time, how your CiteScore result for the next year is building.
You also see the SJR and the SNIP on the Scopus Source details page, which is publicly available. All of these contribute to Elsevier’s basket of metrics.
There is also a widget you can use to embed CiteScore into your website.
Above the CiteScore, there’s a link to view CiteScore Rank and Trends.
In the analysis module of Scopus, you can actually compare journals using CiteScore, SNIP, SJR and citations.
You can see how your journal ranks against other journals in the same category. In this example, I used CiteScore as the metric for comparison.
Remember, this is all publicly available – you don’t need a Scopus subscription to see your journal’s metrics.
Also on the Scopus source details page is a link to “compare sources”. Clicking on the compare sources link lets you compare that journal to other journals indexed in Scopus using a variety of metrics.
You can choose other journals indexed in Scopus to benchmark against and compare based on CiteScore, SJR, SNIP, Citations and more. In this example, I’m comparing the Revista de Saude Publica to the Revista Portuguesa de Saude Publica based on CiteScore.
We performed an informal analysis of SciELO journals and their CiteScores. We looked at 193 journals on the SciELO and Scopus platforms to see how these journals fared by percentile and quartile. The percentile ranking indicates the relative standing of a serial title in its subject field. This is the only SciELO journal in the top 10%: that means this journal is ranked higher than 90% of the journals in the same category.
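As a rough sketch, a percentile rank of this kind can be computed as the share of journals in the same category with a lower score. This simplified version (with invented scores) ignores the tie-handling details of the official Scopus method:

```python
# Simplified percentile rank: percentage of journals in the same subject
# category whose CiteScore is below the given score (invented data;
# Scopus's official tie-handling is not reproduced here).

def citescore_percentile(score, category_scores):
    below = sum(1 for s in category_scores if s < score)
    return 100.0 * below / len(category_scores)

# A made-up category of ten journals:
category = [0.2, 0.4, 0.5, 0.8, 1.0, 1.1, 1.5, 2.0, 2.4, 3.1]
print(citescore_percentile(2.4, category))  # → 80.0
```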
Also in the first quartile you have an additional 9 journals. So 10 of the 193 journals are in the 1st quartile based on their CiteScore.
Another 49 journals make it into the second quartile which is 74% to 50%.
136 journals were in the 3rd quartile (49%-25%) and 109 in the 4th quartile (24%-0%).
I really encourage you to view your journal’s CiteScore, the tracker and even run some comparisons to see how your journals are performing against others in the same subject area.
Even other publishers are talking about CiteScore. Here’s a blog post from Taylor & Francis discussing CiteScore.
In addition to looking at journal impact via CiteScore, you will also have the ability to view your journal’s impact using PlumX Metrics and a PlumX Dashboard. PlumX Metrics provide a comprehensive view of article level metrics. Before we look at SciELO specific PlumX Metrics, I want to give you a brief overview of PlumX Metrics.
Before I go into detail about PlumX Metrics, I want to talk to you about how we define “Altmetrics”. We believe we represent ALL METRICS, not just alternative metrics. And PlumX is the only metrics provider that includes usage.
We developed 5 categories of metrics to represent the data exhaust that we are collecting. Usage, Captures, Mentions, Social Media and Citations. I will give you a brief overview of each category.
Usage is very important – knowing how many times a particular article or artifact is downloaded, viewed or played (if it’s a video). One of the usage metrics we have for books is how many libraries hold a book.
Captures go beyond viewing the artifact. Someone is saving it or bookmarking it for later, or adding it to Mendeley or CiteULike. This indicates that the person wants to come back to this piece of research. Several studies have shown that captures are a really good indicator of future citations. People save and store work that they will come back to when they are ready to write their paper or create their research output.
Mentions show engagement – someone is commenting, blogging or reviewing research or even referencing it in a Wikipedia article. It’s where we say the stories are hidden.
Many people think of altmetrics synonymously with social media. We think social media is a great way of measuring how work is promoted – by the researcher or the publisher.
Citations are a part of the overall impact assessment and are important to consider alongside the other metric types we capture. In addition to traditional citation indexes, PlumX metrics also include clinical, patent and policy citations which are a nice measure of impact in society.
The Plum Print changes dynamically based on each artifact’s metrics. It is like a fingerprint – unique to the specific research output. You can see the color-coding and easily identify which of the 5 categories of metrics has the most or least impact.
Currently, the Plum Print and metrics are only available on the Public Health articles on the SciELO platform. We hope to have the widget throughout SciELO soon.
Clicking on the Plum Print takes you to the PlumX artifact page where you can see the detailed metrics, news posts, tweets, etc.
Currently we provide metrics from over 50 different types of sources, and these are just a few examples of what we are measuring. We have SciELO platform usage and citations.
Some metric sources update in near-real time, others daily or weekly.
We’ve built a PlumX Dashboard with all of the active SciELO articles organized by country and by subject area. You will have access to this dashboard in the very near future. We also have a guide that will walk you through how to use the PlumX Dashboard.
In the dashboard, you can filter by metric type, time period and more, or just sort columns to see which artifacts rise to the top. In this example, I looked at all of the articles published from 2013-2018, filtering only on the citation category. When I sort based on SciELO citations, this “Genes” article rises to the top. Clicking on the article title from the list takes me to the details page.
The artifact details page is where you can see all the metrics collected and any details we have (blog posts, news posts, tweets, Wikipedia references, etc.). When it comes to citation counts, you can see that this article has citation counts from 3 sources – Scopus, SciELO and CrossRef. We know there is duplication among the three sources, so to calculate the overall citation number for the article, we take the largest of the indexes to avoid artificially inflating the count. For the other 4 categories, we add up all of the different sources and counts.
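The de-duplication rule just described can be sketched like this; the counts are invented, and this is my reading of the described behaviour, not PlumX source code:

```python
# Citations from overlapping indexes: take the largest count rather than
# summing, to avoid double-counting the same citations.
def overall_citations(counts_by_index):
    return max(counts_by_index.values(), default=0)

# Usage, Captures, Mentions and Social Media: counts from each source are
# simply summed.
def overall_category(counts_by_source):
    return sum(counts_by_source.values())

citations = {"Scopus": 41, "SciELO": 28, "CrossRef": 39}  # made-up counts
print(overall_citations(citations))                        # → 41
print(overall_category({"Twitter": 12, "Facebook": 5}))    # → 17
```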
Going back to the filtered list and sorting by Scopus this time, a different article rises to the top. However, you’ll notice that the article with the most SciELO citations has the second most Scopus citations.
Here’s the PlumX artifact page for the article with the most overall Scopus citations.
Also in the PlumX Dashboard, in the analytics area, you can benchmark each of the journals in a SciELO country (or subject area) in each of the 5 categories of metrics PlumX tracks. This helps you identify high performing journals, as well as journals that may be struggling. Here, we’re looking at the metrics for Colombian journals from 2013-2018: the journal with the most usage is Revista de Salud Publica, while the journal with the most captures (indicators of future citations) is Universitas Psychologica.
The journal with the most overall mentions (Revista de la Academia Colombiana de Ciencias Exactas, Físicas y Naturales) doesn’t have a lot of total artifacts, but it has a much larger number of mentions compared to other journals. Revista Colombiana de Anestesiología has the most social media mentions for this time period. Wondering what is driving the mentions, I can click on the bar graph to drill into the articles to see….
Wikipedia references are what drove the high mentions in this journal in this timeframe.
Looking at citations, Universitas Psychologica has the most citations of the Colombian journals. It also has the most artifacts.
Exporting these analytics will give you the details for all of the journals in the country – not just the 20 displayed in the report. This is helpful if you want to identify journals that could use some social media or public relations effort.
As I mentioned, you can also see how all of the SciELO journals perform in any subject area using the same Subgroup Overview report. Here we’re looking at the Earth Sciences group. The South African Journal of Science is getting a lot of social media attention and mentions.
When I drill into the metrics, there is a single article driving both mentions and social media.