This presentation was provided by William Gunn of Elsevier during the NISO Virtual Conference, Advancing Altmetrics, held on Wednesday, December 13, 2017.
7. You have to be strategic
• Choose the best people
• Don’t spread it too thin
• Coordinate expertise to review
• Not too narrow either
• Advance your mission
• Insulate yourself from accusations of mismanagement, bias, & favoritism.
https://twitter.com/Kingwole/status/826485113731047426
24. Using source data 1996–present
• ~900 million citing-cited pairs
• ~40 million source docs
• ~35 million cited non-indexed docs
Calculated relatedness for 900 million pairs
Result: ~96,000 topics
25. Prominence combines 3 metrics to indicate the momentum of the topic
• Citation Count in year n to papers published in n and n-1
• Scopus Views Count in year n to papers published in n and n-1
• Average CiteScore for year n
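One way to combine three metrics on different scales is to standardize each across topics and sum the results. The sketch below does exactly that; the equal, unweighted sum and the sample topic values are assumptions for illustration, since the slide states only that the three metrics are combined.

```python
# Illustrative sketch of a Prominence-style composite: z-score each of
# the three inputs across topics, then sum them. Equal weighting is an
# assumption; the slide does not specify the actual weights.
from statistics import mean, pstdev

def zscores(values):
    # Standardize a list of values to mean 0, population std dev 1.
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma if sigma else 0.0 for v in values]

def prominence(citations, views, citescores):
    # Sum the per-topic z-scores of the three momentum indicators.
    zc, zv, zs = zscores(citations), zscores(views), zscores(citescores)
    return [c + v + s for c, v, s in zip(zc, zv, zs)]

# Three hypothetical topics: recent citations, Scopus views, CiteScore.
scores = prominence([120, 40, 10], [900, 300, 50], [4.2, 2.1, 1.0])
```

Standardizing first keeps any one metric (e.g. raw view counts, which are much larger numbers) from dominating the composite.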
26. Grant data (255k grants, $165 billion, 6 years) from the U.S. STAR METRICS database were assigned to topics using textual similarity
• Limited to NIH and NSF for this study
• STAR METRICS data are publicly available
Prominence explains 37% of the variance
Prominence + Funding (2008-2010) together explain 71% of the variance
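"Variance explained" here is the R² of a regression. The sketch below shows how that comparison works: fit ordinary least squares once with prominence alone, then with prominence plus prior funding, and compare R². The data are synthetic (random, with assumed coefficients); only the R² computation mirrors the analysis the slide describes.

```python
# Illustrative sketch of "variance explained" (R^2) with and without a
# second predictor, using synthetic data. The generated coefficients and
# sample size are assumptions, not the study's actual data.
import numpy as np

def r_squared(X, y):
    # Ordinary least squares with an intercept; return R^2.
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.dot(resid) / ((y - y.mean()).dot(y - y.mean()))

rng = np.random.default_rng(0)
prominence = rng.normal(size=200)
prior_funding = rng.normal(size=200)
# Synthetic outcome depending on both predictors plus noise.
future_funding = (0.6 * prominence + 0.8 * prior_funding
                  + rng.normal(scale=0.5, size=200))

r2_one = r_squared(prominence.reshape(-1, 1), future_funding)
r2_two = r_squared(np.column_stack([prominence, prior_funding]),
                   future_funding)
# Adding prior funding raises R^2, as on the slide (37% -> 71%).
```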
29. Reproducibility Issues
Re-tested 70+ drugs from 221 independent studies 1
➜ 0 reproduced
➜ Minocycline: effective in four separate ALS mouse studies, but worsened symptoms in a clinical trial of more than 400 patients 2
Sponsored replication of 12 spinal cord injury studies
➜ 2/12 fully reproduced 3
Conducted in-house target validation studies
➜ 14/67 reproduced 4
Attempted to reproduce 53 “landmark” oncology publications
➜ 6/53 reproduced 5
1. Scott et al. Amyotroph Lateral Scler. 9, 4-15 (2008).
2. Gordon et al. Lancet Neurol. 6, 1045-1053 (2007).
3. Stuart et al. Experimental Neurology 233, 597-605 (2012).
4. Prinz et al. Nat Rev Drug Discov. 10, 712 (2011).
5. Begley and Ellis. Nature 483, 531-533 (2012).
32. Measurements: citations, number of publications, co-authors, institutional stature, social media activity
Phenomenon: a researcher becomes highly influential, winning grants and influencing policy.
34. Different metrics for different scholarly outputs
“We don't sign a first baseman on the speed of his fastball, nor should we evaluate every scholar, journal, or department based solely on their citation scores. Whether it's teaching, outreach, readership, provoking discussion, sharing software and data, or providing great feedback, altmetrics help us pay attention to … these important areas.” – Jason Priem, ImpactStory