This document provides an overview of open access and emerging networks of open research. It discusses the open access publishing landscape and growth of open access. It explores what "being open" means, including reducing friction around peer review, metrics, data sharing, and reproducibility. It addresses how to maximize the number of people research can reach by making it more open and managing the transition to open access. Key points include the importance of openness in solving large problems, measuring impact through article-level metrics rather than journal impact factors, and ensuring access to original data and documents.
1. Open Access and emerging
networks of Open Research
UHMLG Spring Forum
London
28 February 2014
Catriona MacCallum
PLOS Advocacy Project Manager
& Consulting Editor (ONE); OASPA BoD
2. The landscape
– The OA publishing landscape, PLOS
– The growth of Open Access
What does ‘Being Open’ mean?
Adapting to the Network
Reducing Friction
– Research Assessment & Peer review
– Article Level Metrics
– Data sharing
– Reporting
– Reproducibility
Maximising N
– How Open Is It?
– Managing the transition to OA
3. More Open Access Policies
Mandates for Access to Research Emerge Across the Globe
Map: www.openaccess.org
4. Open Access Publishing Landscape
OA Publishers
• BioMedCentral (Springer)
• PLOS
• Frontiers
• Hindawi
• Copernicus
• CoAction Publishers
(Humanities)
• OpenBook Publishers
• Ubiquity Press
Subscription Publishers with OA
options
• E.g. Oxford University
Press, Wiley-Blackwell,
Nature Publishing Group
• AAAS and The Royal
Society (last week)
5. About PLOS
PLOS is a nonprofit publisher and advocacy organization
founded to accelerate progress in science and medicine by
leading a transformation in research communication.
• Ten years old
• The largest not-for-profit Open Access publisher (~ 3000
publications per month)
• The publisher of 7 peer-reviewed Open Access journals
• Based in San Francisco, US, and Cambridge, UK
• Business model: Article Processing Charge (waiver system)
• Self-sustaining since late 2010
6. What is Open Access?
Free Availability and Unrestricted Use
Free access – no charge to
access
No embargos – immediately
available
Reuse – Creative Commons
Attribution License (CC BY) to
use with proper attribution
7. PLOS Open Access Journals
• PLOS Biology
works of exceptional significance in all areas of biological science
• PLOS Medicine
leading open access medical journal
‘Flagship’ journals: highly selective, with professional & academic editors (APC $2900)
• PLOS Genetics
outstanding original contributions in all areas of genetics and genomics
• PLOS Computational Biology
new insights into living systems at all scales
• PLOS Pathogens
new ideas that contribute to understanding the biology of pathogens
• PLOS Neglected Tropical Diseases
forgotten diseases affecting the world’s forgotten people
Community journals: highly selective, academic editors only, similar to society journals (APC $2250)
• PLOS ONE
world’s largest scientific journal, covering all of science (APC $1350)
8. Key Innovation:
the editorial process
• Editorial criteria
• Scientifically rigorous
• Ethical
• Properly reported
• Conclusions supported by the data
• Editors and reviewers do not ask
• How important is the work?
• Which is the relevant audience?
• Everything that deserves to be published, will be published
• Therefore the journal is not artificially limited in size
• Use online tools to sort and filter scholarly content after publication, not
before
17. My work can help someone...
http://www.flickr.com/photos/mararie/3313582639/ CC-BY-SA
18. P = (Interest / Friction) × N(help someone)
Neylon C (2013) Architecting the Future of Research Communication: Building the Models and Analytics for an Open Access Future. PLoS Biol 11(10): e1001691. doi:10.1371/journal.pbio.1001691. Published October 22, 2013
19. P = (Interest / Friction) × N(help someone)
• Interest: the proportion that could use your work
• Friction: the usability of your work (friction is its inverse)
• N: the number of people you can reach
20. Someone out there can help...
http://www.flickr.com/photos/mararie/3313582639/ CC-BY-SA
21. P = (Interest / Friction) × N(getting help)
• Interest: the proportion that create work that you can use
• Friction: the ease of contributing (friction is its inverse)
• N: the number of people you can reach
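Neylon’s model on the slides above can be made concrete with a small numeric sketch. Only the P = (Interest / Friction) × N relationship comes from the slides; the interest, friction, and audience figures below are invented for illustration.

```python
# Illustrative sketch of Neylon's P = (Interest / Friction) * N model.
# All numbers below are made-up assumptions for demonstration only.

def expected_outcomes(interest: float, friction: float, n_reached: int) -> float:
    """Expected useful interactions:
    interest  : proportion of the audience that could use the work (0..1)
    friction  : how hard the work is to access and reuse (>= 1; 1 = frictionless)
    n_reached : number of people the work can reach
    """
    return (interest / friction) * n_reached

# A paywalled article reaching few readers vs. an open one reaching many:
closed = expected_outcomes(interest=0.02, friction=5.0, n_reached=10_000)
open_ = expected_outcomes(interest=0.02, friction=1.0, n_reached=1_000_000)
print(closed, open_)  # 40.0 20000.0
```

The point the sketch makes is the slides’ point: opening access raises both N and 1/friction, and the two effects multiply.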
29. A network of literature, data and people
(Figure: documents and databases linked in a network)
30. Torres-Sosa et al, doi:10.1371/journal.pcbi.1002669 Waters et al, doi:10.1371/journal.pone.0040337
http://bit.ly/bosc-cn-fig1 http://bit.ly/bosc-cn-fig2
32. Connection probability (the inverse of friction) vs. network size: at low connection probability there are many small, unconnected networks; at high connection probability, large interconnected networks emerge.
Fig 1 (adapted): Neylon C (2013) doi:10.1371/journal.pbio.1001691. Code: https://gist.github.com/cameronneylon/603336
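The sketch below illustrates the same phase transition with a toy Erdős–Rényi random graph. It is not Neylon’s original code (that is at the gist above), and the node counts and probabilities are arbitrary choices; only the qualitative behaviour matches the figure.

```python
# As connection probability rises (friction falls), many small fragments
# flip into one large connected component. Standard library only, using
# a union-find structure to track components of a random graph.
import random

def largest_component(n: int, p: float, seed: int = 0) -> int:
    """Size of the largest connected component of a random graph
    on n nodes where each possible edge exists with probability p."""
    rng = random.Random(seed)
    parent = list(range(n))

    def find(x: int) -> int:
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                parent[find(i)] = find(j)  # union the two components

    sizes: dict[int, int] = {}
    for i in range(n):
        root = find(i)
        sizes[root] = sizes.get(root, 0) + 1
    return max(sizes.values())

n = 200
sparse = largest_component(n, 0.002)  # below the ~1/n threshold
dense = largest_component(n, 0.05)    # well above it
print(sparse, dense)  # a handful of nodes vs. nearly all 200
```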
36. “Without openness across global digital
networks, it is doubtful that large and complex
problems in areas such as economics, climate
change and health can be solved.”
Martin Hall, Chair of Jisc and vice-chancellor of the University of Salford
The Guardian 18th Feb 2014
“It's time senior leaders made openness – and
its consequences – their concern”
37. Accelerating Science Awards Program (ASAP)
Global Collaboration
to Fight Malaria
Matthew Todd, PhD
Visualizing Complex Science
Daniel Mietchen, PhD, Raphael Wimmer
and Nils Dagsson Moskopp
HIV Self-Test
Empowers Patients
Nitika Pant Pai, MD, MPH, PhD,
Caroline Vadnais, Roni Deli-Houssein
and Sushmita Shivkumar
http://asap.plos.org
38. Adapting to the Network
Reducing friction
• Research Assessment & Peer Review
• Article Level Metrics
• Data
39. Is the communication trail fit for purpose?
Cartoon by Nick Kim (non-commercial reuse; the image must not be altered): http://www.strange-matter.net/screen_res/nz060.jpg
40. Can Scientists Assess Merit or Predict Impact?
• Analysed subjective rankings of papers from two different data sets over five years:
– Faculty of 1000
– Wellcome Trust (data from Allen et al.: rankings by 2 assessors within 6 months of publication)
– in relation to citations and impact factor
Eyre-Walker A, Stoletzki N (2013) The Assessment of Science: The Relative Merits of Post-Publication Review, the Impact Factor, and the Number of Citations. PLoS Biol 11(10): e1001675. doi:10.1371/journal.pbio.1001675
http://www.plosbiology.org/article/info:doi/10.1371/journal.pbio.1001675
41. Subjective assessments of science are poor:
• Very weak correlation between assessors
• Strongly biased by the journal in which the paper was published
Scientists are also poor at predicting future impact:
• Because they are not good at assessing merit
• Similar articles accumulate citations essentially by chance.
“What this paper shows is that whatever merit might be, scientists can't be
doing a good job of evaluating it when they rank the importance or quality of
papers. From the (lack of) correlation among assessor scores, most of the
variation in ranking has to be due to ‘error’ rather than actual quality
differences.”
Carl Bergstrom, 2013
Eisen JA, MacCallum CJ, Neylon C (2013) Expert Failure: Re-evaluating Research Assessment. PLoS Biol 11(10):
e1001677. doi:10.1371/journal.pbio.1001677
42. Can Scientists Assess Merit or Impact?
• The last R.A.E. cost the UK Govt £60 million: what are the assessors adding?
• Multiple assessors don’t make much difference
• The number of citations or the impact factor exaggerates differences between papers
• Assessor bias could affect e.g. the ranking of universities, tenure decisions, etc.
Abandon subjective ratings of articles?
43. New Peer Review and Publication Strategies
• Pre-print servers
– PeerJ PrePrints, arXiv, bioRxiv
• Open peer review (signed and published reviews)
– Copernicus, BMJ Open
• Reviewers know each other’s identities and comment on each other’s
reviews
– eLife
• Reviewers comment on each other’s reviews, but remain anonymous to
each other
– EMBO
• Post-publication assessment
– F1000 Research, Frontiers, PLOS Open Evaluation1, PubMed Commons2
• Independent peer-review services
– Rubriq, Axios, Peerage of Science
1http://www.ploslabs.org/openevaluation/
2http://www.ncbi.nlm.nih.gov/pubmedcommons/
44. Who cares about measuring research impact?
• Researchers (authors and readers)
• Publishers
• Funders
• The public
• Librarians
• Institutions
45. Problems with using journal IF as a measure of article
quality or impact:
• Citation distributions within journals are highly skewed
• Inclusion of highly diverse article types, including both research
articles and reviews
• The IF can be manipulated/gamed by journal editorial policy
• Data used to calculate the IF are neither transparent nor openly available to the public
http://am.ascb.org/dora/
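The first bullet is the core problem: when the citation distribution is highly skewed, the journal-level average that the impact factor reports says little about a typical article in that journal. A minimal sketch with invented citation counts:

```python
# Why a skewed citation distribution makes a journal's average (the basis
# of the impact factor) a poor summary of its typical article.
# The citation counts below are invented for illustration only.
from statistics import mean, median

citations = [0, 0, 1, 1, 2, 2, 3, 4, 5, 182]  # one blockbuster paper

print(mean(citations))    # 20.0 : the impact-factor-style average
print(median(citations))  # 2.0  : what a typical article actually gets
```

A single heavily cited paper drags the average an order of magnitude above the median, which is why article-level assessment is argued for below.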
46. DORA: Declaration on Research Assessment
• A worldwide initiative, spearheaded by the ASCB (American Society for Cell Biology), together with scholarly journals and funders
• Focuses on the need to improve the way in which the outputs of scientific research are evaluated:
– the need to eliminate the use of journal-based metrics, such as Journal Impact Factors, in funding, appointment, and promotion considerations;
– the “need to assess research on its own merits rather than on the basis of the journal in which the research is published”
http://am.ascb.org/dora/
47. So, how can we measure ‘impact’?
http://article-level-metrics.plos.org
A suite of established metrics
measures overall performance
and reach of published research articles
PLOS Article-Level Metrics (2009)
Martin Fenner
(PLOS technical lead)
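As a rough illustration of what an article-level metrics suite aggregates, the sketch below tallies per-article counts from several sources. The source names and numbers are hypothetical; this is not the PLOS ALM API, just the shape of the aggregation.

```python
# Sketch: aggregating article-level metrics (ALM) per DOI from several
# sources. Event names and counts are invented for illustration.
from collections import defaultdict

events = [
    ("10.1371/journal.pbio.1001691", "html_views", 12000),
    ("10.1371/journal.pbio.1001691", "pdf_downloads", 3400),
    ("10.1371/journal.pbio.1001691", "citations", 25),
    ("10.1371/journal.pone.0040337", "html_views", 800),
]

def summarise(events):
    """Roll raw (doi, source, count) events up into per-article totals."""
    totals: dict[str, dict[str, int]] = defaultdict(dict)
    for doi, source, count in events:
        totals[doi][source] = totals[doi].get(source, 0) + count
    return dict(totals)

alm = summarise(events)
print(alm["10.1371/journal.pbio.1001691"]["citations"])  # 25
```

The design point is that each article carries its own profile of usage and citation, rather than inheriting a single journal-wide number.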
53. From: How Does the Availability of Research Data Change With Time Since Publication? Timothy H. Vines and colleagues, Abstract
(podium), Peer Review Congress, 2013
54. Transparency: it’s not just access to data that’s a problem
• Bias (common)
• Misreporting (common)
• Spin (common)
• Misconduct (thought rare)
– Falsification
– Fabrication
– Plagiarism
– Violation of ethical standards
– Other types of misconduct
These problems may occur at all types of journals, OA or otherwise. How can they be addressed?
55. Being open around data
• Raise reporting standards
– CONSORT, ARRIVE (EQUATOR)
• Improve access to original datasets
• Ensure access to historical documents, e.g. protocols: ensure what has been reported can be compared against what was planned
• Incentivise reproducibility of original studies
• Eisen JA, Ganley E, MacCallum CJ (2014) Open Science and
Reporting Animal Studies: Who's Accountable? PLoS Biol 12(1):
e1001757. doi:10.1371/journal.pbio.1001757
• The PLOS Medicine Editors (2013) Better Reporting of Scientific
Studies: Why It Matters. PLoS Med 10(8): e1001504.
doi:10.1371/journal.pmed.1001504
56. Access to original Datasets: PLOS Data Policy (Theo Bloom, PLOS Biology Editorial Director)
PLOS will require a data-sharing statement in all papers:
• Data underlying findings
• Describes where and how data can be accessed
• Restrictions allowed for e.g. patient confidentiality
• Published statement prominent on first page
• Data accessible in a recognized, stable repository
Look for the PLOS Data Policy on March 1, 2014
57. Adapting to the Network
Maximising N
• How Open Is It?
• Managing the transition to OA
58. HowOpenIsIt?
Not all Open Access is created equal
Open Access Spectrum
• Recognizes 6 components that
define Open Access publications
• Defines what makes a journal more
open vs. less open
• Invites informed decisions about
where to publish
A collaboration among:
60. Open Access Census
• Tool to generate reports on the Open
Access status for a set of articles
• Search a database for articles or upload
DOIs
• Determine OA status relevant to
reporting required (by grant, by
institution etc)
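The census workflow described above can be sketched as follows. The DOIs and the status lookup table are stand-in assumptions: the real tool queries article databases, while this sketch only shows the tally step.

```python
# Sketch of the Open Access Census workflow: take a set of DOIs, look up
# each one's OA status, and produce a compliance report. The lookup table
# below is a hypothetical stand-in for a real article-database query.

def census(dois, status_lookup):
    """Tally OA status ('open', 'closed', or 'unknown') for a DOI list."""
    report = {"open": 0, "closed": 0, "unknown": 0}
    for doi in dois:
        report[status_lookup.get(doi, "unknown")] += 1
    return report

known = {
    "10.1371/journal.pbio.1001691": "open",
    "10.1001/example.123": "closed",  # hypothetical DOI
}
print(census(["10.1371/journal.pbio.1001691", "10.1001/example.123",
              "10.9999/missing"], known))
# {'open': 1, 'closed': 1, 'unknown': 1}
```

Grouping the input DOIs by grant or institution before tallying would give the per-funder compliance reports the slide mentions.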
61. Search for Papers belonging to Grant XXX in PubMed
Software Development: Ana Nelson
Data compiled by Cameron Neylon, 2014
62. Managing the transition
• Minimise costs
– Price transparency
– Avoid replacing big ‘subscription deals’ with big ‘APC’ deals
– Discourage double dipping
– A mixture of repositories and OA journals
• Foster Competition
– Effective markets, differentiated products, differentiated business
models
• Effective Collaboration
– OA at the heart of policy making
– Working for a coherent global policy agenda
– Monitoring Compliance and Reporting
– Interoperability between platforms
63. How do you stay afloat?
http://www.flickr.com/photos/nationallibrarynz_commons/4078337883 Public Domain
76. …to discover what we can’t yet imagine
NASA, ESA, and the Hubble SM4 ERO Team: http://hubblesite.org/newscenter/archive/releases/2009/25/image/f/format/large_web/