A presentation by Rachel Bruce, director of open science and research lifecycle at Jisc, and Matthew Spitzer, community manager at the Center for Open Science (COS).
1. Improving Transparency, Integrity, and
Reproducibility of Scholarly Research
A 501(c)(3) non-profit
Matthew Spitzer
Center for Open Science
http://cos.io/
@OSFramework
@matthewspitzer
17. Put data, materials, and code on the OSF
Check out “Fish Guy’s” Story: http://www.wired.com/2016/01/print-an-army-of-giant-articulated-fish-from-this-3-d-database/
Make it Easy…
20. https://osf.io/preprints
A community-driven approach to
scholarly communication requires
custom services that represent the
interests of the community
Any group can launch and manage a
fully functional service for their
community, including use of
• custom domains
• custom taxonomies
• brand identity
• editorial guidelines
22. https://osf.io/preprints/discover
But also search across multiple preprint
repositories
Powered by SHARE (share.osf.io), OSF
Preprints aggregates search across local and
external preprint services
Currently over 2M preprint records available
12,000+ preprints currently hosted directly on the OSF
across 20+ services
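As a sketch of what this aggregated search makes possible, the snippet below parses a JSON:API-style preprint listing of the kind the OSF serves. Note the response shape, field names, and sample records here are assumptions for illustration only, not the documented OSF schema.

```python
import json

# Hypothetical sample of a JSON:API-style preprint listing.
# The field names and records are illustrative assumptions,
# not actual OSF API output.
SAMPLE_RESPONSE = json.dumps({
    "data": [
        {"id": "abc12", "attributes": {"title": "Example preprint",
                                       "date_published": "2016-01-15"}},
        {"id": "def34", "attributes": {"title": "Another preprint",
                                       "date_published": "2016-02-02"}},
    ],
    "links": {"meta": {"total": 2}},
})

def list_preprint_titles(raw_json):
    """Extract (id, title) pairs from a JSON:API-style listing."""
    payload = json.loads(raw_json)
    return [(item["id"], item["attributes"]["title"])
            for item in payload.get("data", [])]

titles = list_preprint_titles(SAMPLE_RESPONSE)
for pid, title in titles:
    print(pid, title)
```

In a real client the JSON would come from an HTTP request to the aggregating service rather than an inline sample, but the parsing step would look much the same.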
23. WHAT IF?
Funders incentivized open science?
Institutions recognized open science contributions?
Journals encouraged/supported discovery of open
science outputs?
Make it Rewarding & Required…
24. Transparency and Openness Promotion (TOP) Guidelines
https://cos.io/our-services/top-guidelines/
1. Data citation
2. Design transparency
3. Research materials transparency
4. Data transparency
5. Analytic methods (code)
transparency
6. Preregistration of studies
7. Preregistration of analysis plans
8. Replication
Signatories
5,000+ Journals
60+ Organizations
Open Practices Badges
Making Behaviors Visible Promotes Adoption
https://cos.io/our-services/open-science-badges/
Make it Rewarding & Required…
25. Where should I publish to get recognized for my open contributions?
• Add Data and Materials
sharing policies
• Add corresponding or partner
preprint repositories
• Recognize moderated
preprint repositories for REF
requirements?
Make it Rewarding…
27. 1. Suggest the OSF to research teams that need solutions
for research collaboration and sharing
2. Explore using OSF Institutions to improve visibility,
preservation, and curation at your institution
3. Share or post preprints and post-prints http://osf.io/preprints/
4. Email editors to say that their policies matter and
encourage them to implement TOP cos.io/top
https://osf.io/x695s/
matt.spitzer@cos.io
Editor's notes
Non-profit in Charlottesville, VA
Our mission…
But what we really do is build public-goods infrastructure to enable these behaviors by researchers, to actualize them as values within communities of researchers, and to incentivize them as policies among stakeholders in the research and scholarly communications process.
Entirely grant and foundation funded
Examples from:
Button et al – Neuroscience
Ioannidis – why most results are false (Medicine)
GWAS
Biology
Two possibilities are that the percentage of positive results is inflated because negative results are much less likely to be published, and that we are pursuing our analysis freedoms to produce positive results that are not really there. These would lead to an inflation of false-positive results in the published literature.
Some evidence from bio-medical research suggests that this is occurring. Two different industrial laboratories attempted to replicate 40 or 50 basic science studies that showed positive evidence for markers for new cancer treatments or other issues in medicine. They did not select at random. Instead, they picked studies considered landmark findings. The success rates for replication were about 25% in one study and about 10% in the other. Further, some of the findings they could not replicate had spurred large literatures of hundreds of articles following up on the finding and its implications, but never having tested whether the evidence for the original finding was solid. This is a massive waste of resources.
Across the sciences, evidence like this has spurred lots of discussion and proposed actions to improve research efficiency and avoid the massive waste of resources linked to erroneous results getting in and staying in the literature, and about the culture of scientific practices that is rewarding publishing, perhaps at the expense of knowledge building. There have been a variety of suggestions for what to do. For example, the Nature article on the right suggests that publishing standards should be increased for basic science research.
[Under current incentives it is not in my interest to replicate, whether my own work or others', even though replication evaluates validity and improves the precision of effect estimates. Replication is worth next to zero (see Makel's data on published replications): authors are motivated not to call their work replication, novelty is supreme, and there is effectively zero error checking. It is not in my interest to check my work, and not in your interest to check my work; we each just do our own thing and get rewarded for that.]
Irreproducible results will get in and stay in the literature (examples from bio-med). Prinz and Begley articles (make sure to summarize accurately)
The Nature article by folks in bio-medicine is great. The solution they offer is a popular one in commentators from the other sciences -- raise publishing standards.
There’s more to it than sharing of discrete objects. Think about using this as an opportunity to increase transparency by capturing the entire workflow, and to do so while connecting tools and services that make up the parts of the workflow, not requiring people to change all of their practices at once, and providing immediate efficiencies and value to the researcher AS they comply with requirements.
Easy, right? Obviously not.
Switching costs
The features I’ve noted are nice and important for improving openness and reproducibility, but there are many other tools and services that researchers use as well. Our strategy is to accept that many tools are used and to connect them through the OSF.
By connecting the workflow, we can all provide researchers with significant gains in automation, efficiency, and reproducibility. Immediate value from this reinforces more changes.
Quite simply, it is a file repository that allows any sort of file to be stored, and many types to be rendered in the browser without any special software. This is very important for increasing accessibility to research.
Supporting these behavioral changes requires improving the full scientific ecosystem.
At a conference like CNI, there are many people in the room contributing important parts to this ecosystem. I hope you leave this talk seeing the potential for how we might be able to work together on connecting tools to provide for better transparency and reproducibility in the workflow.