Ciaran O'Neill & Amye Kenall: Peering into review - Innovation, credit & reproducibility. Talk 1 in the "What Bioinformaticians need to know about digital publishing beyond the PDF2" workshop at ISMB 2014, Boston, 16th July 2014
“our goal is unapologetically ambitious:
to establish a new system of peer review
to bolster productive scientific debate
and to provide scientists with useful guides to the literature”
Launch Editorial: Eugene Koonin, David Lipman, Laura Landweber
1. Ioannidis JPA et al. (2009) Repeatability of published microarray gene expression analyses. Nature Genetics 41: 14
2. Ioannidis JPA (2005) Why Most Published Research Findings Are False. PLoS Med 2(8)
Out of 18 microarray papers, results from 10 could not be reproduced
• Repository of standardised and annotated multielectrode array data from mice and ferrets
• 366 recordings from 12 studies
• Authors submitted in knitr
• Aided the review process by allowing reviewers to rerun the analyses
• Authors reported it saved time, giving them a “natural record” of what they did
• Automatic updating of text you might otherwise overlook (e.g. figure legends)
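The “automatic updating” mentioned above is what literate programming with knitr provides: numbers and figure legends are computed from the data each time the document is compiled, so they cannot drift out of sync with the analysis. A minimal sketch of an R Markdown/knitr document (the file name, variables, and caption are hypothetical illustrations, not from the study described in the talk):

````
---
title: "Multielectrode array analysis"
output: html_document
---

```{r load-data}
# Hypothetical data file, for illustration only
recordings <- read.csv("recordings.csv")
n_rec <- nrow(recordings)
```

We analysed `r n_rec` recordings in total.

```{r spike-plot, fig.cap=paste("Firing rates across all", n_rec, "recordings.")}
# The figure legend above is generated from the data,
# so it updates automatically if the dataset changes
hist(recordings$firing_rate, main = "", xlab = "Firing rate (Hz)")
```
````

If a reviewer reruns the document against a corrected dataset, the inline count and the figure caption regenerate with it, which is the property that let reviewers rerun and even modify the analyses described on this slide.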
Some testimonials for Knitr
Authors (Wolfgang Huber)
“I do all my projects in Knitr. Having the textual
explanation, the associated code and the results all in one
place really increases productivity, and helps explaining my
analyses to colleagues, or even just to my future self.”
Reviewers (Christophe Pouzat)
“It took me a couple of hours to get the data, the few custom-developed routines, the “vignette”, and to REPRODUCE EXACTLY the analysis presented in the manuscript. With a few more hours, I was able to modify the authors’ code to change their Fig. 4. In addition to making the presented research trustworthy, the reproducible research paradigm definitely makes the reviewer’s job much more fun!”