This is an updated version of an invited talk I presented at the European Research Council, Brussels (Scientific Seminar): "Love for Science or 'academic prostitution'?".
It has been updated for presentation at the Spanish and Portuguese Relativity Meeting (EREP) on 6th July 2019.
I have included new slides and revised others.
I present a personal revision (sometimes my own vision) of some issues that I consider key for doing Science. At the time it was tailored to the expected audience, mainly Scientific Officers with backgrounds in different fields of science and scholarship, but also Agency staff.
Abstract: In a recent Special Issue of Nature on science metrics it was claimed that "Research reverts to a kind of 'academic prostitution' in which work is done to please editors and referees rather than to further knowledge." If this is true, funding agencies should try to avoid falling into the trap of their own system. By perpetuating this 'prostitution' they risk funding not the best research but the best-sold research.
In the current epoch of economic crisis, where in the quest for funds researchers are forced into a competitive game of pandering to panelists, it seems a good time for deep reflection on the entire scientific system.
With this talk I aim to provoke extra critical thinking among the committees who select evaluators, and among the evaluators, who in turn demand critical thinking from the candidates when selecting excellent science.
I present some initiatives (e.g. new tracers of impact for the Web era, 'altmetrics') and ongoing projects (e.g. how to move from publishing advertising to publishing knowledge) that might enable us to favor Science over marketing.
Love for science or Academic prostitution, 2019 update
1. Love for Science
or
Academic ‘Prostitution’?
Lourdes Verdes-Montenegro
Instituto de Astrofísica de Andalucía (CSIC)
@mundos55
EREP, 6th July 2019, Valencia
2. NGC 5216: Keenan's System by Winder/Hager
Reinvent?
Environment and galaxies: Large sample
Different wavelengths
Efficient search of data
Can’t reproduce!
Analysis tools
What should we publish?
Sharing Research
MOTIVATION
4. MOTIVATION
2013
Research reverts to a kind of 'academic
prostitution', in which work is done to
please editors and referees rather than to
further knowledge.
June 2010
5. The fine art of salami publishing
Outline
• Any problem with the Scientific Method?
• Metrics of research vs Open Science
• Economy?
• Academic “prostitution”
• Alternatives
• What’s next and conclusions
7. ONLY SCIENCE
We seem to separate:
- excellent science
- quality science
- science that we can trust
- open science
- etc
Science = Scientific Method = Reproducible = Open
8. ANY PROBLEM WITH THE SCIENTIFIC METHOD??
Scientific Reproducibility is a fundamental principle of the
Scientific Method, a process established in the 17th century that
marked the beginning of modern science and laid the foundations
for the Philosophy of Science.
We all agree, “reproducibility is great!”, right?
Then… What is the problem?
9. ANY PROBLEM WITH THE SCIENTIFIC METHOD??
Paradoxically:
• part of the scientific community claims that reproducibility is already achieved (a section describing their methods in their papers, shared data, and most recently notebooks covering part of the analysis)
• the remainder mostly consider it a utopia
10. Questionnaire on reproducibility (1,500 scientists)
• 70% of researchers have tried and failed to reproduce another scientist's experiments
• > 50% have failed to reproduce their own experiments!
• Chemistry: 90% (60%)
• Biology: 80% (60%)
• Physics and engineering: 70% (50%)
• Medicine: 70% (60%)
• Earth & environmental science: 60% (40%)
ANY PROBLEM WITH THE SCIENTIFIC METHOD??
Actually, YES!
CHALLENGES IN
IRREPRODUCIBLE RESEARCH
2016
11. Questionnaire on reproducibility (1,500 scientists)
Aha! So you don’t empathise?
12. Questionnaire on reproducibility (1,500 scientists)
Maybe with this?
14. CURRENT METRICS = OPEN
... “Science is being killed by numerical ranking” [...] Ranking systems lure scientists into pursuing high rankings first and good science second
Productivity seems to prevail over Discovery
June 2010
15. YES, WE ARE SENSITIVE TO RANKINGS
In January 2014, the journal Psychological Science (PSCI)
introduced badges for articles with open data.
16. Citations:
• Simple way to denote influence
• Hard to compare between fields or career stages
Impact factor:
• In 2005, 89% of Nature’s impact factor was
generated by 25% of the articles
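That 89%/25% skew is easy to see with toy numbers: an impact factor is essentially a mean over citation counts, so a small, heavily cited fraction of articles can dominate it. A minimal sketch with hypothetical counts (not Nature's actual data):

```python
# Hypothetical journal: 100 articles, a few heavily cited ones.
citations = [200] * 5 + [50] * 20 + [2] * 75   # 100 articles

impact_factor = sum(citations) / len(citations)     # mean citations per article
top_quarter = sorted(citations, reverse=True)[:25]  # best-cited 25% of articles
share = sum(top_quarter) / sum(citations)           # their share of all citations

print(f"impact factor: {impact_factor:.1f}")                       # 21.5
print(f"share of citations from top 25% of articles: {share:.0%}") # 93%
```

A mean over such a skewed distribution says little about the typical article, which is the point of the slide.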
17. Reputation and Impact in Academic Careers (Petersen et al. 2014, PNAS 111 (43))
Goal: to better understand the role of social ties, author reputation, and the citation life cycle of individual papers
• author reputation dominates in the initial phase of a paper's citation life cycle --> papers gain a significant early citation advantage if written by authors who already have high reputations in the scientific community.
CITATIONS
Is peer review any good? (Casati et al 2009)
•Rankings of the review process vs impact (citations):
Very little correlation
•Peer review filters out papers that are most likely to have impact:
Not confirmed
18. Exploring and Understanding Scientific Metrics in Citation Networks (Krapivin et al. 2009)
[Figure: citation counts vs. PaperRank]
CITATIONS
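Krapivin et al. contrast raw citation counts with a PageRank-style "PaperRank", in which a citation from a highly ranked paper counts for more than one from an obscure paper. A minimal power-iteration sketch over a toy citation graph (papers A-D and the function name are illustrative, not the authors' actual algorithm or data):

```python
def paper_rank(cites, damping=0.85, iters=50):
    """Toy PageRank over a citation graph: cites[p] lists the papers p cites."""
    papers = list(cites)
    n = len(papers)
    rank = {p: 1.0 / n for p in papers}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in papers}
        for p, targets in cites.items():
            if targets:
                # a paper passes its rank on to the papers it cites
                share = damping * rank[p] / len(targets)
                for t in targets:
                    new[t] += share
            else:
                # papers citing nothing spread their rank evenly
                for t in papers:
                    new[t] += damping * rank[p] / n
        rank = new
    return rank

# Toy graph: A and B both cite C; C cites D.
ranks = paper_rank({"A": ["C"], "B": ["C"], "C": ["D"], "D": []})
```

Here D is cited only once, yet it ends up outranking C (cited twice), because its single citation comes from the highest-ranked paper; this is exactly the kind of divergence between citation counts and PaperRank the study explores.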
19. CITATIONS
“Remains of Holocene giant pandas from Jiangdong Mountain (Yunnan,
China) and their relevance to the evolution of quaternary
environments in south-western China”
(by Jablonski et al. and published in Historical Biology)
“A quick look at the actual conversations about the paper reveals that it was Figure 7, not the research content of the paper, that attracted all of the attention”
Jean Liu, 2013, Who loves Pandas?
20. PLOS (Public Library of Science) (November 2012)
Richard Cave at the Charleston Conference 2012, Charleston
Citations represent less than 1% of usage for an
article.
CITATIONS
21. ... an author's h-index can reflect longevity as much as
quality — and can never go down with age, even if a
researcher drops out of science altogether.
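The quoted property follows directly from the definition: a researcher has index h if h of their papers have at least h citations each, so inactive years can never lower it. A minimal sketch (hypothetical citation counts):

```python
def h_index(citations):
    """h = largest h such that h papers have >= h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cites, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# Hypothetical career: dropping out of science adds nothing, but removes nothing.
early = [10, 8, 5, 4, 3]       # h = 4
late = early + [0, 0, 0]       # years without cited output: h stays 4
print(h_index(early), h_index(late))  # 4 4
```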
24. ECONOMY?
“What has economics to do with science? Economics is about understanding how human beings behave when one or more resources are scarce”
Blog by M. Nielsen, 2008
People are pushed to apply for grants
25. Examples of advice to improve the chances of getting a grant:
• the title of the project counts for 50%
• proposals circulated at the home institution
OK, that sounds fun, but what does it reflect?
“Evaluators don’t have time to read proposals in detail”
“Evaluators are not experts, so if your whole institute can follow it and find it attractive, a typical evaluator will”
ECONOMY?
26. R. Brooks (Univ. New South Wales)
The economy has a bad influence on:
• Candidates: pushed to get funds
• Funders: expensive to engage enough experts for enough time
hence on Science
ECONOMY?
27. • Senior advice to young scientists: go to the most prestigious journal
“OPTING FOR OPEN ACCESS MEANS CONSIDERING COSTS, JOURNAL
PRESTIGE AND CAREER IMPLICATIONS”
STEPHEN PINCOCK, 2013. NATURE, 495, 539
ECONOMY?
29. Evaluator of yearly review of FP7
EC STREP project:
“There are people who are
paying other researchers to get
their papers cited, so as to
increase their h-index”
30. Marketing for Scientists is a Facebook group, a blog, a workshop, and
a book published by Island Press, meant to help scientists build the
careers they want and restore science to its proper place in society.
Sometimes, unlocking the mysteries of the universe just isn't enough.
Marketing
34. •Science works through micro improvements and multiple errors
and failures until something finally works
•We’ve become paralyzed with the notion that showing incremental
improvements and corrections hurts, rather than helps, our personal
careers and science.
Who Killed the PrePrint, and Could It Make a Return?
By Jason Hoyt and Peter Binfield
CAVEATS
Can excellence kill Science?
Such metrics further block innovation because they encourage
scientists to work in areas of science that are already highly populated,
as it is only in these fields that large numbers of scientists can be
expected to reference one’s work, no matter how outstanding.
Science Editorial, 17 May 2013, by Bruce Alberts, Science's Editor-in-Chief
37. NEW PUBLICATION METHODS
Authority and expertise are central in the Web era as they were in the journal era. The difference is that whereas the paper-based system used subjective criteria to identify authoritative voices, the Web-based one assesses authority recursively from the entire community.
J. PRIEM, 2013. NATURE, 495, 437
38. Reputation:
a (very) rough measurement of how much the MathOverflow community trusts you.
It is never given, only earned by convincing other users that you know what you're talking about.
• good question or helpful answer, voted up by peers: +10 points
• off topic or incorrect, voted down: -2 points
NEW PUBLICATION METHODS
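The voting rule above amounts to a simple tally. A toy sketch (the real MathOverflow/Stack Exchange system has more event types, caps and privileges; only the +10/-2 values are taken from the slide):

```python
def reputation(events):
    """Tally reputation from a list of 'up' (+10) and 'down' (-2) votes."""
    points = {"up": 10, "down": -2}
    return sum(points[e] for e in events)

# A helpful answer with three upvotes and one downvote:
print(reputation(["up", "up", "up", "down"]))  # 28
```

The asymmetry (+10 vs -2) means the score rewards contribution far more than it punishes error, which fits the slide's point that reputation is earned, never given.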
39. A blog that reports on retractions of scientific papers
(Ivan Oransky - executive editor of Reuters Health - and Adam Marcus
- managing editor of Anesthesiology News)
Aim: to increase the transparency of the retraction process:
retractions of papers generally are not announced, and the reasons for
retractions are not publicized.
40. Open access, peer-reviewed, promotes discussion of results:
•unexpected, controversial, provocative and/or negative
•that challenge current models, tenets or dogmas.
•illustrate how commonly used methods and techniques are unsuitable
for studying a particular phenomenon.
Not all will turn out to be of such groundbreaking significance.
However, we strongly believe that such "negative" observations and
conclusions, based on rigorous experimentation and thorough
documentation, ought to be published in order to be discussed,
confirmed or refuted by others.
42. Euroscience feedback on Open access and Plan S (Feb 2019)
a non-profit grassroots association of researchers in Europe that, since 1997, has been active in shaping policies for science, technology and innovation (STI)
OPEN ACCESS
•Supporter of Open Access and of the principles of Plan S.
•Proposed implementation speed is doubtful
• Implementation scenarios are likely to differ from field to field
43. OPEN ACCESS
•Evaluation practices for early stage researchers need to change: almost
completely determined by publishing substantial numbers of articles in journals
with a high impact factor
• Challenge for both funders and research organisations to meet in parallel
to changing the publication models
• Evaluation needs to take into account all the considerable work necessary
to enable useful sharing of publications and data, such as documenting
usable metadata and making the data FAIR
This will require time and pilots
45. It is NOT a "release early instead of peer review" model.
Treat research as software: release notes & version management
46. Abelard and Héloise: Why Data and Publications
Belong Together
Eefke Smit (International Association of STM Publishers: members
collectively publish nearly 66% of all journal articles) D-Lib Magazine, 2011
•Journals to require availability of underlying research material as an
editorial policy
•Ensure data is stored, curated and preserved in trustworthy places
•Ensure links (bi-directional) and persistent identifiers between data
and publications
• Establish uniform citation practices for data
BEYOND THE PDF
+ and methods (software, etc, see later)
47. BEYOND THE PDF
• Knowledge Burying in paper publication
(S. Bechhofer 2011, Research Objects:Towards Exchange and Reuse of Digital Knowledge)
- Publishing/mining cycle results in loss of knowledge
>= 40% of information lost
- RIP: Rest In Paper
http://www.clipartkid.com/rip-cliparts/
48. • “The academic paper is now obsolescent...
... as the fundamental sharable description of a piece of research. In the
future we will be sharing some other form of scholarly artefact,
something which is digital and designed for reuse and to drop easily
into the tooling of e-Research, [...]
These could be called Knowledge Objects or Publication Objects or
whatever: I shall refer to them as Research Objects, because they
capture research”
(De Roure 2009, Director of Oxford’s e-Research Centre)
BEYOND THE PDF
Moving from narratives (last 300 yrs)
to the actual output of research
49. Wf4Ever (Workflows forever) project
Astronomy Use Case
Preservation of the methods
•Investigates and develops technological infrastructure for the
preservation and efficient retrieval and reuse of scientific
workflows
•Introduced the concept of a Research Object, containing the
artefacts needed to interpret or reconstruct research
EU FUNDED FP7 STREP PROJECT
DECEMBER 2010 – DECEMBER 2013
50. Lifecycle of an experiment
[Diagram: define problem → design → inspection → modify → REUSE]
Research Object: software +
• documentation,
• input and output examples,
• annotations (human/machine readable)
• metadata (both for data & methods): software version, configuration parameters, execution environment, description of the main steps in the methods, etc.
Find for re-purpose
Expose the methodology
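The Research Object contents listed on this slide can be pictured as a manifest bundled with the software, which tooling can check for completeness. A minimal sketch (field names and file names are illustrative, not the Wf4Ever RO specification):

```python
# Illustrative manifest for a Research Object: the fields mirror the
# slide's list, not any formal RO specification.
REQUIRED = {"software", "documentation", "examples", "annotations", "metadata"}

research_object = {
    "software": "gas_deficiency_pipeline.py",  # hypothetical tool name
    "documentation": "README.md",
    "examples": {"input": "sample_cube.fits", "output": "moment0.fits"},
    "annotations": ["human-readable summary", "machine-readable RDF"],
    "metadata": {
        "software_version": "1.2.0",
        "configuration": {"clip_sigma": 3.0},
        "execution_environment": "python 3.10, astropy 5.2",
        "method_steps": ["calibrate", "image", "measure"],
    },
}

def missing_fields(ro):
    """Return the required artefacts a Research Object still lacks."""
    return sorted(REQUIRED - set(ro))

print(missing_fields(research_object))  # []
```

A check like this is what makes the bundle interpretable and reconstructable by someone other than the original author, which is the point of the Research Object concept.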
54. Open Science** implementation will facilitate sharing data, resources and tools across the community. The methods can be verified, reused, repurposed, so accelerating discovery and transfer of knowledge
BARRIERS TO REPRODUCIBLE RESEARCH
Bottom-up
Scientists = we want to follow the Scientific method
** Open Science is transparent and accessible knowledge that is shared and developed
through collaborative networks.
55. BARRIERS TO REPRODUCIBLE RESEARCH
It is not simple
•Evaluation practices for early stage researchers need to change: almost
completely determined by publishing substantial numbers of articles in journals
with a high impact factor, etc
Euroscience feedback on Open access and Plan S (Feb 2019)
56. PERSPECTIVES
@mundos55
Implementation of Open, reproducible science is challenging, even more so in this new framework:
NEW ROLES, NEW PERSPECTIVES
Individual users
Large international teams
Publishers
Evaluators / Funding agencies
Service providers (Data/science centres)
57. PERSPECTIVES
@mundos55
Implementation of Open, reproducible science is challenging, even more so in this new framework:
NEW ROLES, NEW PERSPECTIVES
Individual users
Large international teams
Publishers
Evaluators / Funding agencies
This is about reuse and trust
This is about metrics of science
This is about many things
Service providers (Data/science centres)
This is about technology
58. Publishers
• Will we need different profiles of referees to evaluate the scientific discussion together with the data quality and the methods (a.k.a. reproducibility)?
• If the data and the methods (tools) are in Science/Data Centres, will our referees need to become "users" of the Data Centres to be able to validate a paper?
• Will we be able to engage as many referees as may be needed?
• Will we need to validate the data, the tools, and the scientific analysis separately?
The challenge of going “beyond
the PDF”
PERSPECTIVES
59. The misuse of the journal impact factor is highly destructive, inviting a gaming of the metric
that can bias journals against publishing important papers in fields [...] that are much less cited
than others. And it wastes the time of scientists by overloading highly cited journals such as
Science with inappropriate submissions from researchers who are desperate to gain points
from their evaluators.
Editorial, Science 2013
BOTTOM-UP
• The Metric Tide, the Leiden Manifesto, etc.
60. ALTMETRICS is the creation and study of new metrics based on the Social Web for analyzing and informing scholarship.
Bulletin of the American Society for Information Science and Technology 39, 4 (2013)
61. •In the Web era, scholarship leaves footprints.
•The flow of scholarly information is expanding by orders of magnitude,
swamping our paper-based filtering system
J. PRIEM, 2013. NATURE, 495, 437
ALTMETRICS is the creation and study of new metrics based on the Social Web for analyzing and informing scholarship.
63. Indicators for funding bodies of recent research (a large number of downloads, views, plays...): how open and accessible scientists are making their research
Strongly recommend altmetrics be considered not as a replacement for, but as a supplement to, careful expert evaluation: to highlight research products that might otherwise go unnoticed
Alternative metrics are thought to free researchers from conventional measures of prestige
STEPHEN PINCOCK, 2013. NATURE, 495, 539
Bulletin of the American Society for Information Science and Technology 39, 4 (2013)
ALTMETRICS
64. Head of digital services at
the Wellcome Trust Library
(one of the world's major resources
for the study of medical history):
Policies of the UK programme for
assessing research quality, the
Research Excellence Framework:
no grant-review sub-panel “will
make any use of journal impact
factors, rankings, lists or the
perceived standing of publishers in
assessing the quality of research
outputs”
TOP-DOWN
65. • How to measure science output:
• data in any format (tables, images, etc)
• algorithms
• analysis tools
• NSF example:
• Chapter II.C.2.f(i)(c), Biographical
Sketch(es), has been revised to rename
the “Publications” section to “Products”
and amend terminology and instructions
accordingly. This change makes clear that
products may include, but are not limited
to, publications, data sets, software,
patents, and copyrights.
•To make it count, however, it needs to
be both citable and accessible.
http://datapub.cdlib.org
TOP-DOWN
66. Not just citation of articles: various forms of social media shares, web downloads, and any other measure of the quality and impact of research outcomes
Thematic Reports:
•Types
•Use in the context of Open Science
•Incentives and Rewards
•Strategies, Experiences and Models
•Final Report
March 2017
April 2018
TOP-DOWN
67. Policy makers / funding agencies
• How to measure reproducibility?
• How to weight it and/or aggregate with other indicators?
• Is it affordable / sustainable?
Reproducibility as a key element of
Open Science
PERSPECTIVES
69. Total collecting area equivalent to 1 square kilometre
• Thousands of antennas with different technologies (frequency range)
• Separated by thousands of kilometres (finer detail)
• Omnidirectional antennas and software pointing (surveys, transients)
THE SQUARE KILOMETRE ARRAY
3 sites = 2 telescopes + HQ; 1 Observatory
SKA1-low = Australia: 50 - 350 MHz, baselines 65 km
SKA1-mid = South Africa: 350 MHz - 14 GHz, baselines 150 km
70. SKA will be the largest public research data project
• Massive data transport, storage and high-speed computing
• Power challenge: the central region consumes as much as a city of 100,000 inhabitants
Dr. Fabiola Gianotti, CERN DG & Prof. Philip
Diamond, SKA DG 14th July 2017
SKA and CERN
Cooperative Agreement
The Challenge: Extraction of scientific knowledge
•Data products not in the final state for science analysis
•Direct delivery to end users is unfeasible
•International distributed scientific teams
SKA GOES BEYOND ASTRONOMY
SKA Regional Centres (SRCs), to be accredited by the SKA Observatory
73. INTERNATIONAL INITIATIVES DEVELOPING SRCS
Material developed for AENEAS, funded by EC Horizon 2020 grant 731016
• SKA pathfinder/precursor telescopes: crucial for preparatory work
•SKA Regional Centre Coordination Group (2017-2019)
• SKA Regional Centre Steering Committee (2019- )
ERIDANUS in Asia-Pacific
IDIA in South Africa
CIRADA in Canada
AENEAS in Europe
74. EuroScience Open Forum (ESOF) July 2018:
● Session proposed to SKA Office for “Theme #3 Science policy and transformation of
research practice“, focused on reproducible science and new metrics in the era of
Megascience infrastructures, accepted by SKAO, and submitted in collaboration
FOLLOW-UP
75. ● Submitter and Manager: William Garnier
(SKAO)
● Moderator: May Chiao (Chief Editor Nature
Astronomy)
● Keynote speakers
● Lourdes Verdes-Montenegro (IAA-CSIC)
● Panellists:
● Carole Goble (Univ. of Manchester, working with many different scientific communities
towards reproducible research)
● Sebastian Neubert (Univ. of Heidelberg, Worldwide LHC Computing Grid)
● Jeff Dozier (Univ. of California, applies reproducibility in climate change studies)
● René Von Schomberg (Leads the EC Open Science policy coordination and development
team)
● Antonio Chrysostomou (SKAO)
FOLLOW-UP
EuroScience Open Forum (ESOF) July 2018
76. •Wavelength agnostic facility
•Access through IVOA services - SRC data will be IVOA compliant
• Tools for reuse of SKA data products (multi-messenger, multi-wavelength)
•Share infrastructure with other disciplines
•Provide a collaborative software platform that can interoperate
among SRCs
AIMS
SKA Regional Centres: the core of SKA science
77. CONCLUSIONS
“Within a culture that pressures scientists to produce rather than discover, the outcome is a biased and impoverished science in which most published results are either unconfirmed genuine discoveries or unchallenged fallacies. This observation implies no moral judgement of scientists, who are as much victims of this system as they are perpetrators”
Chambers et al. 2014, AIMS Neuroscience 1, 4: “Instead of playing the game it is time to change the rules”
Is this what we want to “reproduce”?
Will we forget about reproducibility because we need to “efficiently” exploit large datasets?
78. CONCLUSIONS
Unless we are ready to change the way in which we, the scientists,
work, there is no guarantee that the quality of Science will improve.