Slide 1
Unveiling the Ecosystem
of Science
How can we characterize and assess
diversity of profiles in science
Nicolas Robinson-Garcia
Delft Institute of Applied Mathematics
TU Delft
Slide 3
Outline
Methodological design for the assessment of individuals
• Definition of a valuation model: combining experts' judgments and metrics in research evaluation; application of Structured Expert Judgment using the valuation model
• Motivations and values of scientists to promote diversity in science: multiple case-study analysis (CVs, bibliometric data and interviews)
• Testing "diversity": effects of task specialization on research careers, based on bibliometric data and contribution statements
Slide 7
Why a valuation model?
Current individual assessments are…
• … based on notions of excellence and efficiency, or on career prospects
• … conducted in isolation, even though science is a team game
• … based on universal criteria, without considering context or needs
Slide 8
Why a valuation model?
Critiques of individual assessment
• DORA – the Impact Factor
• Leiden Manifesto – misuse of indicators
• Metric Tide – indicators are not adequate
Threats from current research evaluation
1. Pervasive effects on the scientific workforce (De Rijcke et al., 2016; Milojevic et al., 2018)
2. Pervasive effects on knowledge production (Nosek et al., 2015; Sarewitz, 2016)
Slide 10
Alternative models of evaluation
• S&T Human Capital model (Bozeman et al. 2001)
• Shift from ‘outputs’ to ‘capacities’
• Too vague to operationalize consistently
• ACUMEN portfolio
• Distinction between expertise, outputs and impacts
• Unclear how to put into practice
• Evaluative Enquiry approach (Fochler & De Rijcke, 2017)
• Emphasis on context
• Designed for improvement, but not distribution
Slide 12
Evaluative dimensions
The set of activities that are valued in an individual's CV, namely those related to:
• Background
• Scientific work
• Social engagement
• Capacity to attract resources and train new scholars
• …
Slide 13
Evaluative dimensions
• Scientific engagement: publications, peer review activities, …
• Social engagement: service, outreach, …
• Background: international experience, non-academic experience, …
• Capacity building: personal gain (grants, awards, resources…); institutional gain (funding, resources…)
• Level of openness: transparency in research practices; accessibility and reuse of outputs
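Whether a researcher's activity is concentrated in one of these dimensions or spread across several can be made concrete with a simple diversity measure. A minimal sketch in Python, assuming hypothetical per-dimension activity counts and using Shannon entropy as the diversity measure (both the numbers and the choice of measure are illustrative assumptions, not part of the talk):

```python
from math import log

# Hypothetical activity counts for one researcher, grouped by the five
# evaluative dimensions above (invented numbers, for illustration only).
profile = {
    "scientific_engagement": 24,   # publications, peer review, ...
    "social_engagement": 5,        # service, outreach, ...
    "background": 3,               # international / non-academic experience
    "capacity_building": 7,        # grants, awards, resources
    "openness": 11,                # open outputs, transparent practices
}

def shannon_diversity(counts):
    """Shannon entropy of activity shares: 0 = fully specialized in one
    dimension, log(k) = activity spread evenly over k dimensions."""
    total = sum(counts)
    shares = [c / total for c in counts if c > 0]
    return -sum(p * log(p) for p in shares)

print(round(shannon_diversity(profile.values()), 3))
```

A profile with all activity in a single dimension scores 0; a perfectly balanced profile scores log(5), so the measure directly operationalizes "diversity within a CV".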
Slide 15
Evaluative dimensions (repeated from slide 13)
CRediT
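Since the "testing diversity" strand relies on contribution statements, task specialization can be operationalized directly on CRediT role counts. A minimal sketch, assuming invented per-author role counts and using a Gini coefficient as the specialization measure (both are illustrative assumptions):

```python
# Hypothetical counts of how often one author is credited with each CRediT
# role across their papers (role names from the CRediT taxonomy; counts
# are invented for illustration).
credit_roles = {
    "Conceptualization": 2,
    "Data curation": 9,
    "Formal analysis": 8,
    "Software": 12,
    "Writing - original draft": 1,
    "Writing - review & editing": 3,
}

def gini(counts):
    """Gini coefficient of role counts: 0 = contributes equally to all
    listed roles, approaching 1 = highly concentrated in a few roles."""
    xs = sorted(counts)
    n = len(xs)
    total = sum(xs)
    # Standard closed form: G = 2 * sum(i * x_i) / (n * total) - (n + 1) / n
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return 2 * weighted / (n * total) - (n + 1) / n

print(round(gini(list(credit_roles.values())), 3))
```

A "jack of all trades" author would score near 0, while an author credited almost exclusively with, say, Software would score near 1, giving a per-author specialization signal to relate to career outcomes.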
Slide 20
In progress… CASE STUDIES
• How do the reported activities of scientists fit within this model?
• How do dimensions and activities relate to each other?
• Should we account for diversity within dimensions?
• How do activities relate to seniority?
• Why do scientists perform activities that, a priori, are not considered in assessment?
• Are these "other" activities actually valued?
Slide 21
Next steps
Implementing an experimental assessment with Structured Expert Judgment to:
• Test evaluators' perceptions of the value of scientists
• Test the gap between stated values and actual practice
• Test the capacity to adapt to different needs and roles
• Potential application?
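As a sketch of how the Structured Expert Judgment step could combine evaluators' judgments, the performance-based weighting idea behind Cooke's classical model can be outlined as follows. The calibration and information scores, the cutoff value, and the point judgments below are all hypothetical inputs; a real application derives calibration from experts' answers to seed questions rather than taking it as given:

```python
# Simplified sketch of performance-based weighting in the spirit of
# Cooke's classical model of Structured Expert Judgment. Assumes at
# least one expert passes the calibration cutoff.

def sej_weights(experts, alpha=0.05):
    """Raw weight = calibration * information if calibration >= alpha,
    else 0; raw weights are then normalized to sum to 1."""
    raw = {
        name: (cal * info if cal >= alpha else 0.0)
        for name, (cal, info) in experts.items()
    }
    total = sum(raw.values())
    return {name: w / total for name, w in raw.items()}

def pooled_estimate(weights, judgments):
    """Weighted linear pool of the experts' point judgments."""
    return sum(weights[name] * judgments[name] for name in weights)

experts = {  # (calibration, information) -- invented values
    "expert_a": (0.60, 1.2),
    "expert_b": (0.30, 2.0),
    "expert_c": (0.01, 3.5),   # poorly calibrated: zeroed out by the cutoff
}
judgments = {"expert_a": 7.0, "expert_b": 5.0, "expert_c": 9.0}

w = sej_weights(experts)
print(round(pooled_estimate(w, judgments), 3))
```

The design choice this illustrates is the one the slides point to: evaluators' scores are not averaged naively, but weighted by their demonstrated performance, so a poorly calibrated evaluator contributes nothing to the pooled assessment.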