Presentation by Giuseppe De Nicolao at the 2nd Roars Conference: “Higher Education and Research Policies in Europe: Challenges for Italy”, 21 February 2014
CNR, Piazzale A. Moro 7, Roma
1. Research policies in the age of rankings
Giuseppe De Nicolao
Università di Pavia
2.
3. SUMMARY
Prologue: “mamma li Turchi!”
1. From Egypt with Fury
2. Should you believe in the Shanghai ranking?
3. Numb3rs!
4. No Rankings? No party.
5. The power of numbers
6. Where are we going?
6. «We are at the bottom of the world rankings. For this reason, in November we will present the university reform, [...] I hope - he concludes - never again to see the top Italian university in 174th place»
7. Who counts as a “highly skilled migrant”? It depends on the rankings
8. Highly skilled migrants
Can I become a highly skilled migrant in the Netherlands - even if I haven't got a job yet?
To be eligible, you must be in possession of one of the following diplomas or certificates:
• a master's degree or doctorate from a recognised Dutch institution of higher education, or
• a master's degree or doctorate from a non-Dutch institution of higher education which is ranked in the top 150 establishments in either the Times Higher Education 2007 list or the Academic Ranking of World Universities 2007 issued by Jiao Tong Shanghai University in 2007
9. Can I get a scholarship for a master's or a PhD? It depends on the rankings
19. New York Times, November 14, 2010
Alexandria's surprising prominence was actually due to “the high output from one scholar in one journal” — soon identified on various blogs as Mohamed El Naschie, an Egyptian academic who published over 320 of his own articles in a scientific journal of which he was also the editor.
23. • ... of the 400 papers by El Naschie indexed in Web of Science, 307 were published in Chaos, Solitons and Fractals alone while he was editor-in-chief.
• El Naschie's papers in CSF collected 4,992 citations, about 2,000 of which are to papers published in CSF, largely his own.
26. THE ranking 2012: oops, I did it again!
• Only a few of the more than 100 co-authors of the 2008 and 2010 reviews were from MEPhI
• The 2008 particle physics review received nearly 300 times as many citations in the year after publication as the mean for that journal
• Cites were averaged over the relatively small number of MEPhI's publications, yielding a very high citation rate
• Further, if citations are generally low in their countries, then institutions get some more value added (regional modification)
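The averaging artifact described in this slide can be sketched numerically. A minimal illustration with invented numbers (not MEPhI's actual data): one heavily cited multi-author review, divided by a small publication count, dominates a citations-per-paper indicator.

```python
# Invented numbers: how a single highly cited review inflates the
# citations-per-paper score of an institution with few publications.

def citations_per_paper(total_citations: int, n_papers: int) -> float:
    """Naive indicator: average citations over an institution's papers."""
    return total_citations / n_papers

# A large institution: 5,000 papers, 50,000 "ordinary" citations.
big = citations_per_paper(50_000, 5_000)           # 10.0

# A small institution: 100 papers, 1,000 ordinary citations...
small_before = citations_per_paper(1_000, 100)     # 10.0

# ...that co-signs one review collecting 4,000 extra citations:
small_after = citations_per_paper(1_000 + 4_000, 101)

print(big, small_before, round(small_after, 1))    # 10.0 10.0 49.5
```

The same 4,000 citations added to the large institution's totals would barely move its average (from 10.0 to about 10.8), which is exactly the asymmetry the slide points out.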
28. The top 10 most spectacular errors of … reviewed by University Ranking Watch
29. QS greatest hits: international students and faculty in Malaysian universities
• In 2004 Universiti Malaya (UM) in Malaysia reached 89th place in the THES-QS world rankings.
• In 2005 came disaster: UM crashed 100 places
• Political opposition: shame on the university leadership!
• Real explanation: many Malaysian citizens of Indian and Chinese descent had been erroneously counted as “foreigners”.
30. QS greatest hits: 500 wrong student-faculty ratios in the 2007 QS Guide
• Someone slipped three rows when copying and pasting student-faculty ratios: Dublin Institute of Technology was given Duke's ratio, Pretoria got Pune's, RWTH Aachen got Aberystwyth's (Wales). And so on. Altogether, over 500 errors.
37. Should you believe in the Shanghai ranking? An MCDM view
J.-C. Billaut, D. Bouyssou, P. Vincke
• all criteria used are only loosely connected with what they are intended to capture
• several arbitrary parameters and many micro-decisions that are not documented
• flawed and nonsensical aggregation method
• «the Shanghai ranking is a poorly conceived quick and dirty exercise»
«any of our MCDM students that would have proposed such a methodology in her Master's Thesis would have surely failed according to our own standards»
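The arbitrariness of the aggregation step can be made concrete. In this hypothetical sketch (invented institutions and scores, not ARWU data), two equally defensible-looking weight vectors applied to the same indicator scores reverse the final order:

```python
# Hypothetical sketch: the same normalized indicator scores, two
# plausible weightings, opposite rankings.

scores = {                      # (publications, citations, awards), each 0-100
    "Univ A": (90, 60, 40),
    "Univ B": (50, 70, 80),
}

def rank(weights):
    """Weighted-sum aggregation, as used by composite rankings."""
    total = {u: sum(w * s for w, s in zip(weights, v))
             for u, v in scores.items()}
    return sorted(total, key=total.get, reverse=True)

print(rank((0.5, 0.3, 0.2)))    # weights favour output   -> ['Univ A', 'Univ B']
print(rank((0.2, 0.3, 0.5)))    # weights favour prestige -> ['Univ B', 'Univ A']
```

Since nothing in the methodology dictates one weight vector over the other, the "winner" is an artifact of an undocumented micro-decision, which is precisely Billaut, Bouyssou and Vincke's objection.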
41. Twenty Ways to Rise in the Rankings (1/3) by Richard Holmes
http://rankingwatch.blogspot.it/2013/12/twenty-ways-to-rise-in-rankings-quickly.html
1. Get rid of students. The university will therefore do better in the faculty-student ratio indicators.
2. Kick out the old and bring in the young. Get rid of ageing professors, especially if unproductive and expensive, and hire lots of temporary teachers and researchers.
5. Get a medical school. Medical research produces a disproportionate number of papers and citations, which is good for the QS citations-per-faculty indicator and the ARWU publications indicator. Remember this strategy may not help with THE, who use field normalisation.
42. Twenty Ways to Rise in the Rankings (2/3) by Richard Holmes
http://rankingwatch.blogspot.it/2013/12/twenty-ways-to-rise-in-rankings-quickly.html
7. Amalgamate. What about a new mega-university formed by merging LSE, University College London and Imperial College? Or a très grande école from all those little grandes écoles around Paris?
9. The wisdom of crowds. Focus on research projects in those fields that have huge multi-“author” publications: particle physics, astronomy and medicine, for example. Such publications often have very large numbers of citations.
10. Do not produce too much. If your researchers are producing five thousand papers a year, then those five hundred citations from a five-hundred-“author” report on the latest discovery in particle physics will not have much impact.
43. Twenty Ways to Rise in the Rankings (3/3) by Richard Holmes
http://rankingwatch.blogspot.it/2013/12/twenty-ways-to-rise-in-rankings-quickly.html
13. The importance of names. Make sure that your researchers know which university they are affiliated to and that they know its correct name. Keep an eye on Scopus and ISI and make sure they know what you are called.
18. Support your local independence movement. Increasing the number of international students and faculty is good for both the THE and QS rankings. If it is difficult to move students across borders, why not create new borders?
20. Get Thee to an Island. Leiden Ranking has a little-known ranking that measures the distance between collaborators. At the moment first place goes to the Australian National University.
49. University rankings: of very dubious scientific value.
How can we measure a nation's weight in the international scientific landscape? By counting the scientific articles it produces and the citations they receive
50. Italy: 8th for scientific articles
Source: SCImago, based on Scopus data 1996-2012
51. [Chart] PUBLICATIONS (WoS): annual publication counts, 1985-2010 (y-axis 0-100,000), for the United Kingdom, Japan, Germany, France, Canada, Italy, Spain, the Netherlands, Sweden and Switzerland
52. [Chart] PUBLICATIONS 2004-2010: average annual growth (%), by country (y-axis -1 to 8)
Source: VQR 2004-2010 – Rapporto Finale ANVUR, June 2013 (Tab. 3.2) (ISI Web of Knowledge data, Thomson-Reuters)
http://www.anvur.org/rapporto/files/VQR2004-2010_RapportoFinale_parteterza_ConfrontiInternazionali.pdf
53. [Chart] PUBLICATIONS 2004-2010: number of citations, by country (y-axis 0-6,000,000)
Source: VQR 2004-2010 – Rapporto Finale ANVUR, June 2013 (Tab. 4.1) (ISI Web of Knowledge data, Thomson-Reuters)
http://www.anvur.org/rapporto/files/VQR2004-2010_RapportoFinale_parteterza_ConfrontiInternazionali.pdf
57. Student/faculty ratio: out of 26 countries, only 5 fare worse than Italy
[Chart of 26 countries; labelled: Indonesia, Czech Rep., Slovenia, Belgium, Italy, Saudi Arabia]
61. [Chart] NUMBER OF UNIVERSITIES vs PERFORMANCE: how elitist is the “top 500”? The top 500 against the other 16,500 universities.
... and what does it cost to stay at the top?
62. [Chart, billions of euros] Harvard's “operating expenses” amount to 44% of the funding of the entire Italian state university system (Fondo di Finanziamento Ordinario 2012 vs Harvard's operating expenses)
66. Percentage of universities entering the “top 500” (Leiden: top 250), by ranking
Data source: “Malata e denigrata: l'università italiana a confronto con l'Europa” (edited by M. Regini, Roma, Donzelli 2009)
68. No ranking? No evaluation.
“Every evaluation must lead to a ranking. This is the logic of evaluation. If there is no ranking, there is not even a real evaluation”
Giulio Tremonti, “Il passato e il buon senso”, CdS 22-08-08
69.
70. But how do the English actually do it? To answer, let's go to the sources (the “English VQR”)
71. NO RANKINGS PLEASE! WE'RE ENGLISH!!
“RAE2008 results are in the form of a quality profile for each submission made by an HEI [Higher Education Institution]. We have not produced any ranked lists of single scores for institutions or Units of Assessment, and nor do we intend to.”
77. Rankings
• Fragile scientific grounds
• Incentive to gaming
• Raw data are obscured
• They are not necessary to manage funding (see RAE/REF)
Why, then?
82. Aggregators vs non-aggregators (3/3)
• Aggregators: value in combining indicators: extremely useful in garnering media interest and hence the attention of policy makers
• Non-aggregators: key objection to aggregation: the arbitrary nature of the weighting process by which the variables are combined
83.
84. Germany
• “We look back decades and people came to German universities; today they go to US universities.”
• The Exzellenzinitiative (2005): from the traditional emphasis on egalitarianism towards competition and hierarchical stratification
85. France
• The Shanghai ranking “generated considerable embarrassment among the French intelligentsia, academia and government: the first French higher education institution in the ranking came only in 65th position, mostly behind American universities and a few British ones”
86. Australia
• The SJTU and QS rankings: at least two Australian universities among the top 100.
• Opposing strategic options:
– fund a small number of top-tier competitive universities
– “creation of a diverse set of high performing, globally-focused institutions, each with its own clear, distinctive mission”.
87. Japan
• “The government wants a first class university for international prestige”
• “in order for Japanese HEIs to compete globally, the government will close down some regional and private universities and direct money to the major universities”
• some institutions will become teaching-only.
90. E. Hazelkorn on rankings
• 90 or 95% of our students do not attend elite institutions. Why are we spending so much on what people aren't attending as opposed to what they are attending?
• Estimated yearly budget of €1.5 billion to be ranked in the world's top 100. May divert resources from pensions, health, housing, ...
• Are “elite” institutions really driving national or regional economic and social development?
91. Does trickle-down work?
“Governments and universities must stop obsessing about global rankings and the top 1% of the world's 15,000 institutions. Instead of simply rewarding the achievements of elites and flagship institutions, policy needs to focus on the quality of the system-as-a-whole.”
There is little evidence that trickle-down works.
93. Where are we?
• (Even) Phil Baty (Times Higher Education) admits that there are aspects of academic life where rankings are of little value
• Can we/you afford the “reputation race”?
• We will have to live in a world in which extremely poor rankings are regularly published and used. What can be done then?
94. What can be done then?
• There is no such thing as a “best university” in abstracto.
• Stop talking about these “all-purpose rankings”. They are meaningless.
• Lobby in our own institutions so that these rankings are never mentioned in institutional communication
• Produce many alternative rankings that produce vastly different results.