1. Ranking of Universities: Methods, Limitations, Challenges Middle East Technical University Ankara, December 3, 2010 Anthony F.J. van Raan Center for Science and Technology Studies (CWTS) Leiden University
2. Leiden University, founded 1575, oldest in the Netherlands; member of the League of European Research Universities (LERU). Leiden, a historic city (2nd, 11th C.), has a strong cultural (arts, painting) and scientific tradition, and one of the largest science parks in the EU
5. First and major challenge: - Establish, as well as possible, the research performance in the recent past in quantifiable terms (objectivity, reliability); - If you succeed in quantifying performance, you can always make a ranking
6. Two methodological approaches: (1) 'broader' peer review > expert survey > reputation assessment; (2) bibliometric indicators > performance and impact measurement. At the department or research-program level, (1) and (2) are strongly correlated
7. Conceptual problems of peer review/surveys: * Slow: 'old' reputation vs. performance at the research front now; * Mix of opinions about research, training, teaching, and political/social status
8. Conceptual problems of bibliometric analysis: 1. Evidence of research performance such as prizes and awards, earlier peer review, invitations as keynote speakers, etc. is not covered. However, in many cases these 'non-bibliometric' indicators correlate strongly with bibliometric indicators. 2. Focus on journal articles: differences across disciplines
9. Citing publications vs. cited publications. Citations come: from other disciplines; from emerging fields; from research devoted to societal, economic and technological problems; from industry; from international top groups. All of these are functions of time, f(t) > Sleeping Beauties
17. Enhance visibility of engineering, social sciences and humanities by proper publication-density normalization. [World map: science as a structure of 200 related fields; circle size indicates world average.]
20. Knowledge dynamics described by a continuity equation: the change of a quantity inside any region expressed in terms of density and flow. [Figure: field structure F(t1) and F(t2).]
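One standard form of such a continuity equation (the notation here is mine, assuming n(x,t) denotes the local density of the quantity and J its flow) is:

```latex
\frac{\partial n(\mathbf{x},t)}{\partial t} + \nabla \cdot \mathbf{J}(\mathbf{x},t) = S(\mathbf{x},t)
```

where S(x,t) is a source/sink term; with S = 0 the quantity is conserved, and the change inside any region equals the net flow across its boundary.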
21. Linking university departments to fields: 'bottom-up' analysis: input data (assignment of researchers to departments) necessary > detailed research performance analysis of a university by department; 'top-down' analysis: field structure is imposed on the university > broad overview analysis of a university by field
24. Indicators divided into 5 categories, each with a fraction of the final ranking score:
Teaching: 0.30
Research: 0.30
Citations: 0.325
Industry income: 0.025
International: 0.05
Fractions are based on 'expert enthusiasm' and 'confidence in data'
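A weighting scheme like this amounts to a weighted sum of normalized indicator scores. A minimal sketch (not the official ranking method; indicator names and example values are hypothetical, scores assumed on a 0-100 scale):

```python
# Illustrative composite ranking score as a weighted sum of indicator scores.
# Weights are the fractions listed above; example scores are hypothetical.

WEIGHTS = {
    "teaching": 0.30,
    "research": 0.30,
    "citations": 0.325,
    "industry_income": 0.025,
    "international": 0.05,
}

def composite_score(scores: dict) -> float:
    """Weighted sum of per-indicator scores (each assumed on a 0-100 scale)."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

example = {"teaching": 70, "research": 80, "citations": 90,
           "industry_income": 50, "international": 60}
print(round(composite_score(example), 2))  # -> 78.5
```

Because citations carry the largest weight (0.325), the citation indicator dominates ties between otherwise similar universities.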
28. Indicators divided into 5 categories, each with a fraction of the final ranking score:
Academic reputation: 0.40
Employer reputation: 0.10
Student/staff ratio: 0.20
Citations: 0.20
International: 0.10
31. Indicators divided into 5 categories, each with a fraction of the final ranking score:
Nobel Prize alumni / awards: 0.10 + 0.20
HiCi staff, N&S, PUB: 0.20 + 0.20 + 0.20
PCP (size normalization): 0.10
Citations enter only via HiCi (but as a citations-per-staff measure)!
34. Leiden Ranking, Spring and Autumn 2010: 500 largest universities worldwide; 250 largest universities in Europe. Spring 2010: publications 2003-2008 (P), citations through 2009 (C). Autumn 2010: 2003-2009 (P and C)
35. Main features of the Leiden Ranking 2010: * Best possible definition of universities by applying the CWTS address-unification algorithm; * New indicators next to the standard CWTS indicators, including university-industry collaboration (autumn); * Field-specific rankings and top indicators (15 major fields) > benchmarking
36. P = 2004-2008, C = 2004-2009; top-250 largest universities, ranked by CPP

University                                            CPP     CPP/FCSm  P       MNCS2
UNIV LAUSANNE CH                                      12.23   1.49      6950    1.55
UNIV DUNDEE UK                                        11.82   1.42      4413    1.51
LONDON SCH HYGIENE & TROPICAL MEDICINE UNIV LONDON UK 11.59   1.64      4232    1.62
JG UNIV MAINZ DE                                      11.41   1.32      5015    1.25
UNIV OXFORD UK                                        11.25   1.67      26539   1.63
UNIV CAMBRIDGE UK                                     11.15   1.70      25662   1.63
KAROLINSKA INST STOCKHOLM SE                          11.09   1.36      16873   1.34
UNIV BASEL CH                                         11.02   1.55      8483    1.46
ERASMUS UNIV ROTTERDAM NL                             11.01   1.49      12408   1.49
UNIV GENEVE CH                                        10.90   1.45      9342    1.47
UNIV COLL LONDON UK                                   10.85   1.48      26286   1.46
IMPERIAL COLL LONDON UK                               10.47   1.55      21967   1.49
UNIV SUSSEX UK                                        10.33   1.60      3841    1.67
UNIV EDINBURGH UK                                     10.25   1.54      13188   1.54
UNIV ZURICH CH                                        10.16   1.46      13824   1.44
LEIDEN UNIV NL                                        10.02   1.43      12513   1.37
QUEEN MARY COLL UNIV LONDON UK                        9.92    1.44      4586    1.45
UNIV HEIDELBERG DE                                    9.71    1.35      15445   1.32
STOCKHOLM UNIV SE                                     9.55    1.43      6427    1.50
VRIJE UNIV AMSTERDAM NL                               9.51    1.40      12201   1.40
HEINRICH HEINE UNIV DUSSELDORF DE                     9.50    1.29      6636    1.25
UNIV DURHAM UK                                        9.44    1.69      5848    1.65
ETH ZURICH CH                                         9.41    1.63      15099   1.64
UNIV GLASGOW UK                                       9.34    1.41      10435   1.45
KINGS COLL UNIV LONDON UK                             9.33    1.38      13680   1.33
UNIV SOUTHERN DENMARK DK                              9.30    1.29      4786    1.34
MED HOCHSCHULE HANNOVER DE                            9.29    1.22      5233    1.16
LMU UNIV MUNCHEN DE                                   9.23    1.38      16995   1.30
UNIV AMSTERDAM NL                                     9.18    1.41      15492   1.36
UNIV BORDEAUX II VICTOR SEGALEN FR                    9.16    1.24      4354    1.22
38. Large, broad European university. Focus: top 25% in both publication output and citation impact. [Figure: impact ranking vs. publication ranking, quadrants from top 25% to bottom 25%.]
39. Smaller, specialized European university, specialized in economics and related fields (ECON, MATH, PSYCH). Among the top 25% in citation impact, but in the lower 50% of publication output. [Figure: impact ranking vs. publication ranking, quadrants from top 25% to bottom 25%.]
40. Based on this updated and extended ranking approach: benchmarking studies of universities; comparison of the 'target' university with 20 other universities of choice
65. Figure 1: Relation between CPP/FCSm and MNCS1 for 158 research groups. Figure 2: Relation between CPP/FCSm and MNCS2 for 158 research groups
66. Figure 3: Relation between CPP/FCSm and MNCS1 for the 365 largest universities in the world. Figure 4: Relation between CPP/FCSm and MNCS2 for the 365 largest universities in the world
67. Figure 5: Relation between CPP/FCSm and MNCS1 for the 58 largest countries. Figure 6: Relation between CPP/FCSm and MNCS2 for the 58 largest countries
68. Figure 7: Relation between CPP/FCSm and MNCS1 for all WoS (non-AH) journals. Figure 8: Relation between CPP/FCSm and MNCS2 for all WoS (non-AH) journals
69. Only journals with CPP/FCSm and MNCS1 < 2.5. Figure 9: Relation between CPP/FCSm and MNCS1 for all WoS (non-AH) journals. Figure 10: Relation between CPP/FCSm and MNCS2 for all WoS (non-AH) journals
70. Figure 11: Relation between CPP/FCSm and MNCS1 for 190 researchers (large UMC) with > 20 WoS publications 1997-2006 (citations counted up to 2006). Figure 12: Relation between CPP/FCSm and MNCS2 for 190 researchers (large UMC) with > 20 WoS publications 1997-2006 (citations counted up to 2008)
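The two indicator families compared in these figures normalize citations by field-expected values, but differ in where the averaging happens: CPP/FCSm divides average citations per publication by the average field citation score (a ratio of averages), while MNCS averages the per-paper normalized scores (an average of ratios). A toy sketch with hypothetical citation counts (not distinguishing the MNCS1/MNCS2 variants, which differ in publication-window details):

```python
# Toy illustration of the two normalizations:
# CPP/FCSm = (mean citations per paper) / (mean field-expected citations)
# MNCS     = mean over papers of (citations / field-expected citations)
# All numbers below are hypothetical.

papers = [
    # (citations received, field-expected citations for that paper's field)
    (10, 5.0),
    (2, 4.0),
    (0, 1.0),
    (6, 2.0),
]

cpp = sum(c for c, _ in papers) / len(papers)        # citations per publication
fcsm = sum(e for _, e in papers) / len(papers)       # mean field citation score
cpp_fcsm = cpp / fcsm                                # ratio of averages
mncs = sum(c / e for c, e in papers) / len(papers)   # average of ratios

print(round(cpp_fcsm, 3), round(mncs, 3))  # -> 1.5 1.375
```

The two values differ whenever high-citation papers sit in high-expectation fields (or vice versa), which is exactly the scatter the figures explore.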
72. Application of Thomson-WoS Impact Factors for research performance evaluation is irresponsible:
* Much too short citation window;
* No field-specific normalization;
* No distinction between document types;
* Calculation errors/inconsistencies in numerator/denominator;
* Underlying citation distribution is very skewed: the IF value is heavily determined by a few very highly cited papers
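The skewness point is easy to make concrete: an IF-style mean is pulled up by a few blockbuster papers, while the median shows the typical paper. The citation counts below are hypothetical:

```python
# Why a skewed citation distribution undermines a mean-based indicator:
# compare the mean (IF-style) with the median of a journal's citation counts.
import statistics

citations = [0, 0, 0, 1, 1, 2, 2, 3, 4, 120]  # one very highly cited paper

mean = statistics.mean(citations)      # IF-style average
median = statistics.median(citations)  # typical paper

print(mean, median)  # -> 13.3 1.5
```

A single paper lifts the mean to 13.3 while half of the papers have at most one citation, so the mean describes almost none of the journal's papers.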
79. [Figure: trend of CPP/FCSm (scale 0.0-3.0) over time, 1996-2005, for the institute as a whole vs. the world average.]
80. [Figure: same CPP/FCSm trend, 1996-2005, institute as a whole vs. world average, with breakdown into performance classes: A: 46% (half of which with 0 citations), B: 10%, C: 15%, D: 10%, E: 19%.]
82. Performance classes:
CPP/FCSm < 0.80: performance significantly below international average, class A;
0.80 < CPP/FCSm < 1.20: performance about international average, class B;
1.20 < CPP/FCSm < 2.00: performance significantly above international average, class C;
2.00 < CPP/FCSm < 3.00: performance in international perspective is very good, class D;
CPP/FCSm > 3.00: performance in international perspective is excellent, class E
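The class scheme above translates directly into a threshold lookup; a minimal sketch (the handling of values exactly on a boundary, e.g. 0.80, is my assumption, since the slide leaves it open):

```python
# Classifier for the CPP/FCSm performance classes listed above.
# Boundary values (exactly 0.80, 1.20, ...) are assigned to the higher class
# by assumption; the original scheme does not specify them.

def performance_class(cpp_fcsm: float) -> str:
    if cpp_fcsm < 0.80:
        return "A"  # significantly below international average
    if cpp_fcsm < 1.20:
        return "B"  # about international average
    if cpp_fcsm < 2.00:
        return "C"  # significantly above international average
    if cpp_fcsm < 3.00:
        return "D"  # very good in international perspective
    return "E"      # excellent in international perspective

print([performance_class(x) for x in (0.5, 1.0, 1.5, 2.5, 3.5)])
# -> ['A', 'B', 'C', 'D', 'E']
```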
84. Same institute, now broken down into its 5 departments: Neuroscience, Cancer research, Genomics, Clinical research, Cardio-vascular. [Figure: CPP/FCSm (scale 0.0-3.0) vs. P per department, with the world average indicated.]
86. Field = set of publications clustered by thematic/field-specific classification codes; again useful for new, emerging, often interdisciplinary fields; reveals the fine-grained structure of science
87. MeSH delineation vs. journal classification: the problem of the 'right' FCSm: an FCSm based on ISI journal categories vs. an FCSm based on the PubMed (MeSH) classification
91. Finding 1: Size-dependent cumulative advantage for the impact of universities in terms of the total number of citations. Quite remarkably, lower-performance universities have a larger size-dependent cumulative advantage for receiving citations than top-performance universities.
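A size-dependent cumulative advantage can be read as total citations C scaling faster than linearly with size P, i.e. C ~ P**alpha with alpha > 1, estimated by a linear fit in log-log space. A sketch on synthetic data (the exponent 1.3 is chosen purely for illustration and is not the value found in the study):

```python
# Estimating a scaling exponent alpha in C ~ P**alpha via least squares
# on log-transformed data. P and C below are synthetic illustrative values.
import math

P = [500, 1000, 2000, 4000, 8000, 16000]  # publication counts (sizes)
C = [p ** 1.3 for p in P]                 # total citations, exact power law here

logP = [math.log(p) for p in P]
logC = [math.log(c) for c in C]

n = len(P)
mx = sum(logP) / n
my = sum(logC) / n
alpha = (sum((x - mx) * (y - my) for x, y in zip(logP, logC))
         / sum((x - mx) ** 2 for x in logP))

print(round(alpha, 3))  # -> 1.3
```

With real university data the points scatter around the fitted line, and comparing alpha between top- and lower-performance subsets is what exposes the difference stated in Finding 1.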
94. Finding 2: For lower-performance universities, the fraction of uncited publications decreases with size. The higher the average journal impact of a university, the lower the number of uncited publications; likewise, the higher the average number of citations per publication in a university, the lower the number of uncited publications. In other words, universities that are cited more per paper also have more cited papers.
96. Finding 3: The average research performance of a university, measured by the crown indicator CPP/FCSm, does not 'dilute' with increasing size. The large top-performance universities are 'big and beautiful': they succeed in keeping a high performance over a broad range of activities. This indicates their overall scientific and intellectual attractive power.
101. Finding 4: Universities with low field citation density and low journal impact have a size-dependent cumulative advantage for the total number of citations. For lower-performance universities, field citation density provides a cumulative advantage in citations per publication. Top universities publish in journals with higher journal impact than lower-performance universities, and in journals with the same average impact, top universities perform a factor of 1.3 better than bottom universities.
103. Finding 5: The fraction of self-citations decreases as a function of research performance, of average field citation density, and of average journal impact.
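Measuring a self-citation fraction requires a working definition; a common one (and the one assumed in this sketch, which may differ from the study's exact operationalization) counts a citation as a self-citation when the citing and cited papers share at least one author. Author names below are hypothetical:

```python
# Sketch of a self-citation fraction under one common definition:
# a citation is a self-citation if citing and cited papers share an author.

def self_citation_fraction(citations):
    """citations: list of (citing_authors, cited_authors) pairs of sets."""
    if not citations:
        return 0.0
    self_cites = sum(1 for citing, cited in citations if citing & cited)
    return self_cites / len(citations)

example = [
    ({"smith", "li"}, {"li", "khan"}),    # shared author -> self-citation
    ({"garcia"}, {"li", "khan"}),         # no overlap
    ({"khan", "novak"}, {"li", "khan"}),  # shared author -> self-citation
    ({"tanaka"}, {"li", "khan"}),         # no overlap
]
print(self_citation_fraction(example))  # -> 0.5
```

Finding 5 then says this fraction drops as performance, field citation density, and journal impact rise.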