Rankings don't affect EU university strategies
After our August summer break, Research Europe is back in business. We will publish our next issue on 15 September, brimming with research policy news from across Europe. To give you a taster of what is to come, our guest author James Brooks looks at international rankings and what they mean for EU universities.
The publication of the 2011 Shanghai Jiao Tong rankings last month confirmed what is already common knowledge: in recent years, European universities have failed to significantly improve their positions in international academic rankings.
We asked the universities of Sheffield, Stockholm, Utrecht and Zurich, which made it into the top 100, whether they set much store by their placing. Just how important is a good placing nowadays? Crucially, are their strategies influenced by a desire to climb the table?
The short answer to that last question is “no”. The longer answer, exemplified by Kåre Bremer, president and vice chancellor of Stockholm University, which has climbed seven places over the last two years, is that institutions “would in any case promote activities leading to results improving the indicators used”.
Those indicators are all research-related, with an emphasis on science and medicine. For example, Shanghai’s assessment of quality of education is based on the number of alumni who go on to win Nobel Prizes or Fields Medals. This makes it hard to set a strategy to climb the rankings.
But even those at universities benefitting from the focus on scientific research can be ambivalent about Shanghai and similar ranking systems. Katrien Maes, policy officer at the League of European Research Universities, says that for some institutions rankings “throw a monkey-wrench into how universities are seen, how their role in society is perceived”.
“Sometimes it goes too much towards this call on universities to be providers of direct economic benefit to societies—the total sum of what universities do is not easily equated to a number of things that can be measured,” she says.
Lesley Wilson, secretary general of the European University Association, speaks favourably of the efforts of the pilot EU ranking project U-Multirank to compile a ranking system that goes beyond assessing research output and includes measures for things such as quality of teaching.
Wilson says that “it’s very difficult, actually, to find the data that will give you a clear indication of the quality of teaching. It’s easier in a national context but the state of the art in terms of comparable data available on these issues across countries inside Europe is not wonderful”.
Rebecca Hughes, pro vice-chancellor international at the University of Sheffield, agrees that international rankings are of limited validity. Sheffield itself has dropped 15 places in the Shanghai rankings over the last five years. But the fall, says Hughes, does not reflect any decline in research output.
“European universities have raised their game and therefore there’s a sort of jockeying for position in that top 100 – it’s very tight,” she says. “It’s not necessarily an indicator of a drop-off.”
Yet Hughes, like Wilson, supports a proliferation of ranking systems. “The more rankings you have that are transparent and look at different aspects of excellence, the better,” she says. “If you’ve got to have rankings, having lots of different rankings is not a bad thing.”
If you would like to read more from James, check out his blog, my last nerve.