by Ben Sowter
The THE-QS World University Rankings have now been in existence for five editions; the 2009 release will be the sixth. In some way, major or minor, the methodology has changed for each release…
- Add employer review component
- Collect full data on 200 additional universities
- Switch from 10 years to 5 years for citations measure
- Switch from ESI to Scopus for citations
- Adopt new normalisation methodology
- Insist upon FTE for all personnel metrics
- Peer reviewers unable to select own institution
- Separate int’l and domestic responses to surveys
- Increased response levels to surveys
- Tweaks to definitions
- New institutions added to study
A ranking is a complex operation, and the data available has evolved over time, as has our understanding of it. As we receive feedback and additional metrics become available, the responsible thing to do is to integrate new developments with a view to improving the evaluation, making it more meaningful and insightful. The effects of these developments are visible and reasonably well-documented – on our website www.topuniversities.com you can find results going back to 2005.
Recently we have been doing some work on a communication project for a British university. Its leadership is concerned about the conclusions the governing body may infer from the results of rankings and has asked us to make a presentation explaining, in simple terms, some of the shortfalls of rankings and what a change in position might actually mean. In conducting this work, not only did we discover that the two major domestic rankings in the UK are subject to similarly profound “evolutions” in methodology, but also that they seem comparatively unforthcoming with their historical results.
On the first point, the introduction and further development of the National Student Survey has had a dramatic influence on results in 2008 and 2009. On the second, the only way we were able to track results over more than the last two editions was to purchase second-hand copies of the associated books from Amazon and re-key all the results manually. Similarly, the US News rankings seem not to clearly reveal results before the current year. In contrast, both the THE-QS and Shanghai Jiao Tong rankings provide results over a number of years.
Whilst, given the ongoing changes in methodology, it might be misleading to conduct detailed trend analysis over time, the Berlin Principles suggest that transparency is a key expectation for a responsibly conducted ranking. Surely that should include the complete history of a ranking and not simply the most recent edition.