QS.com Asian University Rankings due to provide insight on a larger number of indicators

by Ben Sowter

 

It has been encouraging to see traffic on this fledgling blog spike today in anticipation of the QS.com Asian University Rankings, due for publication tomorrow. It has been a very busy time responding to individual institutions and preparing our press campaign. The methodology is somewhat different from that of the THE-QS World University Rankings: with a smaller number of countries, we have been able to gather adequate data on a couple of additional indicators. The internationalisation area now features inbound and outbound exchange numbers, whilst the citations per faculty indicator has been split into papers per faculty (productivity) and citations per paper (quality).

Additionally, the regional exercise emphasises the performance differences between institutions in the region – particularly in research measures, where the presence of US institutions significantly compresses the scale.

All this means there may be a few small surprises tomorrow when the results are published. Results and more detail on the methodology will emerge initially through Chosun Ilbo (www.chosun.com), our partner in South Korea, and will follow at 6.00AM GMT on our website – www.topuniversities.com.

I will try to find time later in the week to put together a more complete post looking at some of the results and some of the interesting contrasts between this exercise and the world rankings. I also look forward to reading and responding to any comments about the methodology or results – we’re always interested in feedback and in providing a balanced view.

Obama on (higher) education…

by Ben Sowter

 

We are living in interesting times. The world is in economic chaos, we are under the persistent threat of terrorism and now there is also a pestilence. Those prone to drama could be forgiven for suggesting that the four horsemen are abroad.

Not a time to adopt the presidency of the United States then. Or is it?

Greatness is generally measured by one’s achievements and their contrast with those of one’s peers. Political achievement, like customer service or IT support, is rarely observed when there is nothing to fix. Obama has come to power at a time when there is much to repair, with his closest peer and predecessor having been arguably amongst the worst presidents in history (US News & World Report). With that backdrop in mind, it is perhaps no surprise that his first 100 days seem to have been broadly chalked up as a success.

But what is it all going to mean for higher education, both domestically and globally?

University rankings acknowledge the US to have a good lead but also suggest that this is being, albeit slowly, eroded. The strategies the US pursues are clearly both deeply influential on and closely scrutinized by the global higher education sector.

In late October last year, we ran a seminar circuit in North America, visiting the University of Toronto, Boston University, Columbia University, UC San Diego, UC Berkeley and the University of Chicago. Predictably, the topic was the background, methodology and results of the THE-QS World University Rankings. It was an exciting time to be in the US – I was in Chicago the weekend before Obama’s victory party – and I was trawling through the candidates’ manifestos to seek out references to higher education that might contextualise the content of our sessions. There was very little – higher education is clearly not much of a campaign gambit at present. Anything I could find talked about widening participation, increasing diversity and providing fiscal support to enable more people from less-privileged backgrounds to make it to college or university – certainly an important agenda, but not necessarily one that gave much of a lead on support directly for the institutions to improve their ability to educate and further their pursuit of basic science.

Still, it doesn’t take a seasoned political analyst to recognise that what gets one elected can be different from what needs to be done, or indeed, from what will maintain the support of the people.

It seems certain, given his background and repeated rhetoric to this point, that Obama considers education one of the central priorities of his administration. In his speech before Congress on February 24 he identified education as one of three pillars of long-term economic recovery and specifically stated, “our children will compete for jobs in a global economy that too many of our schools do not prepare them for”. It seems that the international exposure of future generations of American graduates and, ultimately, leaders is at the forefront of the President’s mind.

Perhaps, then, the Senator Paul Simon Study Abroad Foundation Act, formerly known as the Abraham Lincoln Study Abroad Act, will now get the support it needs in Congress to be passed into law. With the lofty goal of sending 1 million US students (representing approximately 50% of US college graduates) per year overseas for study abroad within 10 years, this would have a dramatic impact on the global higher education sector – particularly since the Act also focuses on increasing the diversity of both the people going and the destinations. (There is good information on the Act available on the NAFSA website.)

According to the UNESCO Global Education Digest, 48,329 students from the US studied overseas in 2006 – so the ambitions of the Simon legislation represent a dramatic change in pure numbers alone, but it is the side-effects that may prove even more interesting. US institutions will open themselves up to partnership with institutions around the world far more proactively than ever before, and whilst these partnerships will begin with exchange and study abroad, it seems inevitable that many of them will evolve into something more. There will also be an associated fiscal injection which, whilst it may be comparatively insignificant to the US institutions supplying the students, may be impactful at some of the target institutions in developing countries.

The social implications for the US – and its international relationships – over the next 50 years could be revolutionary. Even if we ignore the growing numbers in the lead-up to the target, the Act aims to send over 40 million American citizens out for study abroad in the next half-century – approaching 15% of the current US population. Since only in the region of 20% of US citizens hold a passport (various source data available here), this represents a dramatic shift which can only have long-term benefit for the US against the backdrop of globalisation.
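Since a few derived figures are quoted here, a quick back-of-envelope check may help. The sketch below uses my own illustrative assumptions, not figures from the Act itself: a 10-year ramp-up to the 1 million/year target, a 50-year horizon, and a US population held flat at roughly 300 million.

```python
# Back-of-envelope check of the Simon Act figures quoted above.
# Assumptions (illustrative, not from the Act): the 1 million/year target
# is reached after a 10-year ramp-up, and the US population stays at
# roughly 300 million.
target_per_year = 1_000_000
ramp_up_years = 10
horizon_years = 50
us_population = 300_000_000

# Count only the years at the full target rate, ignoring the ramp-up,
# as the paragraph above does.
full_rate_years = horizon_years - ramp_up_years
total_students = target_per_year * full_rate_years
share_of_population = total_students / us_population

print(total_students)                       # 40000000
print(round(share_of_population * 100, 1))  # 13.3 (% of current population)
```

On these assumptions the total comes to 40 million, or just over 13% of the current population – consistent with the “over 40 million” and “approaching 15%” figures above.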

It seems the whole world is watching this new President to see whether his walk will match his impressive talk. Universities and their stakeholders are no exception.

The geography of rankings

Some helpful fellow in Germany has plotted the locations of the Top 100 universities in both the THE-QS World University Rankings and the Shanghai Jiao Tong exercise on a friendly, interactive Google map, to be found at www.university-rankings.net.


Geographic distribution of top 100 universities in Shanghai Jiao Tong's Academic Ranking of World Universities

Geographic distribution of top 100 universities in the THE-QS World University Rankings

There are some interesting contrasts between the two maps even when only looking at the Top 100. THE-QS includes institutions in China and Singapore, is more generous towards Australasia, and whilst the picture looks similar in coastal US states, SJTU shows greater favour towards institutions in the Mid-West. Sadly the exercise is currently limited to the top 100 – it would be interesting to see the greater contrast further down the lists and, perhaps, to see how these compare with the results of other ranking exercises, both international and domestic.

Financial factors can be a dangerous measure

by Ben Sowter

 

Many people have suggested that financial indicators ought to be considered in rankings of institutions, and the logic is clear – potential indicators include:

  • Average Fees
  • Return on Investment
  • Average Salary upon Graduation
  • Investment per Student
  • Library Spending
  • Total Research Funding
  • Share of Government Research Funding

Whilst this might make a lot of sense in an individual domestic context, that may not necessarily be the case when the ranking exercise in question has a broader scope. The fundamental objective of almost any ranking (and there appear to be some exceptions) is to evaluate the performance of the institution. Sadly, most if not all financial indicators are subject to external economic influences which are very difficult to adjust for. This has led the THE-QS team to conclude that financial indicators are unlikely ever to be practical for the global exercise.
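A tiny illustration of why such indicators are hard to adjust for: take an average-salary indicator reported in US dollars. The salary and exchange rates below are hypothetical round numbers, not real data – the point is simply that the dollar value moves even though nothing about the school has changed.

```python
# Hypothetical illustration: the same UK graduate salary, unchanged in
# local terms, produces a very different USD-denominated indicator value
# purely because the exchange rate has moved.
salary_gbp = 35_000  # placeholder average graduate salary in pounds

# Placeholder GBP/USD rates for two successive ranking years.
rates = {"year 1": 2.00, "year 2": 1.45}

for year, gbp_usd in rates.items():
    salary_usd = salary_gbp * gbp_usd
    print(year, salary_usd)  # 70000.0, then 50750.0 – a ~27% "decline"
```

Nothing the school does to its teaching or placement record can undo that apparent drop, which is exactly the adjustment problem described above.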

Business schools are amongst the most familiar with rankings – the Financial Times and Business Week rankings, amongst others, have been running for some time and are well established. In contrast to many domestic exercises, the FT is very open with its historical results. The chart below shows the number of UK business schools appearing in the top 50 against the interbank Dollar-Sterling exchange rate on January 1st of each year. Whilst there isn’t a perfect match, the trend certainly seems to be that the standing of UK schools is strongly linked to the strength of the pound.

Comparison of UK business school performance in FT rankings against January exchange rates
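The relationship shown in the chart can be quantified as a simple Pearson correlation between the two series. The sketch below shows the calculation; the figures are hypothetical placeholders, not the actual FT counts or exchange rates.

```python
# Sketch of the comparison described above: Pearson correlation between
# the number of UK schools in the FT top 50 and the January GBP/USD rate.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical placeholder data, NOT the real FT counts or exchange rates.
uk_schools_in_top50 = [8, 9, 11, 12, 14]
gbp_usd_jan1 = [1.45, 1.61, 1.79, 1.96, 1.99]

print(round(pearson(uk_schools_in_top50, gbp_usd_jan1), 2))
```

A coefficient near +1 would indicate the kind of link the chart suggests; of course, correlation alone says nothing about causation, which is part of the argument being made here.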

Sadly, no business school has yet been able to establish much influence over comparative currency strength or the global economy as a whole. This effect is completely outside their control, and it undermines the ranking as a serious performance measure.

The steep decline in the strength of the pound is likely to make 2010 a very difficult year for UK universities in ranking terms, and any investment they make in improving teaching or research is unlikely to significantly assist their situation. With exchange rates as they stand, graduates from UK schools – although highly international – are likely to accept lower salaries on graduation even outside the UK, and UK salaries are simply worth less.

No ranking is perfect – even those most established and accepted. Any set of results needs to be put in context.

University Rankings: There can be no “right answer”.

by Ben Sowter

 

Part of the excitement of university and business school rankings is that there is no “ultimate solution”. At a symposium at Griffith University in 2007, Nian Cai Liu – who leads Shanghai Jiao Tong’s Academic Ranking of World Universities (www.arwu.org) – was posed the question, “Many rankings use surveys as a component of their methodology; why do you choose not to?”. His matter-of-fact response was “I’m an Engineer”.

But his team’s selection of Nobel Prizes or highly cited authors as indicators is not intrinsically less questionable as a measure of university quality in the round – which, regardless of stated purpose, is what the results are often being used for. Three days ago at a comparable event in Madrid, organised by Isidro Aguillo and his Cybermetrics team, similar aspersions were cast on surveys in contrast with more “statistically robust” measures such as link analysis – as used for the Webometrics exercise (www.webometrics.info). The supposition was made that simply because the THE-QS exercise is the most “geographically generous” of the four global aggregate rankings, it must be somehow wrong, and that maybe survey bias is to blame.

Well I have news for you. THEY ARE ALL WRONG.

The higher profile of universities in China and Hong Kong in THE-QS was cited as evidence of survey bias – whilst it is well documented on our website that the survey response from China, in particular, is disproportionately low. We are working to remedy this, but it is clearly unlikely to strongly favour Chinese institutions – these universities are performing well due to the profile they are building outside China.

Despite the fact that these surveys are currently conducted only in English and Spanish, the survey components carry a much smaller language bias than is implied by Nobel Prizes, citations (in any index), cybermetrics, publications in Nature & Science, highly cited authors and many of the other factors selected by other international evaluations. Respondents, even those responding in English, are cognisant of the performance of other institutions in their own language – and this seems to be coming through in the results.

Sure, there are biases in the surveys, and in the system overall – some are partially corrected for and some are not – but these exist in every other system too, even if they may not be quite as immediately evident.

The THE-QS work is presented prolifically around the world – by myself, my colleagues, the THE and third parties. We present it alongside the other exercises and are always careful to acknowledge that each has its value and each, including our own, has its pitfalls. NONE should be taken too seriously, and to date ALL bear some interest if viewed objectively.

The most entertaining input I have received since beginning this work came from an academic who systematically discredited all of the indicators we have been using but then concluded that, overall, he “liked what we were doing”. It is possible to do that with any of the systems out there – domestic, regional or global. The most savvy universities are using the rankings phenomenon to catalyse and establish keener performance evaluation internally at faculty, department and individual staff member level. Driving it down to this level can help build actionable metrics as opposed to abstract statistics, and this can lead to a university being able to revolutionise its performance in education and research and, in time, as a side-effect rather than an objective, improve its performance in rankings.

Domestic rankings slow to reveal their past

by Ben Sowter

 

The THE-QS World University Rankings have now run for five editions; the 2009 release will be the sixth. In some way, major or minor, the methodology has changed for each release…

2004
Launch

2005
Add employer review component
Collect full data on 200 additional universities

2006
Switch from 10 years to 5 years for citations measure

2007
Switch from ESI to Scopus for citations
Adopt new normalisation methodology
Insist upon FTE for all personnel metrics
Peer reviewers unable to select own institution

2008
Separate int’l and domestic responses to surveys

Ongoing
Increased response levels to surveys
Tweaks to definitions
New institutions added to study

A ranking is a complex operation, and the data available has evolved over time, as has our understanding of it. As we receive feedback and additional metrics become available, the responsible thing to do is to integrate new developments with a view to improving the evaluation, making it more meaningful and insightful. The effects of these developments are visible and reasonably well documented – on our website www.topuniversities.com you can find results going back to 2005.

Recently we have been doing some work on a communication project for a British university. Its leadership is concerned about the conclusions its governing body may infer from the results of rankings and has asked us to make a presentation explaining, in simple terms, some of the shortfalls of rankings and what a change in position might actually mean. In conducting this work, not only did we discover that the two major domestic rankings in the UK are subject to similarly profound “evolutions” in methodology, but also that they seem to be comparatively unforthcoming with their historical results.

On the first point, the introduction and further development of the National Student Survey has had a dramatic influence on results in 2008 and 2009. On the second, the only way we were able to track results over more than the last two editions was to purchase second-hand copies of the associated books from Amazon and re-key all the results manually. Similarly, the US News rankings seem not to clearly reveal results before the current year. In contrast, both the THE-QS and Shanghai Jiao Tong rankings provide results over a number of years.

Whilst, given the ongoing changes in methodology, it might be misleading to conduct detailed trend analysis over time, the Berlin Principles suggest that transparency is a key expectation of a responsibly conducted ranking. Surely that should include the complete history of a ranking and not simply the most recent edition.