Using rankings to set policy and funding criteria

by Ben Sowter

I recently received an email from a professor at a Spanish university. In a nutshell, his university had revised its funding policy guidelines to include the criterion that PhD students should have taken their undergraduate programme at a university within the top 500 in Webometrics in order to be eligible for funding. Before this criterion was applied, one of his PhD candidates, from the University of Mumbai, was placed 3rd; introducing it dropped her to 7th and made her ineligible for a grant. The professor pointed out the University of Mumbai’s position of 155 in our ranking, but the committee dismissed this on the grounds that QS is a commercial entity and our observations therefore somehow invalid.

Our response (below) may make for interesting reading – it’s not just about promoting the strengths of the QS approach to ranking but also about how rankings might more responsibly be applied to this kind of context.

Dear Dr. XXXX,

Many thanks for your email.

I apologise for not responding sooner but I have been away on business. This is obviously a complex issue but please let me try and break it down for you and your colleagues.

Usage of (any) rankings for policy or funding decisions

There are some dangers here. Methodologies are certainly subject to change, and placing funding and policy decisions in the hands of other people may not be the intention. In general terms we recommend that our results are used for guidance rather than to set policy, but whichever data is being used – ours or anybody else’s – we would make some recommendations about how it is applied:

  1. Use a (minimum) three-year average. Universities generally do not change in quality dramatically from one year to the next, but rankings can amplify any change. Additionally, there is usually a time lag between performance and its impact on rankings, which works both ways. Using a three-year average will result in less erratic decision-making.
  2. Apply a sliding scale. It is one thing for rankings to be considered, but quite another for the decision to be black and white – funding cut entirely, or students deported (we have a case of this in Denmark), because their undergraduate institution moves a few places. For example, if the institution currently has to be within the top 500, apply a points-based system between, say, 500 and 750, so that a university moving a few places affects funding, but not so dramatically.
  3. Consider a country-by-country approach. Rather than looking at the top 500 overall, look at the top x or top x% within a given country. The fact is that these rankings all apply a common template to universities worldwide, regardless of whether their domestic environment “fits”.
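The first two recommendations can be sketched as a small scoring function. This is purely a hypothetical illustration – the function name, and the choice of a linear taper between the 500 and 750 thresholds mentioned above, are my own assumptions, not a prescribed policy:

```python
def funding_weight(ranks, full_at=500, cutoff=750):
    """Map a three-year average rank to a funding weight in [0, 1].

    At or above position `full_at` the institution qualifies fully;
    beyond `cutoff` it does not qualify; in between, eligibility
    tapers linearly instead of vanishing at a single threshold.
    """
    avg = sum(ranks) / len(ranks)  # smooths year-to-year volatility
    if avg <= full_at:
        return 1.0
    if avg >= cutoff:
        return 0.0
    return (cutoff - avg) / (cutoff - full_at)

# A university hovering around the threshold keeps most of its
# eligibility rather than losing everything over a small move:
print(funding_weight([480, 510, 525]))  # average 505 -> 0.98
```

Under a hard top-500 cut-off, the same institution would have been fully eligible in one year and fully ineligible the next; the averaged, tapered version changes its weight by only a few hundredths.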

The commercial status of QS with respect to other rankings providers

The ARWU (Shanghai Ranking) is now produced by ShanghaiRanking Consultancy, a commercial, for-profit entity spun off from Shanghai Jiao Tong University. Their website carries paid-for banner advertising on the right-hand side, just as ours does. Is this a reason not to consider their work as an input? I would argue that it isn’t. US News & World Report is perhaps the world’s longest-running and most established university ranking – this is a commercial, for-profit media group. The Financial Times produces what is probably the most referenced, respected and established international ranking of business schools – again in pursuit of commercial objectives. In none of these cases is it arguable that the quality and accuracy of the research is compromised by the nature of the organisation undertaking it.

QS is a business entirely centred on the higher education sector. We are not a newspaper for which this work might be considered an annual crowd-pleaser, nor a university for which it might be a one-time research project, nor a government organisation with an agenda to promote the qualities of universities in its own country. The pressure on us to conduct our work responsibly and accurately is greater than on anyone else, as our business and our reputation rest upon it. As a result, the staff members on my team with direct involvement in the rankings have clauses in their contracts explicitly prohibiting them from influencing the results for commercial interest, with disciplinary procedures and potential termination should such practices be discovered.

Of course, our results are published on a web page that attracts a lot of traffic, and as a result attracts advertisers too, but this in no way influences the performance of the universities in the results.

The strengths of the QS rankings

The reputation surveys used in 2010 drew on responses from over 15,000 academics and over 5,000 employers. These have become amongst the most statistically significant, language- and discipline-independent indicators of university quality worldwide, and they are the central reason why the University of Mumbai appears more strongly in our results than in others. Though no published data seems to exist, the university is vast, on a par with the Universities of Delhi and Calcutta. It has over 400 affiliate colleges, some very small and some, such as St. Xavier’s College, that look like comprehensive universities in their own right.

Tracking an organisation of this size and nature through hard data and bibliometrics is almost impossible. Doing so through Webometrics, when each affiliate college and research institute has its own separate domain, is similarly impossible. Yet speak to an Indian academic about the universities they consider excellent in India, or to an Indian employer about the universities they like to target for recruitment, and the University of Mumbai or one of its affiliate colleges will be on their list almost every time.

Ultimately, individuals and organizations worldwide make their judgements of which rankings to use and for what purpose every day. Webometrics has its advantages – particularly for the number of universities and institutions it can track – but it is also subject to anomaly when universities have different policies on open access to research and course materials or make changes in the way their websites and domains are deployed and structured. If you were able to aggregate all the affiliate colleges of the University of Mumbai in Webometrics you would probably find it well within the top 500 – similarly with Delhi and Calcutta. The reality is that the way that Indian higher education is currently structured is simply not compatible with many global indicators of quality – our survey based indicators are an exception.

Let’s put this another way – India is very clearly one of the greatest sources of young, ambitious, academically capable talent in the world. There are a billion people in India with an increasing youth demographic and the best of them are exceptional. This is an indisputable fact.

The highest-placed Indian institution in Webometrics that educates undergraduates is the Indian Institute of Technology Kanpur (see below), placed at 564. The Indian Institute of Science doesn’t teach undergraduates:

Top Indian institutions in Webometrics

So the reality is this: the policy currently established at your institution overtly prevents it from awarding funding to ANY of the aforementioned Indian talent. Of over 1,500 Indian institutions listed by Webometrics, the University of Mumbai (even without its affiliate colleges) places 25th, putting it in the 98th percentile amongst institutions in India.


There are really two ways to look at this.

  1. Use our rankings (amongst others) – despite the views of your committee, they are absolutely independent, well established and offer a valid alternative view to others.
  2. Whichever rankings you opt to use, do not make the mistake of applying a one-size-fits-all policy, as it will do your institution and your students a direct disservice – applying the thinking above ought to validate the consideration of the University of Mumbai even if you stick with Webometrics. Young people’s futures rest on these policy decisions, and they need to be taken responsibly, seriously and sensitively.

I hope this helps and if I can be of further assistance please let me know.

Many thanks,

Ben Sowter
Head of Division
QS Intelligence Unit

