Inclusion in Rankings

The QS World University Rankings® began in 2004, and one of the first challenges was to identify an initial list of institutions to study further. For practical reasons, it would have been impossible to apply a methodology such as that set out in these pages to every university in the world: at a UNESCO event in 2011, it was estimated that there are around 20,000 universities worldwide. Beginning with the world’s top 500 universities by citations per paper, the list has evolved since 2004 in response to a number of stimuli:

  • Domestic Ranking Performance – the QS Intelligence Unit tracks a growing number of domestic rankings in an attempt to ensure prestigious universities are not excluded
  • Survey Performance – respondents to the Academic and Employer Reputation Surveys are invited to suggest any institutions they feel may have been omitted
  • Geographical Balancing – acknowledging that universities have different priorities and characteristics in different parts of the world, the balance of institutions from given countries and regions is periodically reviewed
  • Direct Case Submission – from time to time, institutions approach QS directly to request inclusion. QSIU evaluates each case on its merits, drawing comparisons against institutions already included in the ranking, and, subject to certain prerequisites and performance indicators being met, is open to including additional institutions

In 2012 the surveys featured over 3,000 institutions, over 700 of which were evaluated at either indicator or overall level in the QS World University Rankings®.

We recognise that higher education institutions can be very different from one another, but maintain that there is validity in comparing them, since most share certain common objectives – typically including the pursuit of cutting-edge research and the education of first-rate students. Certain kinds of institution may appear in other evaluations but are excluded, either entirely or partly, from our study. These are:

Research Institutes

Whilst this study does look at research metrics, it was considered inappropriate to include research institutes that do not have students. Notable exclusions on this basis include CERN in Switzerland, CNRS in France, the Max Planck Institute in Germany and the Russian Academy of Sciences. It is worth noting that, in countries where much of the research takes place in such separate facilities, the research measures for the universities themselves sometimes underestimate the research strength of their faculty members.

Single Faculty Institutions

Institutions that focus on only one of our five broad faculty areas tend to be smaller and more intensive, and also feel the full influence of any factors that affect their area of strength. These institutions can appear in faculty area and indicator tables but are excluded from our overall list. Notable cases include the Karolinska Institute in Sweden, HEC Paris, and Bocconi University in Italy.

Single Level Institutions

Institutions that operate at undergraduate level only or, more commonly, at postgraduate level only have certain natural advantages in areas such as student–faculty ratio or citations per faculty that would lead to anomalous placings in our overall table. Again, these are permitted to appear in faculty area or indicator tables, but are excluded from the aggregate list. Notable exclusions include Cranfield University in the UK, GIST (Gwangju Institute of Science & Technology) in South Korea and Jawaharlal Nehru University in India.

Institutions traditionally operating at one level, but recently introducing degree-level programs at the other, can be considered for inclusion a minimum of three years after the first class graduate from programs defined as within at least two of our five broad faculty areas.

Survey Solicitation or Promotion

We encourage universities and other higher education stakeholders to help us build the universe of respondents to our academic and employer surveys. However, it is not permitted to solicit or coach specific responses from expected respondents to any survey contributing to any QS ranking. Institutions found, in the opinion of the QS Intelligence Unit, to be doing so will be sanctioned. The approach we have adopted is intended to neutralize the effect, if any, of such campaigns on our survey findings by excluding new survey data from any non-compliant institution.

The rankings use five years of survey data, which for the forthcoming 2016/17 rankings will mean information gathered from 2012 to 2016 inclusive. For affected institutions, we shall use data from 2011 to 2015, and we shall omit survey data from 2016 in perpetuity. The same principle will apply to any institution contravening our policies in the years to come.
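The rolling window and the one-year shift applied to sanctioned institutions can be sketched as follows. This is a hypothetical illustration of the policy arithmetic described above; the function name and interface are illustrative, not QS's actual implementation.

```python
# Illustrative sketch of the five-year rolling survey window (assumption:
# this models only the policy arithmetic described in the text, not QS code).

def survey_window(ranking_year: int, sanctioned: bool = False) -> list[int]:
    """Return the five survey years used for a given rankings edition.

    The rankings draw on five years of survey data ending in the year the
    edition is compiled. For a sanctioned institution, the window shifts
    back one year, so the most recent year's data is omitted.
    """
    end = ranking_year - 1 if sanctioned else ranking_year
    return list(range(end - 4, end + 1))

# For the 2016/17 edition (compiled in 2016):
print(survey_window(2016))                   # [2012, 2013, 2014, 2015, 2016]
print(survey_window(2016, sanctioned=True))  # [2011, 2012, 2013, 2014, 2015]
```

Because the omission is permanent, a sanctioned institution's window never re-includes the excluded year; it simply ends one year earlier than everyone else's for that edition.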

QS aims to provide an inclusive and accurate ranking, but in cases of recurrent activity of this nature we will first apply the above approach to the survey indicator in question, and may later consider disqualifying an institution from the ranking altogether. QS runs sophisticated screening analysis to detect anomalous responses and routinely discards invalid ones. Any attempt to manipulate the results, or to solicit responses, may result in the disqualification of all responses for that survey for that year, valid or otherwise, where the source cannot be verified as entirely independent.

The above policy takes effect from May 2016.

False Responses

In an attempt to build the largest and most representative sample for its surveys, QSIU casts the net as wide as possible, but it also extensively checks and validates response data to detect institutions attempting to influence their position by submitting additional responses on their own behalf.

As a matter of policy, not only are responses found to be invalid discounted from consideration, but any institution found to be engaging in such activity will incur a further penalty in the compilation of results for the given indicator.