Inclusion in Rankings
The QS World University Rankings® first appeared in 2004, and one of the first challenges was to identify an initial list of institutions to study further. For practical reasons, it would have been impossible to apply a methodology such as that set forth in these pages to every university in the world; at a UNESCO event in 2011, it was estimated that there are around 20,000 universities worldwide. Beginning with the world’s top 500 universities based on citations per paper, the list has evolved since 2004 in response to a number of stimuli:
- Domestic Ranking Performance – the QS Intelligence Unit tracks a growing number of domestic rankings in an attempt to ensure prestigious universities are not excluded
- Survey Performance – respondents to the Academic and Employer Reputation Surveys are invited to suggest any institutions they feel may have been omitted
- Geographical Balancing – acknowledging that universities have different priorities and characteristics in different parts of the world, the balance of institutions from given countries and regions is periodically reviewed
- Direct Case Submission – from time to time, institutions approach QS directly to request inclusion. QSIU evaluates each case on its merits, drawing comparisons against institutions already included in the ranking, and, subject to certain prerequisites and performance indicators being met, is open to including additional institutions
In 2012 the surveys featured over 3,000 institutions, with over 700 being evaluated at either an indicator or overall level in the QS World University Rankings®.
We recognise that higher education institutions can be very different from one another, but maintain that there is validity in comparing them, as they usually share a number of common objectives – for most, these include the pursuit of cutting-edge research and the education of first-rate students. Certain kinds of institution may appear in other evaluations but are excluded, either entirely or in part, from our study. These are:
Research Institutions
Whilst this study does look at research metrics, it was considered inappropriate to include research institutes that do not have students. Notable exclusions on this basis include CERN in Switzerland, CNRS in France, the Max Planck Institute in Germany and the Russian Academy of Sciences. It is worth noting that, in countries where much of the research takes place in such separate facilities, the research measures for the universities themselves sometimes underestimate the research strength of their faculty members.
Single Faculty Institutions
Institutions that focus on only one of our five broad faculty areas tend to be smaller and more intensive, and also feel the full influence of any factors that affect their area of strength. These institutions are able to appear in faculty area and indicator tables but are excluded from our overall list. Notable cases include the Karolinska Institute in Sweden, HEC Paris, and Bocconi in Italy.
Single Level Institutions
Institutions that operate at undergraduate level only or, more commonly, at postgraduate level only have certain natural advantages in areas such as student-faculty ratio or citations per faculty that would lead to anomalous placings in our overall table. Again, these are permitted to appear in faculty area or indicator tables, but are excluded from the aggregate list. Notable exclusions include Cranfield University in the UK, GIST (Gwangju Institute of Science & Technology) in South Korea and Jawaharlal Nehru University in India.
Institutions traditionally operating at one level, but recently introducing degree-level programs at the other, can be considered for inclusion a minimum of three years after the first class graduates from programs falling within at least two of our five broad faculty areas.