Inclusion in Rankings
The QS World University Rankings® began in 2004, and one of the first challenges was to identify an initial list of institutions to study. For simple practical reasons, it would have been impossible to apply a methodology such as that set forth in these pages to every university in the world; at a UNESCO event in 2011 it was estimated that there are around 20,000 universities worldwide. Beginning with the world’s top 500 universities based on citations per paper, the list has evolved since 2004 in response to a number of stimuli:
- Domestic Ranking Performance – the QS Intelligence Unit tracks a growing number of domestic rankings in an attempt to ensure prestigious universities are not excluded
- Survey Performance – respondents to the Academic and Employer Reputation Surveys are invited to suggest any institutions they feel may have been omitted
- Geographical Balancing – acknowledging that universities have different priorities and characteristics in different parts of the world, the balance of institutions from given countries and regions is periodically reviewed
- Direct Case Submission – from time to time, institutions approach QS directly to request inclusion. QSIU evaluates each case on its merits, drawing comparisons against institutions already included in the ranking, and, subject to certain prerequisites and performance indicators being met, is open to including additional institutions
In 2012 the surveys featured over 3,000 institutions, with over 700 being evaluated at either an indicator or overall level in the QS World University Rankings®.
We recognise that higher education institutions can be very different from one another, but maintain that there is validity in comparing one against another as they usually have a certain number of common objectives – for most these include the pursuit of cutting-edge research and the education of first-rate students. There are certain kinds of institution that may appear in other evaluations but are excluded either entirely or partly from our study. These are:
Research Institutions
Whilst this study does look at research metrics, it was considered inappropriate to include research institutes that do not have students. Notable exclusions on this basis include CERN in Switzerland, CNRS in France, the Max Planck Institute in Germany and the Russian Academy of Sciences. It is worth noting that, in countries where much of the research takes place in such separate facilities, the research measures for the universities themselves sometimes underestimate the research strength of their faculty members.
Single Faculty Institutions
Institutions that focus on only one of our five broad faculty areas tend to be smaller and more intensive, and also feel the full influence of any factors that affect their area of strength. These institutions are able to appear in faculty area and indicator tables but are excluded from our overall list. Notable cases include the Karolinska Institute in Sweden, HEC Paris in France, and Bocconi in Italy.
Single Level Institutions
Institutions that operate at undergraduate-only or, more commonly, postgraduate-only level have certain natural advantages in areas such as student-faculty ratio or citations per faculty that would lead to anomalous placings in our overall table. Again, these are permitted to appear in faculty area or indicator tables, but are excluded from the aggregate list. Notable exclusions include Cranfield University in the UK, GIST (Gwangju Institute of Science & Technology) in South Korea and Jawaharlal Nehru University in India.
Institutions that have traditionally operated at one level but have recently introduced degree-level programs at the other can be considered for inclusion a minimum of three years after the first class graduates from programs defined as falling within at least two of our five broad faculty areas.
Survey Solicitation or Promotion
It is not permitted to independently promote participation in QS surveys, nor to solicit or coach specific responses from expected respondents to any survey contributing to any QS ranking. Should the QS Intelligence Unit receive evidence of such activity occurring, institutions will receive one written warning, after which responses to that survey on behalf of the subject institution may be excluded altogether for the year in question.
QS aims to provide an inclusive and accurate ranking, but in cases of recurrent activity of this nature it will first apply a score penalty to the survey index in question and may consider disqualifying an institution from the ranking altogether. QS runs sophisticated screening analysis to detect anomalous patterns in responses and routinely discards invalid responses. Any attempt to manipulate the results or to solicit responses may result in the disqualification of all responses for that survey for that year, invalid or otherwise, where the source cannot be verified as entirely independent.
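QS does not publish the details of its screening analysis. Purely as a hypothetical illustration of one kind of check such screening might involve, the sketch below flags clusters of responses that share an email domain and overwhelmingly nominate a single institution; the field names (`email_domain`, `nominated`) and thresholds are assumptions for the example, not QS’s actual method:

```python
from collections import Counter

def flag_suspect_domains(responses, share_threshold=0.5, min_cluster=3):
    """Return email domains whose responses form a sizeable cluster that
    overwhelmingly nominates one institution.

    `responses` is a list of dicts with hypothetical keys 'email_domain'
    and 'nominated'. Thresholds are illustrative only.
    """
    # Group nominated institutions by the respondent's email domain.
    by_domain = {}
    for r in responses:
        by_domain.setdefault(r["email_domain"], []).append(r["nominated"])

    flagged = set()
    for domain, nominations in by_domain.items():
        if len(nominations) < min_cluster:
            continue  # too small a cluster to judge
        top_institution, top_count = Counter(nominations).most_common(1)[0]
        # A large cluster pushing a single institution looks anomalous.
        if top_count / len(nominations) >= share_threshold:
            flagged.add(domain)
    return flagged
```

In practice, real screening would combine many such signals (timing, provenance, response patterns) rather than rely on any single rule like this one.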
The above policy takes effect from April 2013, before which time the stated policy was as follows:
It is not permitted to solicit or coach specific responses from expected respondents to any survey contributing to any QS ranking. Should the QS Intelligence Unit receive evidence of such activity occurring, institutions will receive one written warning, after which responses to that survey on behalf of the subject institution may be excluded altogether for the year in question.
It is acceptable and even encouraged for institutions to communicate with employers and academics worldwide to showcase their achievements. Institutions are welcome to invite contacts to sign up for possible selection for our survey using our Academic or Employer Sign Up Facilities, but any message soliciting a specific response in our surveys represents unfair manipulation of the results and will not be tolerated.
In an attempt to build the largest and most representative sample for its surveys, QSIU casts the net as wide as possible, but response data are extensively checked and validated to guard against institutions attempting to influence their position by submitting additional responses on their own behalf.
As a matter of policy, not only are responses found to be invalid discounted from consideration, but any institution found to be engaging in such activity will attract a further penalty in the compilation of results for the given indicator.