Overview

HEW Newsletter – Rankings Results 2015/16: An Overview

Despite the improved methodology described elsewhere in this issue of Higher Education World, the 2015/16 QS World University Rankings agree with last year’s on one thing: the Massachusetts Institute of Technology is the world’s top university. It has near-perfect scores on five of our six measures, and comes 62nd in the world on the other, its percentage of international students.

The stability of these rankings is also evident from the fact that the same institutions fill the top eight places as last year, although MIT is the only one in the same position. The most spectacular move affects Imperial College London, which is down from second to eighth place, largely because of a 59-place fall in its citations per faculty member count. This is likely to be due mainly to the reduced emphasis that we now place upon excellence in biomedicine.

Read more

Methodology

Methodology refinements explained

The World University Rankings are built on six robust measures which encapsulate the principal activities of global higher education. These measures are unchanged for the new 2015/16 Rankings, but as we explain here, the use we make of the data we collect has improved markedly this year.

The first two of these measures involve asking informed people to identify the high points of the world university system. We do this by means of two annual surveys: one of active academics around the world, and one of recruiters. The academics are asked to name their subject and the top 30 universities in that field, although they tend to vote for a median of about 20; they cannot vote for their own institution. The employers are asked to name the subject or subjects in which they recruit graduates, and where they like to recruit them. These two measures account for 40 per cent and 10 per cent respectively of each institution’s possible score in this ranking.
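To make the arithmetic concrete, here is a minimal sketch of how six weighted indicators combine into a single overall score. Only the 40 per cent and 10 per cent weights are stated above; the remaining four weights in the sketch follow the published QS split (20/20/5/5) and are included for completeness, and the example scores are purely illustrative.

```python
# A minimal sketch of a weighted composite score, assuming six
# indicators each scaled 0-100. The 0.40 and 0.10 weights are stated
# in the article; the remaining weights follow the published QS split
# and are included here only to make the example complete.

WEIGHTS = {
    "academic_reputation": 0.40,   # stated above
    "employer_reputation": 0.10,   # stated above
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

def overall_score(scores: dict) -> float:
    """Weighted sum of per-indicator scores (each on a 0-100 scale)."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
    return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)

# Illustrative profile: near-perfect on five measures, weaker on
# international students (the pattern described for MIT above).
example = {
    "academic_reputation": 100.0,
    "employer_reputation": 100.0,
    "faculty_student_ratio": 100.0,
    "citations_per_faculty": 100.0,
    "international_faculty": 100.0,
    "international_students": 88.0,
}
print(round(overall_score(example), 1))  # 99.4
```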

Read more


New Rankings: QS Graduate Employability Rankings

The QS Intelligence Unit is proud to announce the forthcoming release of a new ranking focusing on employability outcomes for graduates.

Leonardo Silveira, in charge of the project in London, told Tamara Sredojevic about the QS Graduate Employability Rankings:

What are the QS Graduate Employability Rankings?

The QS Graduate Employability Rankings come from an extensive research project which has been running since October 2014, aimed at designing a new approach and methodology for employability in university rankings.

Following the research project, we are going to launch a first edition this November at the 11th QS-APPLE in Melbourne. As a pilot initiative, this new ranking will not at first alter the other QS University Rankings results.

What initiated the QS Graduate Employability Rankings?

So far, the most prominent rankings have approached employability solely through employer reputation data. Yet that measure has always been one of the main differentiators of the QS University Rankings.

After dealing with this subject for many years, we realized there was a huge demand, from both students and universities, for in-depth information on employability outcomes after graduation. This is why we decided to create a whole new ranking dedicated to employability. Read more


Emerging Europe and Central Asia Ranking – The Significance?

In December 2014, QS published the first edition of the Emerging Europe and Central Asia (otherwise known as EECA) regional ranking. New? Definitely. Interesting? Certainly. Unexpected? Not at all.

I myself come from Eastern Europe and so have first-hand experience of education in that region. Whilst I haven’t studied at a university there, I received my primary and the best part of my secondary education there. I therefore know that few educators in the world are as thorough and as methodical in their teaching philosophy as those from Emerging Europe and Central Asia. If there’s one thing this says about them, it is that these nations take education very seriously and have a thing or two to share with the rest of the world. Read more

Changes ahead

Potential refinements in the QS World University Rankings 2015

Anyone who has seen me present will know that one of my most frequently used quotes is from the US statistician George Box: “Essentially, all models are wrong, but some are useful”. Rankings are controversial as much because they are imperfect and incomplete as for anything else. Were there a perfect answer, and had someone found it, there would be no space for debate, discussion and disagreement.

The QS World University Rankings were one of the first, and remain one of the most popular, international rankings of universities. Part of this popularity lies in their simplicity and part in their consistency – six weighted indicators drawn together to present a simple table representing a global hierarchy of world universities.

Despite the basic framework remaining the same since 2005, QS has not been afraid to listen and make refinements; switching to Elsevier’s Scopus database in 2007 was one such change. One of the well-known challenges in developing metrics from a bibliometric database like Scopus is accounting for the different patterns of publication and citation across discipline areas. Various efforts have been made to address this problem, with the Leiden Ranking perhaps the leading protagonist. Read more
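To illustrate the problem and the Leiden-style remedy, here is a toy sketch of field-normalized citation impact: each paper’s citations are divided by the average citation count for its field, so that disciplines with very different citation cultures become comparable. The field baselines and papers are invented, and this illustrates the general technique rather than QS’s or Leiden’s actual formula.

```python
# A toy sketch of field normalization: a paper's raw citation count is
# divided by the (hypothetical) world-average citations per paper in
# its field, so a well-cited maths paper is not swamped by ordinary
# biomedicine papers. All numbers here are invented for illustration.

field_baseline = {
    "biomedicine": 25.0,  # citation-heavy field
    "mathematics": 4.0,   # citation-light field
}

papers = [
    {"field": "biomedicine", "citations": 30},  # 1.2x its field norm
    {"field": "mathematics", "citations": 8},   # 2.0x its field norm
]

def normalized_impact(papers) -> float:
    """Mean of per-paper (citations / field average) ratios."""
    ratios = [p["citations"] / field_baseline[p["field"]] for p in papers]
    return sum(ratios) / len(ratios)

print(round(normalized_impact(papers), 2))  # 1.6
# The maths paper contributes more normalized impact than the more
# heavily cited biomedicine paper once field norms are applied.
```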


Why do students want to study abroad?

Studying abroad is a wonderful, professionally and personally enriching experience. It’s no wonder it’s becoming increasingly popular, with numbers going up from 2 to 4 million students in just the last decade. But what is it that students are looking for overseas?

Just in March we interacted with over 500 students from Italy, France, Moscow and the UK, with the intention of finding out what they value in a university. We were particularly intrigued to see whether there would be any variation by country.

This is what we found:

Read more


Rankings – What Do The Students Think?

Last September we celebrated the 10-year anniversary of the QS World University Rankings®, marking 10 editions of one of the most sought-after rankings in the world. Who’s interested? Academics, university leadership, media organisations, governments – and, of course, students.

Whilst it may be evident that rankings are growing in popularity and influence (QS is certainly not the only organisation to produce rankings), and although we know millions of students consult the rankings every year, it is unclear how they use them and just what the impact is. Given that the primary audience we compile our rankings for is prospective international students, we set out on a research project to answer these questions.

This new report, initiated by the QS Intelligence Unit and unambiguously titled ‘How Do Students Use Rankings?’, explores student motivations when selecting a university, with a view to better understanding the role rankings play in the journey from prospective student to graduate.

• How important is the rank of an institution compared with other factors such as course specification, location and student experience?
• Why study abroad and in an internationally recognised institution?
• How are you choosing what and where to study?

These are some of the questions we asked the students we met at QS international education fairs. The trends presented in the report are primarily based on a series of 11 focus groups held in London, Paris, Milan, Rome and Moscow, involving a total of 71 prospective students. We additionally ran a survey, collecting 519 responses, which allowed us to provide a balanced perspective based on a mix of qualitative and quantitative data.

Our findings were enlightening, yet completely in line with what one would expect to be on a prospective student’s mind. Whilst students shared a variety of ways in which they use the rankings and a wide range of priorities, when we pushed for the ‘absolute’ driving motivation, they overwhelmingly gave the same answer…

Read more


QS World University Rankings by Subject 2015 – challenges and developments

From a certain perspective, the work we do at a discipline level ought to be easy. After all, we don’t seek data directly from institutions to compile our rankings by subject, which removes a major data collection and validation overhead. However, the scale of the output, in our terms, is vast. Our main ranking aggregates performance on 6 indicators for just over 800 institutions and thus comprises around 5,000 individual processed data points; by contrast, our rankings by subject use up to four indicators in 36 subjects for up to 400 published results each. All in all, the full analysis involves well over 40,000 processed data points.
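The arithmetic behind those figures is a straightforward multiplication of the counts quoted above, treated as rough upper bounds:

```python
# Back-of-envelope check of the data volumes quoted above, using the
# article's own figures as rough upper bounds.

main_ranking = 6 * 800       # 6 indicators x just over 800 institutions
print(main_ranking)          # 4800 -> "around 5,000" data points

by_subject = 4 * 36 * 400    # up to 4 indicators x 36 subjects x up to 400 results
print(by_subject)            # 57600 -> comfortably "well over 40,000"
```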

Picking out trends, calibrating the approach, and identifying issues is a major effort – an effort which, I must confess, we underestimated in 2015.

In the coming days we will be releasing fact file information for the new version of the results prior to publication on April 29, and we expect to be similarly beset by questions as to how the results have been formed, what’s changed since the previous fact files we distributed, what can be inferred from year-on-year performance and so forth. We’re aiming to give ourselves a little more time to get back to institutions with answers to their specific questions, but the most frequently asked question is likely to be: what has changed since the previous version?

A substantial majority of institutions have been remarkably constructive and supportive, despite previous results in some cases appearing to be a dramatic downward departure from the previous year. The feedback has been precise, intelligent and constructive, with many very specific observations which have been invaluable in our process rebuild. The international forum we ran in Sydney last month was one of the most engaging events I have had the pleasure to attend. I personally experienced a surprising degree of empathy: there seemed to be a genuine understanding that this is and has been pioneering work, and that it is deeply complex. It also provided us with an invaluable opportunity to listen to genuine experts in their fields about what we are doing and how it could be improved – above and beyond any observed concerns about this edition.

We are committed to maintaining an active dialogue with as many stakeholders as possible and deeply appreciate the volume and nature of the feedback we have received. We have listened, and we have taken the opportunity not only to identify and address some issues with this year’s edition but also to introduce some further refinements based on feedback, which I feel genuinely improve the work.

Our advisory board have also been supportive of the refinements.

The five key changes since the previously distributed, but unpublished, version have been:

  1. The reintroduction of a regional weighting component in our survey analysis which had been inadvertently omitted
  2. The refinement of our analysis of the Scopus bibliometric database to address an issue where, in some instances, we had been counting articles only against the first subject to which they were assigned
  3. The adjustment of weightings in a further six subjects – making a total of nine subjects with modified weightings in 2015 – typically in favour of the citations and H measures – these changes are supported by the higher volumes of content from Scopus we have been able to retrieve in 2015
  4. The reinstatement of a paper threshold of 10 papers for English, and elevation of paper thresholds in Politics and History reflecting the higher volumes of research we are now taking into account
  5. The extension of our academic and employer survey samples to five years, with the earlier years weighted at 25% and 50% respectively (see the sketch after this list). This stabilizes some of the subjects with lower levels of response and increases our total survey samples for this exercise to 85,062 academics and 41,910 employers
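On point 5, here is a minimal sketch of one plausible reading of the five-year survey window: responses from the most recent three years count in full, while the two oldest years are down-weighted at 25% and 50%. The exact year-by-year split shown here is assumed for illustration; the list above states only the 25% and 50% figures.

```python
# A minimal sketch of the five-year survey window described in point 5,
# under the ASSUMPTION that the three most recent years count in full
# and the two oldest are weighted at 25% and 50% respectively.

YEAR_WEIGHTS = [0.25, 0.50, 1.0, 1.0, 1.0]  # oldest -> newest (assumed)

def effective_responses(counts_by_year):
    """Weighted response total for one subject, oldest year first."""
    return sum(w * n for w, n in zip(YEAR_WEIGHTS, counts_by_year))

# A low-response subject gains stability from the extended window:
print(effective_responses([40, 55, 60, 70, 80]))
# 247.5 effective responses, versus 210 from the last three years alone.
```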

Once the fact files are distributed we will make ourselves available to answer specific enquiries and are currently in the process of scheduling some dedicated webinars to explain the developments in more detail – these will be announced soon. We have already made some changes to our methodology pages and updated response levels, weightings and paper thresholds as well as publishing our map of the ASJC codes used to allocate Scopus content to subjects. Read more here.