QS World University Rankings by Subject 2015 – challenges and developments

From a certain perspective, the work we do at a discipline level ought to be easy. After all, we don’t seek data directly from institutions to compile our rankings by subject, which removes a major data collection and validation overhead. However, the scale of the output, in our terms, is vast. Our main ranking aggregates performance in six indicators for just over 800 institutions and thus comprises around 5,000 individual processed data points; by contrast, our rankings by subject use up to four indicators in 36 subjects, for up to 400 published results each. All in all, the full analysis involves well over 40,000 processed data points.
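To put that scale in concrete terms, a quick back-of-the-envelope calculation, using only the figures quoted above and treating the "up to" values as upper bounds, shows how the totals arise:

```python
# Rough scale of the two exercises, from the figures quoted above.
main_points = 6 * 800          # main ranking: ~4,800 ("around 5,000")
subject_points = 4 * 36 * 400  # subject rankings, upper bound: 57,600

# Even with fewer indicators or fewer published results in some
# subjects, the subject exercise dwarfs the main ranking.
print(main_points, subject_points)  # 4800 57600
```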

Picking out trends, calibrating the approach, and identifying issues is a major effort – an effort which, I must confess, we underestimated in 2015.

In the coming days we will be releasing fact file information for the new version of the results prior to publication on April 29, and we expect to be similarly beset by questions as to how the results have been formed, what’s changed since the previous fact files we distributed, what can be inferred from year-on-year performance, and so forth. We’re aiming to give ourselves a little more time to get back to institutions with answers to their specific questions, but the most frequently asked question is likely to be: what has changed since the previous version?

A substantial majority of institutions have been remarkably constructive and supportive, despite previous results in some cases appearing to be a dramatic downward departure from the previous year. The feedback has been precise, intelligent and constructive, with many very specific observations which have been invaluable in our process rebuild. The international forum we ran in Sydney last month was one of the most engaging events I have had the pleasure to attend. I personally experienced a surprising degree of empathy: there seemed to be a genuine understanding that this is and has been pioneering work, and that it is deeply complex. It also provided us with an invaluable opportunity to listen to genuine experts in their fields about what we are doing and how it could be improved – above and beyond any observed concerns about this edition.

We are committed to maintaining an active dialogue with as many stakeholders as possible and deeply appreciate the volume and nature of feedback we have received around this. We have listened, and we have taken the opportunity not only to identify and address some issues with this year’s edition but also to introduce some further refinements based on feedback, which I feel genuinely improves the work.

Our advisory board have also been supportive of the refinements.

The five key changes since the previously distributed, but unpublished, version have been:

  1. The reintroduction of a regional weighting component in our survey analysis which had been inadvertently omitted
  2. The refinement of our analysis of the Scopus bibliometric database to address an issue where, in some instances, we had been counting articles only in the first subject to which they were categorized
  3. The adjustment of weightings in a further six subjects – making a total of nine subjects with modified weightings in 2015 – typically in favour of the citations and H measures – these changes are supported by the higher volumes of content from Scopus we have been able to retrieve in 2015
  4. The reinstatement of a paper threshold of 10 papers for English, and elevation of paper thresholds in Politics and History reflecting the higher volumes of research we are now taking into account
  5. The extension of our academic and employer survey samples to five years, with the earlier years weighted at 25% and 50% respectively. This stabilizes some of the subjects with lower levels of response and increases our total survey samples for this exercise to 85,062 academics and 41,910 employers
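As a rough illustration of point 5, a weighted multi-year survey window might be combined along the following lines. This is a minimal sketch, not our published method: the mapping of the 25% and 50% weights to specific years, and the use of simple response counts, are assumptions for illustration only.

```python
# Hypothetical weighting of a five-year survey window: the most recent
# three years count in full, the fourth year back at 50% and the fifth
# at 25%. The year-to-weight mapping here is an assumption.
YEAR_WEIGHTS = {2015: 1.0, 2014: 1.0, 2013: 1.0, 2012: 0.5, 2011: 0.25}

def weighted_responses(responses):
    """responses: list of (year, count) pairs for one institution.

    Returns the weighted total, with out-of-window years ignored.
    """
    return sum(YEAR_WEIGHTS.get(year, 0.0) * count
               for year, count in responses)

# 100 responses in each of the five years:
# 100 + 100 + 100 + 50 + 25 = 375 weighted responses.
print(weighted_responses([(y, 100) for y in range(2011, 2016)]))
```

Down-weighting older responses in this way lets low-response subjects draw on a larger sample without letting stale opinion dominate the current year's signal.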

Once the fact files are distributed we will make ourselves available to answer specific enquiries and are currently in the process of scheduling some dedicated webinars to explain the developments in more detail – these will be announced soon. We have already made some changes to our methodology pages and updated response levels, weightings and paper thresholds as well as publishing our map of the ASJC codes used to allocate Scopus content to subjects. Read more here.


Questions concerning the drop in ranking of Sultan Qaboos University

In the QS World University Rankings®, the position of Sultan Qaboos University has dropped from 377 in 2011 to 401-450 in 2012, and now to 501-550 in 2013. The university have contacted us requesting a statement as to the reasons for this, and whether or not it reflects a genuine deterioration in the university's performance. The reasons are threefold:

  1. A decline in performance in our academic reputation measure – a trend shared by many institutions in the region
  2. The inclusion of over 100 additional institutions in 2013 – whilst most are ranked lower than SQU, many perform better in some areas, thus reducing SQU’s relative scores
  3. A genuine data error in our 2011 data collection exercise where total staff were taken instead of academic staff, placing SQU in a higher than deserved position in our faculty student indicator and consequently overall

Based on information provided by the institution more recently, it appears that SQU’s previous ranking was incorrect and the subsequent decline in position has largely been down to the correction of this error; in reality, in areas such as faculty student ratio, the university has been genuinely improving.

QS is committed to transparency, honesty and integrity in compiling our rankings and admitting our mistakes is central to this philosophy. We would like to apologise for any inconvenience caused and are working with the university to notify stakeholders to mitigate any confusion. The following statement has been issued to the university on this basis:

In an exclusive statement, QS Intelligence Unit Head – Ben Sowter observed that most institutions in the Middle East featured in the QS World University Rankings 2013 have dropped in rank this year.

“Scores for academic reputation and research citations have declined across the region this year, which has caused most institutions to lose ground on the international competition,” Sowter said about the Middle East drop in rankings.

Sowter further added that “having said that, there were over 100 new universities added to the list this year; and many of the institutions worldwide already in the rankings have noticeably improved in academic reputation. This has led to some universities, such as Sultan Qaboos University in Oman, showing a drop in ranking, even though their score may have improved relative to last year.”

Commenting on the drop of Sultan Qaboos University, he said, “Sultan Qaboos University was first featured in 2011. We were initially unsuccessful in reaching anyone at SQU in order to file official numbers; hence figures available at the time on the university website were taken for the faculty and student numbers. However, it seems that the number used for academic staff was actually the total number of staff – thus inflating the faculty student ratio of SQU and resulting in a higher ranking. This was corrected by an official submission in 2012 by the administration at SQU. Since the position published in 2011 was unnaturally high, the drops in 2012 and 2013 have been largely corrective, rather than reflecting a deterioration in SQU’s actual performance”.

He added that “Analysis of our results over time reveals that institutions in general are producing more research, attracting more international research and doing a better job of communicating their achievements to the world at large. Increasingly, institutions need to exhibit continuous improvement just to maintain the same position, and a drop in overall ranking may not signify an objective deterioration in performance. Such may be the case for SQU.”