Posts

UCL

University College London to host IREG-7 conference

By Martin Ince

The IREG-7 conference in London, organised by QS and its partner organisations, is now only a few weeks away. It will be held at University College London, the fourth-ranked institution in the World University Rankings.

The theme for this conference is Employability and Academic Rankings, although there will be sessions on a full range of rankings topics.

To help us think about the link between university rankings and graduate employability in the global market, we have a distinguished panel of speakers from employers including Airbus and Siemens. Contributors from universities, and external observers from bodies such as the World Bank, will look at employability and skills as a new measure of higher education performance. This issue has emerged in recent years as a major concern for universities around the world.

There will also be strong sessions on current and future rankings systems, both global and, increasingly, regional, for example in the Middle East and the BRICS nations. An especially strong set of presentations will look at developments in Russia and Eastern Europe. In addition, the QS Asian University Rankings for 2014 will be released on May 13, immediately before the opening of the conference.

We very much hope to see you at IREG-7. The full programme is here and you can register here.


QS Top 50 under 50

QS started looking into the age of institutions when we took on a fascinating research project with the Australian Technology Network at the beginning of 2011. See this earlier post on the subject: http://www.iu.qs.com/2011/02/07/influence-of-age-on-university-performance/

In September 2011, we added an age component to our QS Classifications enabling users to easily see some of the different characteristics of institutions featured in the rankings.

Producing a table of the strongest “young” institutions seemed a natural next step. This is not a new ranking so much as a slice of our world rankings table, using age <= 50 as a filter to put the spotlight on some of the rising stars.
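For anyone curious about the mechanics, the filter really is as simple as it sounds. The sketch below is purely illustrative: the institutions, founding years and the 2012 reference year are invented assumptions, not QS data.

```python
# Purely illustrative sketch of the "age <= 50" slice.
# The rows below are invented; 2012 is assumed as the reference year.
AS_OF_YEAR = 2012
MAX_AGE = 50

# (world rank, institution, year established): hypothetical rows
world_rankings = [
    (31, "Old Example University", 1826),
    (88, "Example Institute of Technology", 1969),
    (145, "New Example University", 1991),
]

# Keep institutions no more than 50 years old, preserve their existing
# world-ranking order, and take at most 50 of them.
top_50_under_50 = [
    row for row in world_rankings if AS_OF_YEAR - row[2] <= MAX_AGE
][:50]

for world_rank, name, established in top_50_under_50:
    print(world_rank, name, established)  # prints the two post-1962 institutions
```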

Obviously, nothing is ever simple, and the exact establishment date of some universities can be difficult to identify. We have marked cases where some form of institution existed prior to the stated establishment date, and separately those that have undergone a merger or split more recently.

We fully expect a few institutions to come forward and let us know that they feel they ought to be included and have not been – we will evaluate each case carefully and make amendments as need be.

Unsurprisingly, the results feature many Asian universities and, in just a few years, may feature many more as institutions in the UK and Australia begin to age beyond the scope of the table.

The results can be viewed here: QS Top 50 under 50

Number crunching

We have had a number of requests recently enquiring how we get from A to B. These fall into two categories:

1. SUBJECT AGGREGATION

University A ranks only 270th in the 2011/2012 university rankings, yet the faculty-area rankings place it 48th in Social Sciences, 51st in Arts & Humanities, 59th in Life Sciences, 89th in Natural Sciences and 115th in Engineering & IT. Shouldn’t it rank better in the overall 2011/2012 table? By contrast, University B ranks 96th in the 2011/2012 university rankings while its faculty areas rank only 123rd in Social Sciences, 66th in Arts & Humanities, 150th in Life Sciences, 210th in Natural Sciences and 244th in Engineering & IT. On that basis, one might suppose that University A should be ranked better.

This case is simple. The faculty-area rankings we produce are based on academic reputation only. When aggregated, academic reputation contributes only 40% of the overall score, with another five indicators making up the difference. In the above case, University A will undoubtedly rank better for Academic Reputation, but will be let down by indicators such as Faculty/Student Ratio, Employer Reputation and Citations per Faculty, which are only considered at the overall level.
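As a rough illustration of that arithmetic, the sketch below aggregates indicator scores into an overall score. The 40% weight on academic reputation is stated above; the remaining weights follow the published QS weightings of the period, and the scores for the hypothetical “University A” are invented for the example.

```python
# Illustrative aggregation of indicator scores into an overall score.
# Weights: academic reputation 40% (stated above); the other five follow
# the QS weightings of the period. Scores below are invented.
WEIGHTS = {
    "academic_reputation":    0.40,  # the only input to the faculty-area tables
    "employer_reputation":    0.10,
    "faculty_student":        0.20,
    "citations_per_faculty":  0.20,
    "international_faculty":  0.05,
    "international_students": 0.05,
}

def overall_score(indicator_scores: dict) -> float:
    """Weighted sum of normalised (0-100) indicator scores."""
    return sum(WEIGHTS[name] * indicator_scores[name] for name in WEIGHTS)

# Hypothetical "University A": strong academic reputation, weaker elsewhere.
university_a = {
    "academic_reputation":    92.0,
    "employer_reputation":    55.0,
    "faculty_student":        40.0,
    "citations_per_faculty":  45.0,
    "international_faculty":  60.0,
    "international_students": 58.0,
}

print(round(overall_score(university_a), 1))  # 65.2, well below its 92 for reputation
```

In other words, a stellar reputation score carries an institution only so far once the other 60% of the weighting is applied.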

2. AGGREGATION BY INDICATOR RANK

Some institutions have questioned why their overall rank does not necessarily fall intuitively within the range of their indicator ranks. This, again, is relatively simple: the indicator ranks are not perfectly correlated, so institutions that are comparatively consistent across the board are likely to do better than institutions that are dynamite in some areas and weak in others. Consider the following example:

Institution A is ranked 250th in all indicators and achieves an overall rank of 225th.

Institution B is ranked 230th in all indicators except faculty/student ratio, where it ranks 610th; it has an overall rank of 270th.

This is a simplified, hypothetical example to demonstrate the point. Institution B displaces Institution A in every indicator but one, yet its single weak indicator drags its aggregated score below Institution A’s; combined with other comparable cases that affect all indicators evenly (in this model), the overall rank of Institution A ends up better than intuition might suggest.
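To see the arithmetic behind that, here is a minimal sketch with three invented institutions, three indicators scored 0-100 and an equal weighting; it is not QS data, only a demonstration that ranks derived from aggregated scores can sit outside the range of the separate indicator ranks.

```python
# Hypothetical scores: three institutions, three indicators, equal weights.
institutions = {
    "Consistent U": (70.0, 70.0, 70.0),
    "Spiky U":      (80.0, 80.0, 10.0),  # dynamite in two areas, weak in one
    "Middling U":   (75.0, 60.0, 72.0),
}

def rank(scores):
    """Return {name: rank}, 1 = best, ranking by descending score."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {name: position + 1 for position, name in enumerate(ordered)}

# Per-indicator ranks
for i in range(3):
    per_indicator = {name: values[i] for name, values in institutions.items()}
    print(f"indicator {i + 1} ranks:", rank(per_indicator))

# The overall rank comes from the aggregated score, not from the indicator ranks.
overall = {name: sum(values) / 3 for name, values in institutions.items()}
print("overall ranks:", rank(overall))

# Consistent U is beaten by Spiky U on two of the three indicators, yet its
# aggregated score (70.0 vs roughly 56.7) gives it the better overall rank,
# which falls outside the 2nd-3rd range of its individual indicator ranks.
```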

Ultimately, overall ranks are based on aggregated scores, and the differing characteristics of the indicators can easily produce circumstances where the overall rank does not appear to be an intuitive aggregation of the separate indicator ranks.

2011 rankings season draws to a close

By John O’Leary, QS Academic Advisory Board

This week sees the end of the international rankings season, with QS publishing the first-ever comparisons of Latin American universities and Times Higher Education (THE) issuing the second edition of its global rankings with Thomson Reuters.

The moment provides an opportunity to take stock of the main rankings before yet more organisations join the field. The European Commission, for example, may soon publish the first results from its U-Multirank project, while the OECD is still piloting its Assessment of Higher Education Learning Outcomes (AHELO) initiative, which tests students in different countries in a range of subjects from economics to engineering. Probably the most significant development of 2011 was the publication by QS of the first rankings by individual subject.

The 26 tables are the initial response to a demand from prospective students for more granular information on the university departments in which they will actually study.

There will be considerable interest in the academic community this week in the changes in methodology made by THE. The magazine’s attempt to broaden the focus of international rankings was welcomed by many of its readers, but the flaws in its original methodology underlined the difficulties inherent in such an approach. Read more

A look at the EUA’s Global University Rankings Report

by Martin Ince, convener of the QS Academic Advisory Board

Last month the European University Association, the representative body for higher education in 47 European nations, produced its report on Global University Rankings. The media reports suggest that it is critical of rankings while accepting that they are not going to go away. But what is its real message?

Written by Andrejs Rauhvargers of Latvia, the report concedes that students and their advisers find university rankings valuable, and that media and information firms appreciate the interest they raise. (This is certainly true of QS.) For these reasons, rankings local and global are certain to continue.

But despite the useful service that rankings provide for students and other audiences, the EUA report has reservations about their value. It begins by pointing out that the criteria used in rankings are chosen and weighted by the rankings compilers, giving them influence over what counts as university quality.

However, rankings compilers might reply that their criteria, and the weightings applied to them, have to come from somewhere. In the case of the QS rankings, the criteria used have been developed over time to be robust and reliable and to reflect as many aspects as possible of university life. At QS, we also have an active Advisory Board, made up of distinguished academic advisers from around the world who help us to think about these issues.

This misunderstanding is in keeping with the report’s extraordinary ignorance of QS’s World University Rankings. Its author seems not to know that we published the World University Rankings in 2010, the seventh in an unbroken series using comparable methodology. They have been seen by millions of people around the world online and in print. (He has noticed our collaboration with US News and World Report, one of our media partners.) This muddle suggests that at the very least, this report should be withdrawn in its current form and a corrected version should be issued. And incidentally, we have never seen our work as the “European answer to ARWU,” the Shanghai rankings. Read more

HE News Brief 17.5.11

by Abby Chau

IN THIS EDITION

  • ENGLAND: Universities Minister David Willetts continues to draw fire for his HE proposals
  • UK: The Guardian has just released its list of top UK universities, with Cambridge topping the league table
  • GERMANY: Universities are overcrowded and many are calling for reforms
  • FRANCE: New internationalisation strategy to target mobile students
  • AUSTRALIA: Losing its grip on mobile students
    Read more

Despite the dominance of the US and UK at the top of the table, subject rankings show that not all the best engineering happens at top institutions

by Martin Ince, a member of the QS Advisory Board

Engineering and information technology, the first subjects to be analysed in the QS World University Rankings by Subject, are popular with students, who appreciate the good careers they can lead to. Politicians, too, appreciate their importance. They supply the skilled people needed by manufacturing, which despite the growing importance of service industries remains a key source of prosperity and of well-paid jobs in export industries.

These five tables reveal that ambitious, internationally-mobile students of these subjects are likely to find themselves boarding a flight for the United States. The Massachusetts Institute of Technology is top on all five measures. As well as MIT, we see the big names of the US East and West coasts, including Stanford, Berkeley, Caltech and Harvard, in prominent positions.

But a detailed look at the tables shows that in practice, student choice is not so clear-cut. In chemical engineering, 25 of the universities in our top 50 are in the US, but the other 25 are not. For civil engineering, only 15 are in the US. Even for computing, 29 of the top 50 are not in the nation that gave us IBM, Google, Microsoft and Apple. For electrical engineering, 30 of the top 50 are outside the US, and for mechanical engineering, 27.

So these tables point towards universities that might provide high-quality engineering training for less eye-watering sums than study at MIT and its US rivals involves.

And whilst MIT is regarded by the academics in our survey as top in all five of these subjects, theirs is only one view. Employers see Cambridge – the one in the UK – as the world’s best place to recruit computer graduates. In chemical and civil engineering, they prefer both Oxford and Cambridge in the UK to MIT. For mechanical and electrical engineers, they prefer Harvard.

Read more

Assessing global higher education

by John O’Leary, a member of the QS Academic Advisory Board

How many different ways are there to assess global higher education? The QS subject rankings, the first of which will appear on April 5, represent one new way, giving students an international guide to quality in individual subjects for the first time.

The first five subjects, rated by academic and employer opinion as well as by citations, will cover computer science and four branches of engineering. Rankings in another 27 subjects will appear by the end of May.

The new rankings will be the first to give students an idea of employers’ views of the leading universities in individual subjects – something that is particularly important in disciplines such as engineering, where graduates tend to go into jobs directly related to their course. As a result, the big international employers often have a more sophisticated view of the qualities of graduates from different universities than those who recruit from the full range of subjects.

As the QS rankings of whole universities have shown, there can be subtle differences in the views of employers and academics on which are the leading universities. To have this knowledge at subject level will be an important addition to the information used by students in choosing universities across the world.

Whether they will be as enthusiastic about some of the other rankings that have come out recently is more doubtful. Times Higher Education, for example, having previously complained that the 40 per cent reputational element in the QS rankings was too high, published a ranking that was 100 per cent reputation and entirely derived from its last set of tables.

More innovative, but still of questionable value to students, was the British Council’s Global Gauge, which purported to judge which countries were the most international in the higher education sphere. Perhaps surprisingly, Germany came out top, ahead of Australia in second place, the UK (third) and the USA (sixth). China and Malaysia were fourth and fifth respectively. Read more

QS Subject Rankings 2011 – Engineering & Technology

The QS Subject Rankings are out; check out more results here.

Chemical Engineering Rankings 2011

QS Rankings Results: Engineering – Chemical
Rank | Institution                                  | Country        | Academic | Employer | Citations | Score
1    | Massachusetts Institute of Technology (MIT)  | United States  | 100      | 84.9     | 82.2      | 90.1
2    | University of Cambridge                      | United Kingdom | 85.4     | 93.8     | 56.8      | 79.3
3    | University of California, Berkeley (UCB)     | United States  | 87.9     | 64.4     | 77.5      | 77.7
4    | University of Oxford                         | United Kingdom | 71.5     | 94.1     | 60.3      | 74.9
5    | Stanford University                          | United States  | 73.3     | 73.9     | 75.1      | 74
6    | University of California, Los Angeles (UCLA) | United States  | 55.1     | 63.7     | 83.1      | 66.1
7    | California Institute of Technology (Caltech) | United States  | 72.6     | 43.8     | 67.8      | 62.5
8    | Imperial College London                      | United Kingdom | 69.9     | 65.7     | 47.8      | 62
9    | Yale University                              | United States  | 48.9     | 62       | 77.3      | 61.4
10   | National University of Singapore (NUS)       | Singapore      | 51.5     | 57.4     | 68.2      | 58.3

Going Global 2011: Can university rankings play a useful role?

by Danny Byrne, Editor of Topuniversities.com

“We should not, as academic institutions, abandon academic rigour, seduced by the spotlight of international rankings.” So argued Malcolm Grant, Vice Chancellor of University College London, at the Going Global Conference in Hong Kong earlier this month. Most intelligent commentators recognise that university rankings only capture certain aspects of institutional performance. But is that sufficient cause to write them off altogether?

Grant was taking part in a session entitled International Rankings: Where Do You Stand?, the aim of which was to establish whether or not university rankings actually provide any useful information. Or, in the terms chosen by the organisers, “Are rankings capable of playing a helpful role in enhancing communications and understanding to stakeholders in today’s global market for higher education?”

The objections to rankings voiced by Grant were part conceptual, part practical. He listed eight major ‘fracture points’ that undermine the validity of established rankings methodologies. Grant drew attention to the proxies rankings are forced to employ in the absence of direct data on some areas of university life, and the problematic nature of applying weightings to indicators.

“How do we weight data in a manner which is not only transparent, but which is intellectually compelling?” Grant asked. “I believe that this remains the biggest single drawback of the rankings that we have at the moment.” However, Grant, whose own university UCL recently announced plans to charge annual tuition fees of £9,000 from 2012, did not identify an alternative means by which students might compare it with other institutions around the world.

Read more