With this week’s release of the 2013 QS World University Rankings by Subject, we have produced a series of videos which explain, in a somewhat unusual manner, the methodology behind compiling our Subject Rankings.
The first of the series presents the Top 10 Universities in Computer Science and gives a sneak peek of how we ‘really’ come up with our results!
The second important innovation in world rankings of universities in recent years has been the introduction of regional comparisons. QS published the first Asian ranking in 2009 and followed it in 2011 with the inaugural Latin American ranking.
Both supplemented the measures used in the QS World University Rankings with new criteria designed to reflect the priorities of the region. The results focused attention on universities of regional or national importance that do not feature prominently in the world rankings.
This year’s Latin American rankings will be published later this month, with the Asian equivalent following in June. As in 2012, a world ranking of universities that are less than 50 years old will accompany the Asian exercise, underlining the growing status of the continent’s youngest institutions.
QS is the only organisation to publish bespoke regional rankings. A recent listing of Asian universities by Times Higher Education merely extracted the scores achieved by Asian institutions from the magazine’s 2012 world ranking.
The QS Latin American ranking will rate the region’s top 250 universities on seven key indicators, including the proportion of academic staff holding a PhD and the web impact achieved by each university. The longer-established Asian ranking will have two more measures and will include the numbers of research papers published and the volume of student exchanges at each university.
The use of different indicators from those in the world rankings, and the exclusion of survey data from outside the region, result in a different order from the global exercise. In Asia, for example, Hong Kong University of Science and Technology topped the 2012 regional ranking even though Singapore, Hong Kong and Tokyo universities were more highly rated globally.
Students and universities themselves will be waiting to see whether the huge investment in research by China is beginning to pay off in statistical terms and whether India will make a long-awaited breakthrough. In Latin America, the focus will be on whether Brazil can realise its international ambitions, beginning with whether the University of São Paulo can hold onto its slim lead over Chile’s Pontificia Universidad Católica.
Universities all over the world use agents to bring in international students. The good ones are valuable allies for both institutions and applicants. But there is no shortage of tales of agents who cost too much and produced too little, in extreme cases causing reputational damage to the university itself. Now QS and its partner in India, Manya, have got together to analyse just what makes for a good relationship between university and agent.
A QS whitepaper (available here) found that over 500 US universities use agents. So do their competitors in the UK, Australia and other major destination countries for foreign study. In Asia, where time zones as well as language and culture differ from the US, they are an especially tempting option. It is estimated that 60-80 per cent of Chinese and Indian students studying abroad come via an agent.
In an analysis of the use of agents for graduate student recruitment, QS and Manya point out that agents tend to enter the frame at a late stage, once students have formed a good idea of where they wish to go. At that point, a good one becomes a trusted adviser. So it is important for them to have detailed knowledge of the institution they represent, which they may well never have visited.
This means that one key to using an agent effectively is to help them be well-informed, with a manual on the university and up-to-date knowledge of its academic offerings as well as its admissions system and its administration. It is always worth letting agents know which courses are most in need of additional applicants.
And try to remember that agents who seem pushy are just doing their job. A university takes an average of ten days to process an international student application. However, most students accept the first offer they get. So the agent is right to press for a quick decision. In the same way, it is only reasonable for them to ask universities for fee waivers, scholarships and other sweeteners to bring in the best students.
QS and Manya also find that even without these concessions, it probably costs $4,000-5,000 to recruit a student via an agent, half in the agent’s own fees and half in the form of extra travel, administration and student support. If this seems like a lot, it remains true that a good agent costs less than directly employed staff, and delivers more.
Finally, do not expect your agent to be an expert on high-level academic content. While they may give good advice on an appropriate Master’s course, they should never be involved in finding a PhD supervisor. Their role with PhD students should only begin once the student knows where they wish to go and who they want to work with.
Perhaps the most important development of recent years in international comparisons of universities has been the publication of subject rankings by QS.
The new edition published today is the most extensive yet, covering 30 different subjects. The rankings provide the only means available to prospective students of placing universities in order for their particular area of interest, rather than as whole institutions or broad faculty combinations.
A recent report on the impact of rankings by the European Universities Association said: “Comparisons between universities on a subject basis can be much more useful for them than global university league tables that try to encapsulate entire institutions in a single score.”
More than 2,500 universities were evaluated for the latest rankings, which for the first time include academics’ H-index in the calculations. A total of 678 universities feature in the top 200 for at least one subject.
The scoring system varies between subjects to allow for the different roles played by citations and the availability of other indicators. However, the main components are reputational surveys among academics and employers, and the research record of the university in the subject being ranked.
The leading institutions in the QS World University Rankings naturally dominate in many subjects – Harvard tops 10 of the 30 rankings – but the exercise also shines a light on centres of excellence in universities that do not reach the same heights in all disciplines. It also allows specialist institutions, such as Sweden’s Karolinska Institute in medicine, to demonstrate their quality.
After Harvard, the Massachusetts Institute of Technology (MIT) is the most successful university, finishing top in seven subjects. The University of California, Berkeley, and Oxford each topped the ranking in four subjects, Cambridge managed three, and Imperial College London and the University of California, Davis, one each.
Cambridge reached the top 10 in 27 of the 30 tables, the largest haul by any university. Oxford and UC Berkeley were next with 23, followed by Stanford with 22.
One more subject area has been added this year – agriculture and forestry, the discipline in which Davis (ranked 100 overall in the institutional table) triumphs. The specialist Wageningen University, from the Netherlands, is second for agriculture and Brazil’s Unicamp, the State University of Campinas, makes the top 20.
Ben Sowter, who is responsible for the rankings as head of the QS Information Unit, said: “Everyone talks about rankings, but QS started all this to help international students make smarter choices, and students tend to pick their subject before their university. Improving and extending these rankings by subject is central to our mission – expect to see more subjects and more universities evaluated in years to come.”
The full rankings for all 30 subjects are available at www.topuniversities.com/subject-rankings
Apart from some diehard cynics, everyone seems to agree that MOOCs are going to shake old-school higher education to its foundations. Or failing that, they are going to be a great marketing tool for prominent universities found near the top of the QS Rankings.
So earlier this year, I decided to test the waters by taking a MOOC. Having last been a student some decades ago (apart from the odd recreational evening class), I also reckoned the experience would reintroduce me gently to the world of formal learning.
But how to choose between the thousands of MOOCs on offer? Having written about European science for most of my career, I am now increasingly embroiled in China, so the Hong Kong University of Science and Technology course on Science, Technology and Society in China (Part 1, Basic Concepts), run by Naubahar Sharif, more or less chose itself. It had the advantage that it only took three weeks, so it was no big loss if it was terrible, while also leading into another more detailed course in the autumn. And HKUST is Asia’s top university, according to the QS University Rankings: Asia.
So how was it? Well, high-tech is not the word. The main content, three weekly batches of lectures, involves Sharif talking into a camera and showing some very unflashy PowerPoint. However, his style is good and he knows his stuff.
A worse problem is that the course has been badly misnamed. Week 1’s nine lecture segments did not touch on China except peripherally, being more of a canter round the science studies field (Kuhn, Popper and the like). In week 2, all the content was about China, and things got very interesting. But there was a heavy tilt towards innovation and no mention of any sort of science that does not lead to application. Week 3 was all about innovation, and only one of the seven segments was about China. It was really a course on innovation systems, with an emphasis on China.
However, to point to another much-discussed virtue of MOOCs, there did seem to be a pretty fair mutual support system, with a lot of email chat between students based all over the world, and with highly variable knowledge of English.
I certainly enjoyed rejoining the learning world, and by the end was even answering the quiz questions correctly. This involved relearning the basic principle of reading the question properly before answering it.
I also learned a lot from the marking phase of the MOOC. This involves everyone marking three other students’ assignments for each of the three weeks. Week 1 was about technological lock-in, a subject on which I edited a big UK government report a few years ago. The range covered everything from McKinsey-level analyses of this complex issue to folk who had missed the concept completely.
Like other MOOCs, this one is no more than a taster for a real course, but a lot of thought has gone into it and it has clearly excited a lot of bright students. Will I show up for the next phase this autumn? I don’t quite know, but I have the feeling that I just might.
This year’s revamped QS World University Rankings by Subject have been expanded to cover a record 30 disciplines, offering students the most detailed comparison of the world’s top universities at individual discipline level.
Taking in responses from some 70,000 academic experts and graduate recruiters worldwide, they draw on the largest surveys of their kind. Academics identified the leading universities within their field and area of expertise, while employers named the universities that they regard as producing outstanding graduates in a given discipline.
This year our research citations indicator has been supplemented with a new ‘H-index’ measure of research productivity and impact. The two measures in tandem help us to account more accurately for both the quality and quantity of a university’s research output in a given field.
Competition at the top
Across the 30 disciplines the number one spots are distributed among large US and UK institutions that operate primarily in English: Harvard (10), MIT (7), UC Berkeley (4), Oxford (4), Cambridge (3), Imperial College London (1) and UC Davis (1).
The 30 individual tables are not intended to combine to form an overall ranking, and indeed there is more than one way to interpret which university comes out on top if we attempt to do so.
While Harvard claims more top spots than any other institution, the university that appears in the top ten in the most disciplines is the University of Cambridge, with 27, ahead of Oxford and Berkeley on 23, followed by Stanford (22) and Harvard (21).
Cambridge’s near-blanket presence in the top ten indicates that, perhaps more than any other institution, it can claim to be world-class in nearly every major area of academic research. Yet Harvard and MIT have more departments that are truly world leading.
The view from employers
While US institutions remain preeminent for research, the rankings suggest that graduates from the UK’s two most famous institutions are more highly regarded by the world’s employers than those of their Ivy League rivals.
Employers regard Cambridge graduates as the world’s best in 13 of the 30 subjects, while Oxford ties with Harvard on seven, ahead of London School of Economics, University of Tokyo and UC Davis, top in one subject each.
US/UK dominance extends to nearly two-thirds of the elite positions – 397 of the 600 top-20 spots across the 30 disciplines. Yet there is plenty of evidence in these rankings of world-class departments outside this traditional power cluster.
Asia excels in engineering
The rankings feature several notable performances from Asian universities, particularly in the hotly contested areas of science, engineering and technology.
Nine of the top 20 institutions in civil engineering are Asian, led by Japan’s University of Tokyo (3rd) and Kyoto University (7th), Singapore’s Nanyang Technological University (8=) and National University of Singapore (11th), alongside three universities from Hong Kong and two from mainland China. The US and UK account for just five of the top 20.
“The shift in global economic power is transforming the international higher education landscape, with the likes of Hong Kong, Japan and Singapore emerging as genuine challengers to the traditional elite,” says QS head of research Ben Sowter. “Many institutions in Europe are struggling to keep pace in technical disciplines, in which financial resources are particularly crucial.”
The pace of change is demonstrated by the rapid development of young Asian tech-focused institutions. Hong Kong University of Science and Technology and Nanyang Technological University have been in existence for just over 20 years, yet both are now established in the global top 20 in several engineering and technical disciplines.
France and Germany feel the squeeze
France and Germany have both introduced ‘excellence initiatives’ to improve the performance of their top universities, and both can point to positive performances in some areas. Germany has five top-50 institutions for mechanical engineering, led by Rheinisch-Westfälische Technische Hochschule Aachen, and an impressive five institutions in the top 35 for physics – only the US can claim more.
France can also point to top-20 performances from three of its universities: Université Paris-Sorbonne (Paris IV) ranks 14th for modern languages, Sciences Po Paris is 16th for politics and international studies, and Université Paris 1 Panthéon-Sorbonne ranks 18th for law and 19th for history.
Yet the rankings also reveal areas in which both France and Germany are trailing in the wake of intensified global competition. Germany has no top-50 institutions in important areas such as mathematics and economics, while there are no French institutions in the top 50 in computer science or any of the four areas of engineering: chemical, civil, electrical and mechanical.
The increased competition that is squeezing some European institutions out of the global elite is coming not only from Asia, but also increasingly from Australia. University of Melbourne makes the global top ten in six subjects, ahead of the Australian National University on four, University of Queensland on two, and Monash University on one. Australian universities make the global top 20 in 25 of the 30 disciplines.
Mixed results for the BRIC nations
While Australia, Hong Kong, Singapore and Japan emerge as global players in several disciplines, the world’s major emerging economies see more mixed fortunes.
The rankings are positive for China, whose ambitious schemes to improve higher education standards over the last 20 years have yet to see its universities break into the top 20 of the overall QS World University Rankings. Here, however, Chinese universities appear in the top 20 in ten disciplines, with Tsinghua University ranking tenth in materials science and eleventh in statistics.
Brazil’s efforts to improve its research output have been less high profile, yet its universities have been steadily improving their international standing in recent years. Universidade de São Paulo in particular performs well here, ranking among the top 50 universities in the world in four disciplines. Brazil has 19 universities in the top 200 for at least one of the 30 subjects, compared with eight from Chile, five from Argentina, four from Mexico and two from Colombia.
Yet there are less encouraging signs from the remaining two BRIC nations, India and Russia. The Indian Institutes of Technology perform reasonably well in their specialist areas, with IIT Bombay, IIT Delhi and IIT Madras all making the top 50 in at least one of the engineering disciplines. However, there are 11 subjects in which not a single Indian institution makes the top 200.
The situation is worse in Russia, whose institutions feature in just eight of the 30 disciplines. The best performance comes from Lomonosov Moscow State University, which makes the top 50 in mathematics, a subject in which Russia has historically produced numerous world leaders.
A Reader’s Digest poll last month found that Taiwan offers some of the best value-for-money degrees in Asia. But political and demographic change may mean that the island’s higher education system will need a new economic model in years to come.
In a recent interview with Higher Education World, Han-Sun (Vincent) Chiang, president of Fu Jen University in Taipei, pointed out that the university has 27,000 students on a campus built for 10,000. This is partly because university tuition fees are low, just a few thousand US dollars per year. In addition, there is little mainstream government funding for teaching or research. And because higher education is politically important, with 90 per cent of high school graduates going on to university, fees are held down by central government.
Chiang, a genial medic whose big project is the construction of a 1,000-bed university hospital, says that it is impossible to grow Fu Jen’s student numbers any further. Indeed, it is getting harder to find students as the number of school-leavers falls. He wants to admit about 5,000 students a year rather than the current 7,000, which would take stress off the university and allow it to enhance student quality.
To make up the financial gap, Chiang plans to plunge more deeply into the international student market. This chiefly means mainland China, although here too there is a political problem. Fearful of “cross-straits” influences, the government limits the number of mainland students allowed on the island.
Another approach is to increase the number of postgraduates at Fu Jen. Chiang is especially keen on MBAs, as they pay well and can become generous alumni. But here too there is a political problem. Taiwanese law gives a much better tax break for donations to state universities – led by National Taiwan University, 80th in the QS World University Rankings – than to private institutions such as Fu Jen. The representative body for the private universities is lobbying on this anomaly.
But Chiang says that Fu Jen will not be joining in the possible mergers mooted between some more modest Taiwanese universities. Instead, he prefers to use the institution’s Catholic connections to build links to smaller Catholic universities on the island and to the many large Catholic universities around the world.
● A longer version of this interview will appear in the QS Showcase magazine later this year.
QS is about to publish the World University Rankings by Subject for the third time. They will be more comprehensive and detailed than ever.
The 2013 subject rankings will include a new subject, agriculture and forestry. Growing populations and changing dietary demands mean that this ancient human concern has never been more topical. We are sure you will want to know the top universities around the world for research and teaching in this area.
The addition of agriculture will bring the total number of subjects we cover to 30. Between them they cover the vast bulk of academic activity, whether in terms of teaching and student numbers, or of research.
Over the past year we have also looked at a range of other possible subjects for inclusion. However, agriculture is the only one for which we felt we had the data needed to provide a reliable outcome.
In addition to a new subject, we are amending the subject rankings by adding a new indicator.
In their first two years, we drew up the rankings on the basis of three measures: citations data, academic opinion and employer opinion. The weightings of the three were subject to “variable geometry.” In some subjects, for example, citations are more important than in others, and in these they would account for a higher share of a university’s possible score.
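To make that “variable geometry” concrete, here is a minimal sketch of how a subject-dependent weighted score could be combined. The subjects and weights below are hypothetical placeholders chosen purely for illustration, not QS’s actual figures:

```python
# Hypothetical per-subject weightings -- QS does not publish these exact
# numbers; they stand in for the idea that citations count for more in
# some subjects than in others.
WEIGHTS = {
    "philosophy": {"academic": 0.7, "employer": 0.2, "citations": 0.1},
    "computer science": {"academic": 0.4, "employer": 0.3, "citations": 0.3},
}

def composite_score(subject, scores):
    """Combine indicator scores (each on a 0-100 scale) using the
    subject-specific weights for that discipline."""
    w = WEIGHTS[subject]
    return sum(w[indicator] * scores[indicator] for indicator in w)

# The same indicator scores produce different totals in different
# subjects, because the weights differ:
scores = {"academic": 90, "employer": 80, "citations": 60}
print(round(composite_score("philosophy", scores), 1))        # → 85.0
print(round(composite_score("computer science", scores), 1))  # → 78.0
```

The point of the sketch is only the mechanism: a strong citation record moves a university further up the table in a citation-heavy subject than in one where academic opinion dominates.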
We are now adding a new measure to these three in the shape of the H-index. Readers of Higher Education World probably know all about this indicator, invented in 2005 by the physicist Jorge Hirsch, a professor at the University of California, San Diego. But if not, here is an article on the matter by Alex Bateman of the Wellcome Trust.
The H-index for an individual, or in our case for a department, combines the number of papers they have produced with the number of times those papers have been cited, so it rewards both quality and quantity. By contrast, our other citation measure is prone to being skewed by a small number of highly cited papers. Our analysis shows that the H-index correlates well with academic and employer opinion of university achievement in specific subjects.
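For readers who like a precise definition: the H-index of a set of papers is the largest h such that h of the papers each have at least h citations. A minimal sketch in Python (the sample citation counts are invented for illustration):

```python
def h_index(citations):
    """Return the H-index: the largest h such that h papers
    each have at least h citations."""
    # Sort citation counts in descending order, then find the last
    # position where the count still meets or exceeds its 1-based rank.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# A department with papers cited [10, 8, 5, 4, 3] times has an H-index
# of 4: four papers have at least four citations each, but there are
# not five papers with five citations each.
print(h_index([10, 8, 5, 4, 3]))  # → 4
```

This is why one blockbuster paper cannot inflate the measure on its own: a single paper with a thousand citations still yields an H-index of only 1.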
Don’t panic just yet. Despite everything you have heard about Massive Open Online Courses, your university will probably survive the current MOOC frenzy.
A meeting earlier this month at the University of London drew a lively audience of about 150 people to debate the MOOC phenomenon. Run by the University itself, which has been delivering distance learning since 1858, the UK’s Leadership Foundation for Higher Education, and the Observatory on Borderless Higher Education (OBHE), it heard that universities have several reasons for giving away their precious intellectual property online.
The first, as delegates heard from Bill Lawton of OBHE, is that MOOCs are rapidly starting to make money for universities. This mainly happens when the course is validated for academic credit. The result will probably be a Freemium model, like the one popular in the software industry, in which a cut-down version is free but the full item comes at a cost. There are already plenty of cases of a MOOC forming part of a degree course, and therefore involving payment for exams and marking. For example, the University of Texas MOOC offering, via provider EdX, forms part of its planned $10,000 degree, while there is also academic credit for MOOCs from four German universities via Udacity.
Many MOOCs are offered by big-name US universities that are happy to regard them as taster sessions for their highly-priced courses. However, surveys show that about 80 per cent of people taking them already have a degree, often a higher degree. This suggests that a typical MOOC user might well go on to buy a short course, for interest or for professional development, rather than a full degree.
But from the point of view of the university, the difference between a MOOC and conventional university attendance may well be the data it produces. Because the student’s every keystroke is recorded and analysed, it is possible to track student progress, find out where material is too simple or too complex, and see what sort of learning works best for which student. MOOC providers routinely use keystroke patterns to determine whether students are taking their exams or have brought in a substitute to provide the answers. MOOC entrepreneurs are already selling this data, which universities need to improve their provision.
Edwin Eisendrath of Huron Consulting Group in Chicago told the conference that the ability to analyse student learning as it is happening opens up fascinating possibilities for university management. At the moment, we accept that it is possible to measure research outputs, which are the principal driver of academic promotion. But in future, it might be possible to quantify teaching success, which could also be used in promotion decisions.
Tim Gore, director of global networks and communities at the University of London, told the conference that MOOCs need to be seen in context. Their interactive format suits Generation Y, whose members are more serious about peer approval than about praise from their elders. They have also arrived at a time of growing scepticism over the cost of conventional university study.
In future, MOOCs are also likely to be used as a test-bed for teaching innovation, for example by building in games as a serious teaching tool and by incorporating automated cues for students to complete projects. Diana Laurillard of the University of London’s Institute of Education said that they might well contain tested elements common to a range of subjects rather than being assembled on a craft basis like today’s university courses. But this still suggests that MOOCs are going to become a new form of short course, often with an emphasis on professional development, not a direct rival to full-scale university provision.
The European Union’s new €2 million ranking system for universities has run into trouble almost as soon as it was launched.
The U-Multirank scheme, which is designed to correct a perceived overemphasis on research in existing international rankings, had its official launch in Dublin late last month. Androulla Vassiliou, the European commissioner for education, said it would give students and institutions a clear picture of their performance across a range of important areas.
But some of the continent’s most prestigious universities have already said they will boycott the project, which relies on institutions to submit data. The League of European Research Universities, which represents 21 leading research-intensive universities, including Oxford and Cambridge, has described the project as “at best an unjustifiable use of taxpayers’ money and at worst a serious threat to a healthy higher education system.”
Kurt Deketelaere, secretary-general of LERU, said the organisation had serious concerns about the lack of reliable data for the indicators to be used in U-Multirank, as well as about the comparability between countries and the burden put upon universities to collect data.
U-Multirank will rate universities according to their research reputation, teaching quality, international orientation, success in knowledge transfer and regional engagement. There will be contributory indicators in each area, including graduation and employment rates, and, for knowledge transfer, the number of patents registered and companies started.
All the data will be supplied or checked by universities themselves. The Commission hopes to persuade at least 500 universities to opt into the first phase of the system. Most will be from Europe, with a small number of international institutions included for comparison. The first ranking, which will not be in the form of a conventional league table, is scheduled for early 2014.
The system was devised in collaboration with the Centre for Higher Education Development, in Germany, and the Centre for Higher Education Policy Studies, at the University of Twente, in the Netherlands. The EU Commission insisted that a feasibility study undertaken with 150 universities had shown that the concept of a multi-dimensional ranking was realistic.
The project has been allocated €2 million for 2013-14 from the EU education programme, with the possibility of a two-year extension if necessary to establish the system. After that, it must become financially independent.
U-Multirank has met with scepticism in the UK. David Willetts, the Universities and Science Minister, told a House of Lords Select Committee that the project “could be viewed as an attempt by the EU Commission to fix a set of rankings in which [European universities] do better than they appear to do in the conventional rankings”. The committee said the project might be a waste of taxpayers’ money and advised the Commission to prioritise other activities.