Don’t Go by Numbers
(June 3rd, 2014) Most global university rankings are based on debatable criteria. Such a system not only fails to capture the diversity of institutional profiles, it also tends to favour American universities. The European Commission has therefore come up with a more personalised approach.
Choosing a university is an arduous task. The numbers in university rankings make comparisons easy, but our distinct priorities influence our decision for or against an institution. While an undergraduate might look at a university’s teaching standards or the percentage of graduate students, a postdoc might be interested in sources of research funding, and foreign applicants might want to check out the university’s international reach. However, some of the major global university ranking systems, including the Academic Ranking of World Universities compiled by Shanghai Jiao Tong University and the Times Higher Education World University Rankings, base their scores solely on international reputation or bibliometric data such as the number of research citations and academic laurels. Such “inadequate indicators” fail to accommodate the diversity of institutional profiles.
The European Commission has recently launched U-Multirank, an online tool for comparing universities worldwide. U-Multirank is a joint effort of the Centre for Higher Education Policy Studies in the Netherlands and the Centre for Higher Education in Germany. It has been widely touted as a “multidimensional” and “user-driven” ranking portal that compares universities across five dimensions: teaching and learning, research, knowledge transfer, international orientation and local engagement. While still reporting bibliometric data like other rankings, U-Multirank additionally includes “self-reported” data collected from about 500 participating universities and from student surveys, as well as data on interdisciplinary publications and collaborations with industry. What is nice about U-Multirank is that it compares “like with like” and prevents dissimilar institutional profiles from obscuring the rankings. For instance, as project leader Frank Ziegele told Science, “There might be a university that has no 'A' (grade) for internationalization because it serves primarily a local or national audience. This is perfectly fine. This university fulfills an important function for society.” Besides whole-university comparisons, U-Multirank also provides field-specific rankings for physics, business, and electrical and mechanical engineering, with more subject areas to be added over the coming years.
This system uncovers the assets of European universities, which have long been overshadowed by American institutions. American universities, for example, “are absolutely on top in terms of citation rates and other classical criteria”, but, as per U-Multirank, “less renowned institutions, often from Europe, emerge at the top of the list in other categories, such as international publications and publications with industry”, remarks Ziegele. Despite its aim of achieving fairness, U-Multirank still faces a great deal of criticism. Firstly, the participating universities are mostly European. Secondly, bringing more universities into the system risks inconsistency, and even unreliability, in the self-reported data that U-Multirank relies on. For similar reasons, the project has failed to win the support of the British higher education establishment. Moreover, some critics believe that the simplicity of traditional “league table” rankings, however inaccurate, will continue to appeal to consumers.
It does seem an ambitious undertaking by the leaders of U-Multirank to collect the finest details of institutional profiles and consolidate the ratings in a standardised fashion, especially given the large number – 20,000 or so – of higher education institutions in the world. Nevertheless, the idea of a system that captures “nuanced areas of performance” is welcomed by educationists.