Making sense of university rankings

The 2017 Good Universities Guide was released on Monday 29 August 2016 and, in it, prospective students will find a very different list of Australia’s top universities. 

The Guide singles out Charles Darwin, Central Queensland University, Charles Sturt University and the University of New England as top performers.

But in the various global rankings systems, it’s the University of Melbourne and other Group of Eight members that usually get the nod.

So with many unis claiming to be the best in some obscure ranking or another, how do you really understand what's going on?

Do the rankings matter?

Yes. But probably not how you think.

The unis take the rankings very seriously. They have staff with PhDs focused entirely on what they call performance management or external benchmarks. This is largely because a significant share of their income depends on the rankings: international students prioritise world rankings when choosing an institution.

“Between 70 and 80 percent of international students are now using rankings as their top decision-making factor,” says Tracey McNicol, from ANU’s Planning and Performance Management division. “So every uni is clamouring to be high up in the rankings.”

Prestige in the rankings tends to translate broadly into student demand, which in turn gives unis the resources to attract and retain higher-quality staff and to improve facilities.

Still, the Herald spoke to experts in rankings at several universities and all agreed you should take rankings with a grain of salt.

Understand what they are measuring

“If you’re a student, you need to figure out what the intent of the ranking is and what it is they are measuring, and how it applies to you,” says Dr Natalie Mast, the associate director for performance analytics at the University of Western Australia.

If one university shoots up 30 places in a list, for example, it’s more likely to be as a result of a change in the rankings methodology than some rapid boost in quality.

The global rankings put more emphasis on the number of superstar academics at a uni instead of, say, graduate employment rates, which might be more important to you.

And they struggle to accurately measure things like teaching quality. So a uni ranking in the top 100 doesn't necessarily mean you'll get a great learning experience there.

But Professor Merlin Crossley of UNSW says "to ignore the rankings, or deny their impact, is a mistake. Definitely consider the data."

Where can I find information about Australian unis?

[Table: Top performers in the Good Universities Guide. Source: The Good Universities Guide 2017. *Median salary of graduates from QILT data. **Proportion of graduates in full-time work 4 months after graduation.]

The Guide covers Australia only and has been criticised in the past for its star system, which some unis complain is misleading.

But unlike the global rankings, it is geared at judging unis by the student experience they offer, not by the quality of research output.

And it doesn’t provide an overall ranking, but rather rates Australia’s 39 universities in a series of categories – quantitative, such as salaries and graduate employment rates, and qualitative and survey based, like “overall education experience”.

Ross White, Head of Product at the Guide, says it presents a “holistic view”.

“A student may gain a high-level view of a university’s global reputation or research quality from the ARWU or the Times,” he says, “yet unlike in The Good Universities Guide, they will not gain specific insight into indicators such as the student experience, graduate outcomes or the characteristics of the student cohort.”

The Group of Eight, research-intensive universities which dominate the global rankings, don’t do that well in the Guide.

But the University of Notre Dame, for example, a small private institution in Sydney and Perth which has a non-ATAR entry system and no global research reputation, rates highly.

Professor Hayden Ramsay, Senior Deputy Vice Chancellor at the Sydney campus, says: "In its early years Notre Dame made a strategic decision to build on teaching and learning excellence, focus on the individual learner's needs and personal contact with lecturers in face to face teaching.

“Research rankings do not focus on the undergraduate student experience which is integral to our mission and to our students’ experience and success.”

The Guide has its detractors, with Central Queensland University last year saying its methodology was “flawed” and “sensationalist”.

"It graded universities into bands, or groupings," explains Tony Sheil, rankings expert at Griffith University. "Therefore only eight universities could receive five stars, eight four stars and so on. In research terms this means that a well-ranked university on the ARWU would, by virtue of the method used, be rated as three stars in the Guide.

“The bluntness of the star rating system gives the appearance of a wide degree of separation of performance against a given indicator, when in fact there might be very little difference.”


This article originally appeared in the Sydney Morning Herald – Making sense of university rankings
