Make decisions based on evidence, not on university rankings

Rankings need to focus more on educational quality rather than prestige and reputation

It is rankings season again. The latest Times Higher Education rankings are likely to send governments and institutions, in Ireland and around the world, into a new state of frenzy.

Rankings are a successful and lucrative business model. Both US News & World Report and Times Higher Education have been transformed into vehicles for rankings.

Before the web and social media, the former was a well-regarded news magazine. In 1983, it created its first ranking. Today it receives an average of 30 million site visits per month for its rankings on education, health, travel, real estate and so on.

Times Higher Education still considers itself an independent newspaper, but it is often difficult to distinguish between that role and being a marketing agent for its more lucrative rankings. QS is first and foremost a commercial company. Even the Academic Ranking of World Universities (ARWU), from Shanghai Jiao Tong University, has been spun off to form the Shanghai Consulting Company.

It is important to keep these facts in mind when asking whether rankings tell us anything meaningful – and whether they provide the appropriate benchmark for Ireland.

Is there a correlation between rankings and funding?

Much is made of the correlation between rankings and funding. Correspondingly, concern is often raised about the effect that significant reductions in funding can have on universities' positions in the rankings.

There is no denying that rankings measure wealth – whether garnered through institutional age, endowments, tuition or government investment. The top-ranked universities are a good example of this phenomenon.

For example, tuition fees at Harvard are more than €40,000 per annum while the university has an endowment of more than €33 billion. Stanford’s fees are also more than €40,000 per annum, with an endowment of almost €20 billion. Massachusetts Institute of Technology – ranked number one by QS in 2016 – has tuition fees of almost €36,000 per annum and an endowment of more than €10 billion.

Wealth gap

The wealth gap has widened over the years. But gauging performance through the lens of the top-ranked universities skews our understanding of national performance and educational quality, and creates perverse benchmarks.

Ireland has 21 universities and institutes of technology. On average, about five of our institutions (22 per cent) appear in the top 500 of the three main rankings. In comparison, only 6 per cent of US institutions appear in these rankings.

Ireland’s top-performing institutions enrol 40 per cent of total Irish higher education students; in stark contrast, US top universities enrol only a tiny minority of total students. Many countries now surpass the United States in the percentage of 25-34-year-olds with a bachelor’s degree.

Do rankings measure quality?

Rankings purport to tell us something important about quality. But do they?

Fully 70 per cent of QS indicators focus on research: academic reputation, PhD awards, research income, citations and so on. The Times Higher Education rankings give research-related measures more than 90 per cent of their weighting, while ARWU devotes 100 per cent.
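To see how such weightings shape the headline score, consider a minimal sketch in Python. The indicator names, weights and scores below are entirely illustrative assumptions, not the published QS, Times Higher Education or ARWU methodology; the point is simply that when research-related indicators carry most of the weight, the composite number says little about teaching.

    # Entirely illustrative weights and scores -- not any ranker's real methodology
    weights = {
        "academic_reputation": 0.40,   # research-driven (assumed)
        "citations": 0.20,             # research-driven (assumed)
        "research_income": 0.10,       # research-driven (assumed)
        "staff_student_ratio": 0.20,   # the main teaching proxy (assumed)
        "international_mix": 0.10,     # assumed
    }
    scores = {                         # hypothetical normalised scores (0-100)
        "academic_reputation": 85,
        "citations": 78,
        "research_income": 70,
        "staff_student_ratio": 60,
        "international_mix": 90,
    }
    composite = sum(weights[k] * scores[k] for k in weights)
    print(round(composite, 1))         # 77.6: one headline number, mostly research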

QS attempts to measure educational quality by using the staff-student ratio. However, international research consistently demonstrates that the quality of teaching matters far more for learning outcomes and student achievement than class size. In reality, many top professors may never teach, some may be poor teachers with little or no interest in their students, and students themselves may be disengaged.

Teaching ability

Times Higher Education uses a reputational survey. But it is unclear how anyone can genuinely rate someone's teaching ability without being in the classroom.

Finally, ARWU includes a category for educational quality, but instead measures alumni and academic staff winning Nobel and other prizes.

Can rankings be gamed?

Universities in Ireland and elsewhere are accused of contacting peers around the world and asking them to respond positively to reputational survey requests. But this controversy hides the real problem with such surveys.

Reputational surveys rely on very small response rates and are highly prone to "rater" bias. This occurs because respondents' information is usually limited and their answers rely on what they remember, or think they remember. For example, Princeton was reputed to have one of the best law schools in the US even though it did not have a law school.

The staff-student ratio presents a different problem. There are various methods to classify academic staff according to whether they are full-time or part-time or combine teaching with research.
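A small hypothetical example shows how much the reported ratio can move depending on how part-time staff are counted (the figures below are invented purely for illustration):

    # Hypothetical figures only -- illustrating how counting rules move the ratio
    students = 12000
    full_time_staff = 500
    part_time_staff = 300

    # Part-time staff counted by headcount versus as 0.5 full-time equivalents
    ratio_headcount = students / (full_time_staff + part_time_staff)        # 15.0
    ratio_fte = students / (full_time_staff + 0.5 * part_time_staff)        # ~18.5

    print(round(ratio_headcount, 1), round(ratio_fte, 1))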

However, the real cause of annual fluctuations in ranked performance is continual methodological change. Relatively simple changes can produce significant swings in results, creating greater publicity value for the rankings themselves. Allegations of "gaming" simply deflect attention away from the big problems with rankings.

The problem with rankings is that they make the pursuit of prestige and reputation, rather than educational quality, the prime driver of higher education. As discussion ensues about the future funding of higher education, let's ensure our decisions are based on real evidence and not on rankings.

Ellen Hazelkorn is the author of Rankings and the Reshaping of Higher Education: The Battle for World-class Excellence (2nd edition, Palgrave Macmillan, 2015).