Ups and downs of university rankings tell us little about third-level quality

James McDermott

Published 21/09/2016 | 02:30

'I am not convinced that the rankings serve any useful purpose or that they deserve the level of media coverage that they seem to attract.' Photo: Depositphotos

It is that time of the year, when various rankings are published for universities. We do not need a formal ranking system to know that Irish universities will never be able to compete with the resources available to their counterparts in the United States.

Indeed, on receiving the Nobel Prize for literature, one Irish recipient is reported to have said he hoped he would at least be allocated a space in the Harvard University Nobel Prizewinners' car park.

I am not convinced that the rankings serve any useful purpose or that they deserve the level of media coverage that they seem to attract.

There is no doubt that anything based on rankings always makes for a good story, since someone will be moving up and someone will be moving down.

But do rankings really tell us anything about the state of our third-level education system?

The most important criterion in the QS World University Rankings is stated to be 'reputation', which is an inherently vague concept.

Of particular importance is 'academic reputation' in which scholars are asked to identify the institutions producing the best work in their field of expertise in what is, in effect, a "peer review" exercise.

Now there are approximately 7,500 universities in the world, of which QS ranks the top 900 or so.

Bearing in mind that even the most assiduous academic networker will have personal experience of only a very small fraction of these institutions, scoring highly in this category will usually come down to simple name recognition, which often has nothing to do with academic accomplishment.

For example, Georgia Tech and Boston College have both enjoyed a huge boost to their reputation in Ireland recently arising out of their meeting in the College Football Classic in the Aviva Stadium.

In addition, as individuals nominate themselves to take part in the process rather than being nominated, there is nothing to prevent people from attempting to game the system.

This caused controversy in 2013, when the president of UCC emailed academic staff requesting they contact three international academics to ask them to register with QS.

He attached a draft text, adding: "It is essential that the academics you contact understand the importance of UCC improving its university world ranking."

The second key QS performance indicator is citations, which are used to measure the influence of a piece of research; the idea being that the more often a piece of research is cited, the more influential it must be.

But again, this appears a somewhat simplistic way of assessing the impact of a piece of research.

For example, in the legal field an article on human rights abuses in Guantanamo Bay is open to being referenced in a huge variety of legal, political and international relations journals and is therefore likely to generate far more citations than an article on some obscure aspect of the Brehon Laws in ancient Ireland.

Creating a direct link between references and rankings has the unfortunate side effect of encouraging citation-chasing scholars to concentrate their efforts on high-profile topics that are, by definition, the least in need of further original research.

The third key criterion is the student-to-faculty ratio, which attempts to identify which universities provide small class sizes and a high level of individual supervision.

Now, even allowing for the absence of any agreed international standard to measure teaching quality, this seems like a crude method of assessment.

It is doubtful whether Michelin would retain its status as the world leader in restaurant rankings if it awarded its stars based on the ratio of servers to customers in a restaurant rather than on what the food actually tasted like.

The rankings can also be criticised for what they omit as well as what they include.

For example, no credit is given to universities that develop Access Programmes to help attract students from disadvantaged backgrounds.

We are always told that these rankings are important as they will be consulted by students in deciding which university to attend.

Now, I have assisted at various open days and have conducted numerous school visits over the years, but I have never had a single conversation about university rankings with any prospective student or their parents.

I am not aware of anyone consulting the QS World University Rankings before deciding where to study.

Concerns such as location, availability of accommodation, advice from teachers and what your friends are doing are all matters that are far more likely to affect that choice.

This is not to say that the rankings do not serve one useful purpose.

A plunge in the rankings of all but one of our universities this year certainly acts as a good excuse to remind the Government of the impossibility of maintaining standards when, over the last seven years, exchequer funding has decreased by 28pc, while at the same time student numbers have increased by 18pc.

This year things were taken a stage further when, on the day the rankings were released, the president of UCD and the provost of Trinity issued a joint statement calling for government action to address the funding crisis in higher education.

Until this funding crisis is addressed, there will be no need for any Irish university to seek planning permission for a Nobel Prizewinners' car park anytime soon.

Irish Independent
