Tuesday, 7 October 2014

DO NOT Trust University Rankings

“Useless” - Official Study Slams THE University Rankings

THE World University Rankings Decomposed and Discredited
(Excerpts of Study discussed here)

Singapore's NUS and NTU were recently ranked at the top by major international world university ranking vendors. One of them is the THE (Times Higher Education) World University Rankings.

A study commissioned by the Norwegian government has concluded that even the top rankings are based on such subjective weightings of factors and such dubious data that they are useless as a basis for information if the goal is to improve higher education.

The Norwegian Ministry of Education and Research commissioned the Nordic Institute for Studies in Innovation, Research and Education, or NIFU, to analyse Norwegian universities’ placements on international university rankings.

The ministry specifically wanted to know what the rankings meant for the universities in practice, and if there were factors at the national or institutional level that could explain the differences between Nordic countries.

Main Conclusions
The main conclusions regarding the ARWU (Shanghai Academic Ranking of World Universities) and THE (Times Higher Education) are that “placement on those rankings is to a large degree based on a subjective weighting of factors which often have a weak relationship to the quality of education and research.

“The rankings are based on data that to a varying degree are made available and made transparent. The rankings say almost nothing about education.

“The international rankings are therefore not useful as the basis for information and feedback both on research and education, if the goal is further improvement of … higher education institutions.”

“Decomposing” the THE Rankings
The NIFU methodology for “decomposing” the ARWU and THE rankings is extensive.

For each university there is a sophisticated analysis of which variables explain most of the variance, measured as the percentage deviation on each variable from the positioning of a benchmark group of universities.
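
NIFU's actual decomposition is far more elaborate, but the basic question – how much of the variance in a ranking score each variable explains – can be sketched as follows. The data and indicator names are invented for illustration only:

```python
import statistics

def r_squared(x, y):
    """Share of variance in y explained by x (squared Pearson correlation)."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return (cov / (sx * sy)) ** 2

# Invented scores for five universities: overall composite plus two indicators.
composite  = [95, 80, 72, 60, 55]
reputation = [98, 82, 70, 58, 54]  # tracks the composite closely
income     = [40, 70, 30, 65, 45]  # barely related to it

print(f"reputation explains {100 * r_squared(reputation, composite):.0f}% of the variance")
print(f"income explains {100 * r_squared(income, composite):.0f}% of the variance")
```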

This decomposition is a very illuminating exercise, because the standardised measures – for instance in THE – differentiate much better among the top-rated universities than among those with a lower ranking. This methodological ‘fallacy’ in THE is underlined several times in the report:

“In THE there is a 30.7 point difference between Caltech as number one and Pennsylvania State University at place 50, and for Helsinki University at place 100 there is only a further 10.9 point difference. Then there is only a 3.8 point difference between rank 101 and rank 149, and another 4.2 points between rank 150 and rank 199.

“The trend in both ARWU and THE is that the lower down the list you get, the smaller the difference between universities.”
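
A back-of-the-envelope calculation using only the point differences quoted above shows how sharply the scale compresses:

```python
# Average score gap per rank position in each band, from the THE point
# differences NIFU quotes.
bands = {
    "ranks 1-50":    (30.7, 49),  # Caltech (no. 1) to Penn State (no. 50)
    "ranks 50-100":  (10.9, 50),  # Penn State to Helsinki (no. 100)
    "ranks 101-149": (3.8, 48),
    "ranks 150-199": (4.2, 49),
}

for band, (points, positions) in bands.items():
    print(f"{band}: {points / positions:.2f} points per rank position")
```

Roughly 0.63 points separate adjacent universities near the top, but less than a tenth of a point separates them beyond rank 100 – far too little to differentiate institutions reliably.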

What is special about the THE, NIFU argues, is that 33% of the weighting in the ranking is decided by an international survey of academics – but these results are not made available in the THE report, where only the first 50 places are documented.
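
To see how much leverage that survey has, consider a minimal sketch of a weighted composite score. Only the 33% reputation share comes from the report; the remaining split and the 0-100 scales are simplifying assumptions, not THE's published formula:

```python
# Minimal composite-score sketch. The 0.33 reputation weight is from the
# NIFU report; lumping everything else into one 0.67 "hard metrics" term
# is a simplifying assumption, not THE's actual indicator scheme.
def composite(reputation, hard_metrics_avg):
    """Both inputs assumed to be on a 0-100 scale."""
    return 0.33 * reputation + 0.67 * hard_metrics_avg

# A 10-point swing in the opaque survey shifts the overall score by 3.3
# points - more than the 3.8 points separating ranks 101 and 149 above.
print(f"{composite(60, 70) - composite(50, 70):.1f}")
```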

The last THE reputation survey was done in 2012, NIFU said, and 16,639 academics in 144 countries responded – but THE does not say what the percentage response rate to the survey was.

The Harvard Example (“Always Scoring 100”)
The Institute says that Harvard University, which is nominated most frequently by the respondents, is given a score of 100, and universities further down the list are given a percentage of the “votes” Harvard gets. For instance, MIT, second on the list, gets 87.6% as many “votes” as Harvard.

This figure is only published for the first 50 entries in the ranking. Most universities after that may have received less than 1% of the nominations Harvard got, making it plausible that the proportions between universities fluctuate greatly from survey to survey.
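
A small simulation makes the fluctuation argument concrete. The nomination probabilities below are invented; only the normalise-to-Harvard scheme and the survey size come from the report:

```python
import random

random.seed(1)
# Invented nomination probabilities for three universities.
probs = {"Harvard": 0.20, "MIT": 0.175, "SmallUni": 0.0015}
N = 16_639  # respondents in the 2012 THE reputation survey, per NIFU

for trial in range(3):
    # Each university's nomination count is Binomial(N, p), simulated directly.
    votes = {u: sum(random.random() < p for _ in range(N)) for u, p in probs.items()}
    mit = 100 * votes["MIT"] / votes["Harvard"]
    small = 100 * votes["SmallUni"] / votes["Harvard"]
    print(f"trial {trial}: MIT = {mit:.1f}% of Harvard, SmallUni = {small:.2f}%")
```

MIT's percentage barely moves from trial to trial, while SmallUni's swings by a large fraction of its own value – exactly the instability NIFU warns about for universities outside the published top 50.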

The Institute argues convincingly that the weight given to the survey is an advantage only for the first placements in the ranking, and unreliable as a differentiation mechanism for universities further down the list.

THE Responds
Phil Baty, editor of the THE rankings, told University World News that the magazine had “not been consulted at all by the NIFU”.

Reputation, he said, formed “just a part of a comprehensive range of metrics used to create the Times Higher Education World University Rankings”. In all, 13 performance indicators were used, “covering the full range of university activities – research, knowledge transfer, international outlook and the teaching environment”.

“The majority of the indicators are hard, objective measures, but we feel it is very important to include an element of subjective reputation as it helps to capture the less tangible but important aspects of a university's performance, which are not well captured by hard data.”

However, “hard, objective measures” are NOT the same as Valid and Reliable measures of Excellence.

The methodology was devised, Baty stressed, after open consultation and was refined by an expert group of more than 50 leading scholars and administrators around the world. Care was taken to ensure the survey was fair, with countries receiving the right proportion of questionnaires, and it was distributed in multiple languages. Only senior academics were invited to respond to the survey, and all of them had published in world-leading journals.

Again, the THE ranking methodology has to be scientifically established with regard to its 13 vectors of measures and their respective Internal Consistency. It is NOT a matter of the collective educated opinion of a group of unnamed and unspecified Scholars and Administrators of unknown expertise.
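
For the record, internal consistency is a measurable property, not a matter of opinion. Cronbach's alpha is one standard statistic for it; the sketch below uses made-up indicator scores, since THE publishes no such analysis:

```python
import statistics

def cronbach_alpha(indicators):
    """Cronbach's alpha for k indicator score lists (one list per
    indicator, same universities in the same order in each)."""
    k = len(indicators)
    item_variance = sum(statistics.variance(ind) for ind in indicators)
    totals = [sum(scores) for scores in zip(*indicators)]
    return (k / (k - 1)) * (1 - item_variance / statistics.variance(totals))

# Made-up scores for four universities on three hypothetical indicators.
# The third indicator runs against the other two, so alpha comes out low
# (here negative), signalling the indicators do not measure one construct.
print(f"{cronbach_alpha([[80, 60, 70, 50], [75, 58, 72, 49], [30, 90, 20, 85]]):.2f}")
```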

“When Norway’s universities break new ground and push forward the boundaries of understanding in any particular academic field, they should be making sure that scholars across the world are aware of the discoveries, through the most appropriate means of dissemination – journal publications and conferences,” said Baty.

“This is the only way to ensure Norwegian universities get the credit they deserve. Other small countries have had tremendous success in the rankings – the Netherlands and Switzerland, for example.”

Local Views
The NIFU report was presented at a seminar in Oslo recently, capturing much attention. “Can we trust university rankings?” NIFU wrote on its website. “University rankings criticised,” declared the Ministry of Education and Research in a press release.

“A Kiss of Death for university rankings,” said University of Oslo Rector Ole Petter Ottersen in his blog, stating:

“This report should be made available for everyone working within the higher education sector in Norway. Not the least, it should be available on the news desk of Norwegian newspapers.”

The United Nations education agency, UNESCO, has challenged the validity and reliability of university rankings such as the THE Ranking:

“Global university rankings fail to capture either the meaning or diverse qualities of a university or the characteristics of universities in a way that values and respects their educational and social purposes, missions and goals. At present, these rankings are of dubious value, are underpinned by questionable social science, arbitrarily privilege particular indicators, and use shallow proxies as correlates of quality.”
                   
For the sake of Authenticity, Singapore universities should stay away from bogus ranking standards of dubious excellence.


8 comments:

  1. Mike, The biggest LIE and the most pervasive assumption in all these attempted measures is an unexamined faith in something called objectivity in social analyses such as these attempted rankings. There is a constant hankering after the kind of objectivity that hard science seems to be based upon. It's been called `Physics envy`. Objectivity is not possible in social analyses, and even in theoretical physics Heisenberg has shown how our observations materially affect the phenomenon observed. So-called `facts` are always part-fabrication (Latin facere); we construct what we find.

    1. Indeed sad to see fellow NTU Professors, so many pretty esteemed in their fields, falling for a bogus standard of dubious excellence, Robert. I guess a desperation for that elusive "ultimate" recognition easily compromises the integrity and authenticity values held so inviolable by True Academia. We in fact demand more rigorous validity and reliability proof from our Masters and PhD students, imagine that! I think Singapore's reputation for Truth and Authenticity has been hijacked to lend credibility to the bogus university rankings.

  2. Bro, this is a good write-up. Are you ok if we repost this on TRE? Thx!

  3. Hi, can I ask for your permission to reproduce this post on our education portal http://www.domainofexperts.com? Explicit mention shall be made of the fact it first appeared on your site, and we shall cite Michael Heng as the author. Hope to hear from you again :)

    1. Yes, you can.
      Sorry for this belated reply.
      Tkx.

      Michael

