Usher on (stupid) rankings
February 13, 2014

Posted on February 6, 2014 by Alex Usher

You may have noted the gradual proliferation of rankings at the Times Higher Education over the last few years. First the World University Rankings, then the World Reputation Rankings (a recycling of reputation survey data from the World Rankings), then the “100 under 50” (World Rankings, restricted to institutions founded since the early 60s, with a methodological twist to make the results less ridiculous), then the “BRICS Rankings” (World Rankings results, with developed countries excluded, and similar methodological twists).

Between actual rankings, the Times Higher staff can pull stuff out of the database and turn small bits of analysis into stories. For instance, last week the THE came out with a list of the “100 most international” universities in the world. You can see the results here. Harmless stuff, in a sense – all they’ve done is take the data from the World University Rankings on international students, foreign faculty, and international research collaborations, and turn it into a standalone list. And of course, using those kinds of metrics, geographic and political realities mean that European universities – especially those from the really tiny countries – always come out first (Singapore and Hong Kong do okay too, for similar reasons).

But when their editors start tweeting stuff – presumably as clickbait – about how shocking it is that only ONE American university (MIT, if it matters to you) makes the top 100, you have to wonder if they’ve started drinking their own Kool-Aid. Read that list of 100 again, take a look at who’s on the list, and think about who’s not. Taken literally, the THE is saying that places like the National University of Ireland, Maynooth, the University of Tasmania, and King Abdulaziz University are more international than Harvard, Yale, and Stanford.

Here’s the thing about rankings: there’s no way to do validity testing other than what I call the “fall-down-laughing test”. Like all indicator systems, rankings are meant to proxy reality, not represent it absolutely. But since there’s no independent standard of “excellence” or “internationalization” in universities, the only way you can determine whether the indicators and their associated weights actually “work” is by testing them in the real world, and seeing if they look “mostly right” to the people who will use them. In most international ranking systems (including the THE’s), this means ensuring that either Harvard or Stanford comes first: if your ranking comes up with, say, Tufts or Oslo as #1, it fails the fall-down-laughing test, because “everybody knows” Harvard and Stanford are 1-2.

The THE’s “most international” ranking comprehensively fails the fall-down-laughing test. In no world would sane academics agree that Abdulaziz and Maynooth are more international than Harvard. The only way you could possibly believe this is if you’ve reached the point where you believe that specifically chosen indicators actually *are* reality, rather than proxies for it. The Times Higher has apparently now gone down that particular rabbit hole.
