Those Times Higher Education World Rankings (2013-14 Edition)
So, yesterday saw the release of the latest round of the THE rankings. They were a bit of a damp squib, and not just for Canadian schools (which really didn’t move all that much). The problem with actually having a stable methodology, as the Times is finding, is that there isn’t much movement at the top from year to year. This year’s top ten, for instance, is unchanged from last year’s, with only some minor swapping of places.
(On the subject of the THE’s top ten: am I the only one who finds it completely and utterly preposterous that Harvard’s score for industry funding per professor is less than half of what it is at Beijing and Basel? Certainly, it leads to suspicions that not everyone is filling out their forms using the same definitions.)
The narrative of last year’s THE rankings was “the rise of Asia”, on the strength of some good results from places like Korea and China. This year, though, that story wasn’t tenable. Yes, places like NUS did well (up 5 places to 25th); but the University of Hong Kong was down 8 spots to 43rd, and Korea’s Postech was down 10 spots to 60th. And no other region obviously “won”, either. But that didn’t stop the THE from imposing a geographic narrative on the results, with Phil Baty claiming that European flagship universities were “listing” – which is only true if you ignore Scandinavia and the UK, and treat things like Leuven finishing 61st, as opposed to 58th, as significant rather than as statistical noise.
This brings us to the University of Basel story. The THE doesn’t make a big deal out of it, but a university jumping from 134th to 68th says a lot – and not about the University of Basel. That the entirety of its jump can be attributed to changes in its scores on teaching and research – both of which are largely based on survey results – suggests there’s been some weirdness in the pattern of survey responses. All the other big movers in the top 100 (i.e. Paris Sud, U Zurich, and Lund, which fell 22, 31, and 41 places, respectively) also saw huge changes in exactly these two categories.
So what’s going on here? The obvious suspicion is that there were fewer French and Swiss respondents this year, thus leading to fewer positive responses for those schools. But since the THE is cagey about disclosing details on survey response patterns, it’s hard to tell if this was the case.
And this brings us to what should really be the lead story about these rankings: for an outfit that bleats about transparency, too much of what drives the final score is opaque. That needs to change.