The OECD's Programme on Institutional Management in Higher Education (IMHE) has released its April 2009 Bulletin, which addresses issues relating to the quality of teaching.
Download the Bulletin here (290 KB).
Below is an article from the OECD's Observer magazine on university quality rankings and on ways of measuring that quality.
Breaking ranks
The OECD Observer, No 269, October 2008
University league tables are fashionable, but should not be taken as accurate measures of the quality of education. The OECD is investigating other tools to measure performance, policymakers and educators heard at a recent conference.
Rumour has it that the two biggest fans of university league tables are vice-chancellors and mothers in China. Do league tables accurately measure what they claim to measure? Or are they, as their critics claim, simplistic and damaging?
Excellence is the quarry. The trouble is, excellence only seems to appear in the top ranks. Surely no university can be “world class” in every aspect. Frans van Vught of the European Centre for Strategic Management of Universities put the problem succinctly at a recent conference called “Higher education: quality, relevance and impact” held at the OECD in Paris (see references). He said that if only 3% of universities out of 17,000 worldwide can be considered “world class”, it cannot mean that the rest are failures.
Exhortations to become world class have tucked universities into a Procrustean bed of indicators. Presidents anxiously cut back programmes, reorient their university’s mission, swell application numbers to tighten student selectivity, and seek mergers with higher-ranking institutions; conversely, those higher up jealously guard their hard-won reputations and shy away from collaborating with anyone but their peers. Deep excisions may be made into the social sciences and humanities to leave more room for the natural sciences and research. Speaking at the biennial conference of the OECD’s Programme on Institutional Management in Higher Education (IMHE) in September, Ellen Hazelkorn of the Dublin Institute of Technology cited a respondent to an international survey who said that “the easiest way to boost rankings is to kill the Humanities”. Clearly not a realistic proposition, though another respondent told her that “reputation, unfortunately, is always based on research, and research attracts the best talent”.
Research is considered the most salient example of a nation’s intellectual resources, economic strength and global competitiveness. It is no surprise then that real research rather than just teaching and studying for exams figures highly in two of the world’s most influential league tables. The Shanghai Jiao Tong University (SJT) ranking gives research a thumping 40% weighting, and includes the number of faculty publications and citations in journals. The Times Higher Education ranking gives research 20%. But the Shanghai ranking also attributes another 40% to faculty who have won Fields Medals and Nobel Prizes. This is a bit distorting, since not every university or college focuses on maths or science, and even if they do, there is no clear evidence that having a Nobel Prize winner on campus now, still less having had one fifty or more years ago, helps students learn. The Shanghai survey cautions against using its tables as an overall assessment of a university’s performance. But that, of course, is exactly what people and the headline-hungry media do.
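To see how much such weightings can matter, here is a minimal sketch, in Python, of a composite ranking score. The indicator names, the scores and the simplified three-way split of the weights are invented for illustration; this is not the actual Shanghai or Times methodology.

```python
# Illustrative only: invented indicator scores and a simplified weighting scheme,
# not the real Shanghai (ARWU) or Times Higher Education methodology.

WEIGHTS = {
    "research_output": 0.40,  # publications and citations
    "prize_winners":   0.40,  # Fields Medals and Nobel Prizes linked to the institution
    "other":           0.20,  # everything else, collapsed for simplicity
}

def composite_score(indicators: dict) -> float:
    """Weighted sum of indicator scores, each assumed to be on a 0-100 scale."""
    return sum(weight * indicators.get(name, 0.0) for name, weight in WEIGHTS.items())

# Two hypothetical institutions, identical except for historical prize winners.
uni_a = {"research_output": 70, "prize_winners": 90, "other": 60}
uni_b = {"research_output": 70, "prize_winners": 10, "other": 60}

print(composite_score(uni_a))  # 76.0
print(composite_score(uni_b))  # 44.0 -- a 32-point gap driven entirely by prizes
```

On this invented arithmetic, two institutions that teach and publish equally well end up 32 points apart simply because of the weight the compiler attaches to prizes.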
Short of hiring a Nobel Prize winner, what can a university do to improve its standing in these rankings, assuming it should try at all? Alas, the best way to gain prestige is to already have it.
The old schools like Harvard and Oxford are consistently in the top ten. Their mythic status draws the highest-calibre students and faculty, and guarantees generous endowments. Does reputation give them an unfair advantage and make them hard to dislodge?
Not necessarily. Other established universities as old as or older than Oxford, such as Freiburg in Germany, figure much lower in the Shanghai top 100, while Italy’s Bologna, one of the oldest of all, does not even figure.
Reputations may be based partly on assumption. For its peer-review weighting, the Times Higher Education queries some 200,000 academics worldwide by email, asking them to name what they feel are the top thirty institutions. With no clear indicators as to what they should evaluate, compounded by the fact that the Times Higher Education is published in English-speaking countries and that the average response rate is about 1% (roughly 2,000 replies), it is hardly a surprise when British institutions come out on top. It also depends on what is being measured: while neither Shanghai nor the Times rates French universities that highly, six of the top 10 in the FT 2007 European Business Schools ranking are French. Incidentally, the Financial Times offers quite a range of rankings of business schools, MBAs and the like, as much to earn revenue from the “infotainment” business as anything else.
There seems to be a whimsical element in the rankings. A university high in the ranks one year may suddenly plummet forty or more places the next. Is this possible? If so, what is to prevent it from rising fifty places the following year? Nothing, one may say. Also, there may be very little difference between institutions placed many tens of places apart, while small changes in the underlying factors have dramatic impacts. So why pay attention to rankings at all? To underline the point, the 2008 Times Higher Education survey, just published as we write, reduced Britain’s rankings quite markedly across the board, leading to more gloomy headlines about the state of the country.
To escape the squeeze of such indicators, some universities are turning the tables upside down, using their lower position as a marketing strategy, not least to attract internationally mobile students. Student mobility is higher than ever before: in most OECD countries international students make up about 6.7% of the student body; in Australian universities the figure is 19.3%, and in some institutions it is as high as 50%.
One of the biggest problems with rankings is not the accuracy of the data, but how that data is used and to whom it is applied. Another is that a good deal of relevant data has never been collected: how students are progressing in the classroom, for instance, or the level students have reached by the time they leave compared with when they joined. This makes it hard to build a full picture of university performance. The most serious omission is what, for many people, is the very reason universities exist: actual coursework.
Indicators such as “teacher/student ratio” are too feeble to tell us much about learning outcomes, and nothing about a teacher’s ability to teach. The Centre for Higher Education in Germany (CHE) publishes a variety of data from which students can construct their own rankings according to their needs. The CHE does not stamp institutions with a single number but places them in categories of “good”, “medium” or “bad”, and lists them alphabetically within each category. Universities in one category are of comparable quality, whereas those in different categories show a marked contrast in performance. Unlike rankings, this method prevents trivial differences from creating the false impression that one university is clearly better than another.
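A minimal sketch of that banding idea follows, with invented institution names, scores and thresholds; the real CHE groupings are built per subject and per indicator, not from a single score.

```python
# Illustrative sketch of category-based reporting: institutions are placed in
# "good" / "medium" / "bad" bands and listed alphabetically, not ranked 1..n.
# All names, scores and band thresholds here are invented.

scores = {
    "University A": 71.2,
    "University B": 71.0,   # only 0.2 behind, yet a numeric rank would split them
    "University C": 52.4,
    "University D": 83.6,
}

def band(score: float) -> str:
    if score >= 75.0:
        return "good"
    if score >= 55.0:
        return "medium"
    return "bad"

groups = {"good": [], "medium": [], "bad": []}
for name in sorted(scores):              # alphabetical order within each band
    groups[band(scores[name])].append(name)

for label, members in groups.items():
    print(f"{label:7s} {members}")
# good    ['University D']
# medium  ['University A', 'University B']
# bad     ['University C']
```

The 0.2-point gap between Universities A and B disappears into a shared band, whereas a numeric league table would present it as a difference in rank.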
Rankings also overlook the “value added” component of degree programmes. As might be expected, top universities draw A+ students and turn out A+ graduates. But what of universities that accept B students and produce graduates performing at A level? The value added by the B-student’s degree programme would be considerably higher.
The OECD Assessment of Higher Education Learning Outcomes (AHELO) study aims to determine whether it is possible to make meaningful statements about the education provided in universities in different countries, taking into account different “strands” of competence: skill in a chosen discipline, and generic skills such as critical thinking or the ability to apply knowledge practically. If successful, AHELO will provide institutions with analysis to help them improve their own performance, and will provide data to help students assess the suitability of an institution for their own needs.
Relevance is important. Is a degree programme relevant if it fails to prepare a student for the job market or demonstrates no obvious social impact? In a world increasingly nervous about shrinking job opportunities and faced with other challenges, the relevance of a degree is paramount, and the OECD conference discussed this point at some length. But while some may argue, say, that the natural sciences are more relevant than social history, as Robert Berdahl, President of the Association of American Universities, pointed out, the problems of contemporary society (migration, ageing, climate change, the legacy of colonialism and religious extremism) cannot be solved by the natural sciences alone. Nor is judging what is relevant just a matter for any one generation to decide. Look at the nuclear industry, which is back in favour as a source of energy, yet is faced with skill shortages because of years of unpopularity when students chose other subjects. Or what about today’s stock market crash, which appears to be rewriting the rules of economic orthodoxy by the day?
But rankings are not going to go away. After all, they can provoke useful questions, such as “why exactly are we not in the top group?”, or indeed, “how can we maintain this lead?”. Governments and universities will still use them, as fierce competition between universities induces copycat behaviour unless policy encourages diversity. Whether they serve as an accurate guide to higher education is therefore strictly academic.
Even people who do not like rankings cannot always resist them. Take this story, related by a woman from Germany at the IMHE conference. When she asked the vice-chancellor of a university in Bavaria whether he would like his university to be included in a league table, his response was a firm “no”. But then, when the woman succeeded in reassuring him that the university would not be included, he asked, just out of curiosity, where it would have been ranked compared to the others. LT
References
Presentations from the OECD conference “Higher education: quality, relevance and impact”, organised by the Programme on Institutional Management in Higher Education (IMHE) on 8-10 September 2008, can be found at www.oecd.org/edu/imhegeneralconference2008
For information on AHELO see www.oecd.org/edu/ahelo
©OECD Observer No 269 October 2008