Article published by the British newspaper the Guardian, in its Education Guardian section, on 30 October, on the funding of science in the United Kingdom for 2008.
The United Kingdom has one of the most advanced R&D systems in the world, and that system is under enormous evaluative pressure from the Research Assessment Exercise (RAE), an exercise that, as this article shows, stirs strong controversy in the country's academic circles.
Since this is also a hotly debated subject in Chile, the article is recommended reading. I thank Daniel Uribe for drawing my attention to the Guardian piece.
See the full text below.
Related resources
Delayed innovation in the countries of Europe: what is to be done?, 29 October 2007
New technologies and innovation in higher education and regional development, 29 October 2007
Innovation: towards a new geography of innovative ideas. The case of Asia: China, India and the Republic of Korea, from the Atlas of Ideas project of the think tank Demos, whose influence on the United Kingdom's recent Labour governments is well known, 15 August 2007
The science of funding
Who will win and who will lose in the imminent decision on research grants? Natasha Gilbert and Anthea Lipsett survey the battle into 2008 and beyond
Tuesday October 30, 2007
The Guardian
Universities are nearing the finish line of a six-year race for research funding. They will have to wait more than another year to find out the results, but the findings could make or break several institutions.
The latest research assessment exercise (RAE) – the tortuous tool used to judge the quality of research done in UK universities and allocate funding – takes place in 2008 after a seven-year hiatus. The census date for academics to be counted passes this week, while the November 30 deadline for submitting research for judgment looms large. It has been a long battle, and one that has obsessed academics and their managers since 2001. All are keen to outdo each other.
Big changes are afoot in the way quality will be judged after 2008, but this time panels of academics evaluating each other's work will still hold most sway, though the research money universities attract and the number of PhDs they produce will also be considered.
The scale of the operation is mind-boggling. Roughly 1,100 academics and industry specialists from around the world sit on 67 subject panels. Their chairs sit on 15 main panels that will ensure that the judgments they make are fair.
Most panels have promised to read each and every one of the research outputs submitted, which will amount to hundreds of hours of work over the next year. And billions of pounds rest on the decisions they make. In 2006-07, the Higher Education Funding Council for England (Hefce) doled out £1.34bn in research funding, partly based on how universities fared in the last RAE. But contracts and money from the research councils and private companies also ride on how well universities do.
No one knows what the panels will decide, or how Hefce will use the results to allocate funding. But it is already clear the big research universities will tighten their grip at the head of the field.
Tables compiled by Evidence for Education Guardian show the big research players – Cambridge, Manchester and Oxford universities, along with Imperial and University College London – in the top slots. They draw on data from 2002 to 2006 encompassing science, arts and humanities subject areas.
While this is a “blunt instrument” for measuring universities’ potential to do well in the next year’s RAE, it does give a first indication of how they will fare.
In terms of research papers and impact, the results are surprising. The Institute of Cancer Research tops the table because of its specialised nature and the funding it gets as a result. But St George's, University of London, and Dundee University come second and third, with the most highly cited research papers. According to Evidence's figures, the 2,982 and 3,810 research papers they produced respectively – as recorded by Thomson Scientific, the company that monitors published research papers and how often they are cited – had among the highest impact.
On PhD numbers (all doctoral qualifiers between 2002 and 2006) and research income, the "big five" jockey for the top spots. But in the top 10 for PhDs are Edinburgh (2,344), Birmingham (2,722), Nottingham (2,344), Sheffield (2,179) and Leeds (2,152) universities. King's College London (£496,040) and the universities of Glasgow (£382,513) and Southampton (£376,200) join them on research income.
Stranglehold
This stranglehold by the big research players is easily explained: larger universities have more money, more academics and more research going on overall. Impact factors also favour the sciences, whose papers are published and cited more often than work in the arts and humanities.
But the fear is that the new number-crunching “metrics” system that is due to come in after 2008 will cement the existing hierarchy.
Steve Smith, vice-chancellor of Exeter University and chairman of the 1994 Group of small research universities, says: “The real difficulty in all of this is using historic data, which may well refer to people who have left but not people who have joined universities in the last couple of years.” (When research papers are published they are monitored and ranked according to how often they are cited – what “impact factor” they have.)
Exeter, for instance, has taken on 300 new academic staff since the last exercise, Smith says. “We will be submitting almost all of our staff, whose performance has been looked at by external assessors. But a large percentage weren’t here to get impact factors from research papers published up to five years ago.”
Impact factors, PhDs and grants are all historic – but not publication output, he says. “There’s no point in knowing how many runs a cricketer got in 2005. What matters is how they are playing now.”
A study by the vice-chancellors’ group Universities UK on metrics is due out by mid-November. A key concern is that, whether universities win or lose, the system will irrevocably skew the very data it sets out to measure.
And later in November, Hefce plans to present for consultation how it thinks metrics should work and what metrics will be used, along with peer review, for arts subjects.
Universities have had to get to grips with a number of fundamental reforms to how next year’s RAE will work compared with previous assessments.
Some of the changes were meant to stop universities playing games to win a higher position in league tables, including “poaching” staff from other institutions and not submitting staff who could bring a department’s average score down.
Reforms to RAE 2008 were also aimed at taking better account of interdisciplinary research and academics at an early stage in their careers. But some universities say the reforms have made things worse, and that the games continue.
Almost every UK university has set a minimum quality requirement for staff to reach to be included in the institution’s submission to the RAE, says Tim Benton, Leeds University’s pro-dean for research in the faculty of biological sciences.
“One of the main aims was for RAE 2008 to be more inclusive so that everyone in a department is submitted irrespective of quality. But in practice this has not worked,” he says.
“There are a number of researchers here who might have a big corpus of papers but who do not have four papers of high quality, so they are excluded. This is partly driven by the pressure to feature high up in league tables. If you have a significant number of low-ranked papers, inevitably you will fall in the tables,” he says.
“The transfer market in star researchers is alive and kicking this time round as much as last time,” Benton says.
An influential academic from another Russell Group university said 20% of research-active staff in his department had been excluded from RAE 2008.
“It is game-playing much worse than last time. We returned almost everyone in the last RAE. There are some very upset people. They do a good job and have published enough papers, but we can’t take the risk that they will bring our average down, so we are not returning them.”
One research scientist at a Russell Group university who has been omitted from the assessment said at first he was upset, but then thought, “what the hell”.
“I am being invited to speak at many international conferences, so I know I am doing the right things,” he says. The scientist works in an interdisciplinary field and on applied research and brings in £400,000 to £600,000 a year in grants, putting him in the top 10%-15% of earners for the university. This is the first year he has not been included in the university’s submission.
Because he works in a niche and novel area, his papers will be cited by fewer researchers and so will have a lower impact than papers published in an established field and a well-known journal, he says.
The RAE is counterproductive to the government’s aim of translating research into useful products and services, and encourages run-of-the-mill research, he adds.
“The system focuses too much on publications and impact and not enough on real-life measures, like spin-off companies,” he says. “It inhibits research into novel and niche areas where the most important future discoveries and research applications will be found. It’s nonsensical.”
Hard times ahead
Wendy Hall, an expert in computing at Southampton University and a member of the RAE assessment panel for computer science and informatics, says interdisciplinary research and young academics will have a much harder time in RAE 2008.
“For interdisciplinary research and for young researchers, the 2008 RAE will be much worse than last time. Because of the nature of interdisciplinary research, it will not have the impact on the field that a single disciplinary paper will have,” she says. “The 2008 RAE works against interdisciplinary research or anything novel because papers are judged on past impact. At least in the last assessment, we were also judged on our future strategy and plans.”
“At Southampton, we have put a lot of effort into research at the interface of the life sciences and computing and electronics. That clearly can’t have paid off just yet,” Hall explains.
“We also have young people developing new disciplines, so their papers will not yet have a great impact. We are concerned we will be penalised for this. Whereas if a department sticks with the traditional theory of computer science, for example, it will have more papers published in established journals and it will look better because the impact of the papers is higher.”