Evaluation of academics beyond bibliometrics
October 21, 2022

Assessment of scientists needs to go beyond bibliometrics

Researchers publish their findings in order to ensure widespread dissemination of their work, primarily within a community of their peers, where it will be discussed, assessed and built upon. Good publications have the potential to enhance the reputation of the author, to attract new students and collaborators, to establish priorities and to obtain resources and salary increases.

Many funders and universities end up determining the reputation of an academic simply by counting their publications and citations. This, in turn, is reflected in the mechanisms, not always ethical, used to inflate those numbers.

The effect of the pressure to publish may be seen most clearly in the increase in scientific plagiarism and fraud, much of which is relatively minor and likely to escape detection. The temptation to behave dishonestly is great. All too often the main reason for a paper seems to be simply to appear in a report which will be analysed by some committee.

Requiring international publication in high-impact journals is a way for governments and universities to transfer complex assessments to external judges. Journals indexed in the Web of Science and Scopus are preferred when it comes to quality control. This tactic is widespread in both developed and developing countries.

University hiring procedures

To apply for a position in higher education at an elite university in the United States, a person needs a PhD or similar degree from an excellent institution. Additionally, they need to publish extensively in their field, make new contributions, and pick up teaching experience at the university level.

In these activities, they will need to stand out from their peers in order to receive positive recommendations from faculty members familiar with their work. Hiring committees typically select between six and 10 candidates, culled from hundreds of applications, to interview for any open position. Most candidates are PhDs from other universities; some may be faculty looking to move, or well-known people from industry labs.

On the path to becoming a full professor, it is necessary to continue publishing, serve as a leader and teach exceptionally well. It may be important to publish books, or to make another type of significant contribution to your area of study.

An assistant professor on the tenure track will be evaluated on these criteria after seven or eight years of service to the university. After this period, each candidate presents a dossier of publications, course evaluations, a resume, peer evaluations and research summaries for review.

If approved for tenure, the applicant will be promoted to associate professor. An associate professor can later become a full professor if he or she continues demonstrating excellence in scholarship, leadership and teaching. Then he or she can obtain an endowed chair and negotiate his or her salary.

The best universities follow careful and detailed procedures, but many others do not. In the latter, the established procedures for hiring and promotion are simpler, faster and flawed, based mostly on diplomas and on the number of publications.

Some history on scientific publication

Three centuries ago, to communicate their research, scientists had to have enough ideas to fill an entire book or write letters to colleagues who might be interested in their work. One group of scientists who exchanged letters all over Europe at the time was known as the Invisible College.

In the mid-17th century, some groups conceived of a new way of communicating scientific results: meeting in person to share them. These gatherings grew into the first academic societies, such as the Accademia del Cimento (founded in 1657), the Royal Society (founded in 1660) and the French Academy of Sciences (founded in 1666).

In 1665, the first secretary of the Royal Society, Henry Oldenburg, had the first issue of the Philosophical Transactions printed. He published correspondence on scientific observations, experiments and reports on advances reported at meetings of the Royal Society.

Today we can still read (free of charge) the first editions on the Royal Society website. Other scientific journals soon followed, and after some time, the journal article became the main method of communicating scientific results.

The possibility of establishing priority in disputes over scientific discoveries promoted the scientific paper as the best means of communication. National pride about scientific discoveries began to grow.

After World War II, research entered a period of unprecedented growth. Science became synonymous with innovation, especially in relation to defence. The US government emerged as the primary patron of scientific endeavour, not just through the military but also through newly created agencies such as the National Science Foundation and the rapidly expanding university system.

This trend has been followed, with varying success, by most nations. Scientific establishments began to grow into powerful governmental bureaucracies.

The business of science publishing

Despite a restricted audience, the business of scientific publishing is very important. With a total global revenue of more than US$25 billion, it is somewhere between the music and film industry in size, but more profitable. In 2010, the science publishing arm of Elsevier reported profits of US$962 million on just over US$2.7 billion in revenue. It was a 36% profit margin – bigger than what Apple, Google or Amazon had that year.

To make money, a traditional publisher – say, a magazine – has to cover a number of costs: pay writers for articles; employ editors to commission, shape and check the articles; and distribute the finished product to subscribers and retailers.

The way to make money from a scientific article is very similar, except that scientific publishers manage to get rid of most of the real costs.

Scientists do research and write articles – largely funded by their governments – and offer them for free to publishers; most of the editorial burden – checking scientific validity and evaluating experiments, a process known as peer review – is done by scientists working on a voluntary basis.

Publishers then sell the product back to government-funded institutional and university libraries to be read by scientists who, in a collective sense, created the product in the first place.

It’s a bizarre triple-payment system where the state pays for the research, pays the salaries of those who check the quality of the research and then buys the published product.

At the end of WWII, the word ‘science’ (which politicians associated with the atomic bomb and radar) was synonymous with progress and national security. Vannevar Bush, who had directed the US wartime research effort as head of the Office of Scientific Research and Development, convinced President Harry S Truman of that. Thus, the American government became the strongest supporter of the scientific effort, not just for military purposes, but through new agencies such as the US National Science Foundation and the university system.

When a new journal is published, scientists are happy to have more space to display their research and ask the university library to subscribe to it. The scientific paper has essentially become the only way in which science is systematically represented in the world. Institutions spend billions of dollars a year and in exchange receive paper or digital files.

Citation indexing

Derek J de Solla Price became a significant player in the advancement of citation indexing when he formulated his theory on the exponential growth of science.

While looking after his university’s complete run of the Philosophical Transactions of the Royal Society and placing the volumes in chronological order, Price noticed that their size increased exponentially with time.

He obtained empirical statistical evidence from various fields of science, all of which showed that science grows exponentially: its size, measured by manpower or by the number of publications, doubled roughly every 10 years.

Considered broadly, this rate of expansion implies that from the 1600s onwards such measures of the size of science would have increased by a factor of 10 to the power of six, growing much faster than the total number of humans able to conduct it.

If science continued to grow at this exponential rate, Price observed in 1962, there would eventually be more scientists than people. Clearly, exponential growth cannot continue indefinitely, and the slowing of growth rates will raise problems around the allocation of resources.

Price conjectured that exponential growth should continue until it reaches a maximum size and then cease growing.
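As a back-of-the-envelope formalisation of this argument (the symbols N_0, T, r and N_max below are illustrative, not figures taken from Price’s data), exponential growth with a fixed doubling time T can be written as

N(t) = N_0 \, 2^{t/T},

so that k doublings multiply the initial size N_0 by 2^k; the factor of 10 to the power of six quoted above corresponds to about 20 doublings, since 2^{20} \approx 10^{6}. The saturation Price conjectured is commonly modelled with a logistic curve,

N(t) = \frac{N_{\max}}{1 + e^{-r\,(t - t_0)}},

which grows almost exponentially at first and then levels off near a ceiling N_max.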

The quantitative study of science, Scientometrics, and its application to science policy became the principal focus of Price’s work from the 1960s onwards. In 1963 his best-known book Little Science, Big Science was published.

Early in that year, he met Eugene Garfield, founder of the Science Citation Index (SCI), and they began a lasting collaboration. SCI would provide most of the data for his quantitative work, allowing studies not just of the quantity of scientific publication, but, for example, of the impact of those publications and the duration of that impact.

Price also measured the relative importance of scientific results, which he called stature, and found that it doubles in about 30 years, rather than the 10 years needed to double the number of papers and the number of scientists. He compared the accumulation of new results to the construction of a pyramid built with bricks or stones: for the pyramid to double its height, its volume has to increase eightfold. The most important works, those that lead to Nobel prizes, are at the top of the pyramid.
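Read as a rough formula (this is one way to spell out the pyramid analogy, not an empirical law stated by Price), stature S grows like the cube root of the accumulated body of work N:

S \propto N^{1/3}, \qquad \text{so doubling } S \text{ requires } N \to 8N, \text{ since } (8N)^{1/3} = 2\,N^{1/3}.

If N doubles every 10 years, an eightfold increase takes three doublings, which is why stature needs roughly 30 years to double.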

Price’s famous lectures are reproduced in two wonderfully intelligent books, Science Since Babylon and Little Science, Big Science.

Publish or perish

‘Publish or perish’ has become a mantra in academia. The phrase captures the pressure to publish as many papers as possible, even if quality is sacrificed for quantity, wherever output is the sole measure of a researcher’s merit.

Critics have pointed out how this mantra leads to the gaming of the publication system, with researchers ignoring the principles of scientific integrity in the process. Some of these critics have proposed solutions, such as the San Francisco Declaration on Research Assessment or the Hong Kong Principles. Yet, up to now, nothing much has changed.

Predatory journals still run rampant and fraud is on the rise. Researchers remain suspicious of the integrity of their colleagues’ work, but still need to publish to get ahead. The academic publishing machine pulls in tens of billions of dollars of annual revenue and has no incentive to encourage scientists to publish less.

Thomson Reuters’ Web of Science holds some 58 million items. Only 14,499 papers (0.025%) have more than 1,000 citations. Half of the total have been cited only once, if at all.

The theory of relativity, the discovery of high-temperature superconductors, the determination of DNA’s double-helix structure, the first observations that the expansion of the Universe is accelerating – all of these breakthroughs won Nobel prizes and international acclaim. These works represent the science that we admire and that governments justly support, but are not among the most cited.

Proposed solutions

At the sixth International Congress on the History of Science, Price predicted that publishing would become so complex, data-intensive and expensive that we would soon exhaust our ability to support science as an enterprise. Our ability to acquire and process data has accelerated rapidly, largely due to the revolution in fast and inexpensive computer systems, but it is still difficult to manage quality control.

Price reintroduced the notion of the Invisible College as an informal communication channel between scientists at the research front of a field. Scientists used to read books; when things started moving too fast, they read only articles, and then only letters to the editor.

Now things move so fast that researchers survive through what we call invisible colleges: small groups of peers in a particular specialty. Eventually they write up their papers so that graduate students can read them and get ahead in their research. By the time a paper is published, however, the subject is so old that all the juice of good research has been squeezed from it, and it is not worth reading if you’re really at the research front.

Moving away from scientific papers as the primary way for scientists to communicate, Price outlined an idea to maximise direct interaction between scientists.

Here again the ‘invisible college’ – more specifically, the circuit of institutions, research centres, journals and conferences that allows intermingling and interaction within specific fields of science – was an important facilitator.

Groups of scientists naturally form as a result of collaborations between individuals focusing on similar problems, but the ability of researchers to travel around the globe and build personal relationships with their peers maximises the size of a group able to keep up regular, productive interactions.

Price’s proposal is for more direct communication among scientists, something that has become much easier with digital technologies. This is an important alternative that should be used by all researchers, particularly those at the beginning of their careers, to build a network through which they can send their (published or unpublished) results to leading authors in their field.

Open access

Open access (OA) is a movement which aims to change the international infrastructure of scientific publishing in order to make publications freely available on the public internet. So far, this transition has increasingly been based on article processing charges (APCs), which amount to a paywall on the authors’ side.

One argument for OA has been that the increased availability of research results leads to the faster advancement of science. Another argument is that, since scientific research is predominantly financed by public funds, its achievements should be considered a public good. A third argument is that OA will reduce the global expense of the scientific publishing and dissemination process compared to the subscription model.

OA does indeed make the scientific literature more accessible, but publishing fees restrict publication to institutionalised and/or funded research, hurting researchers in the Global South.

Even within institutions, APCs have not been widely welcomed. Most researchers in the United States hold neutral or negative opinions, believing that articles published in OA journals are of lower quality than those published in subscription-based journals. According to a 2020 survey, 79.2% of Chinese researchers selected the high publishing cost as the reason for their reluctance to publish articles in OA journals.

An important article on open access is the 2022 article “Should open access lead to closed research? The trends towards paying to perform research”.

The study analyses the global trend towards paying to perform research by examining publishing trends from 2015 to 2020 in the US, China, the UK, France, the Netherlands and Norway.

The estimated global revenues from APCs among major publishers now exceed US$2 billion annually. Research publishing will be closed to those who cannot pay. These results have led to a discussion of whether APCs are the best way to promote OA.

The number of articles published in these journals doubled between 2016 and 2020, while total revenues from APCs tripled, indicating a market of researchers and institutions willing to pay higher prices to get published.
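A quick arithmetic reading of those two multiples (treating ‘doubled’ and ‘tripled’ as exact, which is of course an approximation): if the number of articles went from A to 2A while APC revenues went from R to 3R, then the average revenue collected per article changed by a factor of

\frac{3R / 2A}{R / A} = \frac{3}{2},

that is, roughly a 50% increase in average APC revenue per published article over the period.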

Seven large OA publishers also dominate hybrid and subscription-based publishing: Elsevier, IEEE, Oxford University Press, Springer Nature, Sage, Taylor & Francis and Wiley. The total APC revenues of the nine largest publishers increased by more than 50% between 2019 and 2020.

Some studies have indicated that APC outperforms subscriptions in terms of revenues per journal and that most subscription-based journals have rapidly turned hybrid. Under the current subscription model, a journal publishes material of good quality for readers who are willing to pay.

An APC framework is driven by authors rather than readers and can operate in two ways. One is to generate revenue for the journal by publishing as many articles as possible, regardless of quality and demand, as predatory OA journals do. The other is to be selective and create a high-quality venue in which researchers strive to publish their best research. At that point, the publisher can freely set the price for publishing there.

Nature offers OA publication to authors who pay €9,500 per article. As described, OA can create a ‘pay-to-play’ system where only the best-funded researchers and institutions are able to publish in the best journals.

If the subscription model was profitable before, it seems that the APC model is even more profitable. The effect of APC, which marginalises researchers in low- and middle-income countries, should be regarded as a global challenge.

The Chinese way

China produces the second largest number of articles in international journals, behind only the United States. Now, after years of requiring its researchers to publish in international journals, China’s Ministry of Education and its Ministry of Science and Technology have released a document aimed at reducing this excessive reliance on the Science Citation Index to determine academic promotions, offers of employment and allocation of research funds.

The aim is to establish an assessment system to stimulate research that can be used to solve China’s problems.

In the future, Chinese scientists will still be encouraged to publish prominent work in major international journals (such as Nature, Science and Cell), but research appearing in less influential journals will not attract more government funding. For research in basic disciplines, the evaluation will focus on the originality and scientific value of the research articles, not on their number. Additionally, applied research and research for technological innovation will be judged on their real contribution to social and economic development.

At the moment, the Chinese proposal seems like a smart idea, particularly for developing countries such as Brazil. It will control the cost of research and the number of publications without impeding the growth of the best scientific ideas, prioritising quality over quantity and directing new intellectual efforts to solve practical economic and social problems.

Philip G Altbach and Hans de Wit tackle the problem in their 2018 article “Too much academic research is being published” and propose a similar idea.

They describe the crisis in academic publishing – too much pressure to publish, too many books of marginal quality and the rise of predatory publishing – driven by the massification of higher education and the rise of global university rankings.

They propose that universities that are not research-intensive should focus on teaching and service to society and industry. Research-intensive universities, the relevant professional societies, and government funding and other agencies need to take much more responsibility – and control – over a system that has become overly commercialised and in part corrupted.

Change in hiring and promotion rules

In 2021, the University of Utrecht announced a reform of its hiring and promotion rules, abolishing the use of bibliometric indicators such as the impact factor to measure the relevance of its professors’ output.

In the proposed new model, researchers will be evaluated based on the quality of their teaching, commitment to working in teams and willingness to share research data. Each department must develop its own strategies to assess the performance of its professors, taking into account the effect on the economy and society and the principles of open science, a set of practices that promotes transparency and collaborative work.

The reform in the Netherlands is symbolic because it breaks with indicators whose overuse has long been criticised as reductionist. In recent years, a series of manifestos has proposed ways to make more comprehensive assessments and has gained support among universities everywhere.

The main one is the 2012 San Francisco Declaration on Research Assessment (DORA), endorsed by more than 20,000 researchers and 2,000 institutions from 148 countries, which recommends abolishing the isolated use of journal impact factors in evaluations for funding, promotion and hiring.

Another reference document is a set of guidelines defined in 2019 at the sixth World Conference on Research Integrity, held in Hong Kong, to assess the performance of researchers more broadly and create career rewards for those who adopt practices capable of enhancing integrity.

China’s strategy is also changing. Instead of being valued for the volume of published studies, researchers are now required to select their best contributions to be analysed by panels of experts. The Chinese authorities have also announced their intention to develop their own bibliometric indicators, which take into account the regional impact of research.

In several countries, researchers are being urged to provide a structured narrative about their career, expressing their individual contribution, rather than listing the volume of articles and citations they have received.

The Swiss National Science Foundation is testing such a CV format, the SciCV, which is easy to fill out and update. The Royal Society in the UK has developed a CV divided into four sections: knowledge generation, talent development, contribution to the research community and contribution to society. In Brazil, Jacques Marcovitch coordinates a project financed by FAPESP (Fundação de Amparo à Pesquisa do Estado de São Paulo) to develop new metrics.

In Marcovitch’s assessment, the discussion in the Netherlands highlights the advantages and limits of the two approaches. Bibliometric indicators are rational and objective, but they are known to provoke behavioural distortions and are unable to capture dimensions such as the quality of teaching. On the other hand, the detailed analysis of researchers’ scientific and academic contributions is a much longer and more difficult process.

Final comments

Universities and funding agencies should not determine hiring, progression and grants based only on the number of papers produced, without looking at additional important information. Universities should neither recognise publications in predatory journals, nor pay for article-processing charges in them, nor support participation in predatory congresses.

Professors and early-career researchers should be advised that publishing in predatory journals will taint their CVs. They should work hard, look to the best researchers and groups in their area of interest and enter their networks. Original, easily publishable results will follow naturally.

Finally, they should not forget that a teacher’s most important role is to teach, preparing students for a full and productive life, inspiring them to work for a society with greater prosperity and greater mutual respect.

The basic scientific knowledge for this already exists.

Guillermo J Creus is a retired professor at Universidade Federal do Rio Grande do Sul and researcher at CNPq (Conselho Nacional de Desenvolvimento Científico e Tecnológico) in Brazil. E-mail: [email protected]
