EDITORIAL


Scientific research assessment under discussion


The San Francisco Declaration on Research Assessment (DORA) was issued on May 16 this year. Initially signed by 155 scientists and 78 scientific institutions, it explicitly proposes eliminating the use of the impact factor (IF) in science journals "as a surrogate measure of the quality of individual research articles, to assess an individual scientist's contributions, or in hiring, promotion, or funding decisions" (http://am.ascb.org/dora/).

Calculating a journal's IF is simple. In a given year, a journal's IF equals the number of citations received that year by the articles the journal published in the previous two years, divided by the number of articles published in that period. Originally created as an indicator to guide journal purchases by research institution libraries, it gradually became adopted as "THE" indicator of quality in scientific output. Even more worrisome, it came to be used for assessing individual researchers.
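The calculation above can be sketched in a few lines of code; the figures used here are purely hypothetical, for illustration only:

```python
def impact_factor(citations_to_prev_two_years, articles_prev_two_years):
    """A given year's impact factor: citations received that year by items
    the journal published in the previous two years, divided by the number
    of articles published in those two years."""
    return citations_to_prev_two_years / articles_prev_two_years

# Hypothetical journal: 480 citations in the current year to its
# articles from the previous two years, 240 articles published then.
print(impact_factor(480, 240))  # -> 2.0
```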

The mistake of using a journal-level performance indicator to assess an individual article's quality has long been known in epidemiology, where it is called the ecological fallacy. The IF is problematic even as a metric for assessing journals themselves. Its limitations include: reviews, which in some fields (including Public Health) are cited more frequently; the life cycle of articles, which generally extends beyond two years; and the fact that a single highly cited article is enough to substantially inflate a journal's IF, since means are sensitive to extreme values.
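The last point, the sensitivity of means to extreme values, is easy to see with an invented example: one highly cited article pulls the mean (and hence the IF) far away from what a typical article in the journal achieves, while the median barely moves.

```python
import statistics

# Hypothetical citation counts for ten articles in one journal,
# nine of them modestly cited and one highly cited.
citations = [0, 1, 1, 2, 2, 3, 3, 4, 4, 180]

print(statistics.mean(citations))    # dominated by the single outlier
print(statistics.median(citations))  # closer to the typical article
```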

Any assessment metric or behavioral incentive has desirable effects, insofar as it is a good measure, and undesirable effects arising from the perverse incentives it creates. The adoption of the IF in research funding decisions has been associated with an increase in cases of misconduct (Couzin-Frankel. Science 2013; 339:386-9). From the perspective of science publishers, assigning excessive weight to the IF in editorial assessments can be potentially harmful to science if the choice of what to publish is driven predominantly by an article's citation potential.

Certainly, the vast majority of editors, renowned scientists who volunteer their time and skills to journals, are doing honest and serious work. However, scientific journal funding is undergoing major changes, in many cases shifting toward a business model (and a highly lucrative one at that).

In addition to this overall recommendation, DORA offers sets of specific recommendations for researchers, research institutions, organizations that supply journal metrics, research funding agencies, and publishers. For the latter, the recommendations are to: reduce emphasis on the impact factor and equivalent indicators as a promotional strategy for the journal; encourage responsible authorship practices and the provision of information on each author's specific contributions; relax constraints on the number of references cited in articles; make article-level metrics available; remove all reuse limitations on reference lists and make them available under the Creative Commons Public Domain Dedication; and mandate the citation of primary literature instead of reviews. CSP has already adopted these recommendations, except for the last two, which we intend to implement in the future. As of this Editorial's writing, the declaration had been signed by 8,023 individual scientists and 379 organizations, including CSP. The link to sign the manifesto remains open.


Marilia Sá Carvalho
Claudia Travassos
Cláudia Medina Coeli
Editors

Escola Nacional de Saúde Pública Sergio Arouca, Fundação Oswaldo Cruz Rio de Janeiro - RJ - Brazil
E-mail: cadernos@ensp.fiocruz.br