Metrics in academic profiles: a new addictive game for researchers?

Enrique Orduna-Malea, Alberto Martín-Martín, Emilio Delgado López-Cózar

ABSTRACT

This study aims to promote reflection and bring attention to the potential adverse effects of academic social networks on science. These academic social networks, where authors can display their publications, have become new scientific communication channels, accelerating the dissemination of research results, facilitating data sharing, and strongly promoting scientific collaboration, all at no cost to the user.

One of the features that make them extremely attractive to researchers is the possibility of browsing through a wide variety of bibliometric indicators. Going beyond publication and citation counts, they also measure usage, participation in the platform, social connectivity, and scientific, academic, and professional impact. Using these indicators, they effectively create a digital image of researchers and their reputations.

However, although academic social platforms are useful applications that can help improve scientific communication, they also hide a less positive side: they are highly addictive tools that are open to abuse. By gamifying scientific impact using techniques originally developed for videogames, these platforms may get users hooked, turning them into addicted academics and transforming what should only be a means into an end in itself.

Keywords:
Bibliometrics; Academic profiles; Addiction; Gamification; Social networks; Video games; Adverse effects; Research ethics; Research behavior.

ACADEMIC PROFILES: NEW COMMUNICATION CHANNELS IN SCIENCE

The number of researchers who use academic profiles and social networks is rapidly increasing (1,2). Since the launch of AMiner in 2006, a pioneering academic profile service, many other actors have released their own platforms, among them ResearcherID, ResearchGate, and Academia.edu in 2008, Microsoft Academic Search's profiles in 2009, ImpactStory in 2011, Google Scholar Citations and ORCID in 2012, the new Scopus Author Profiles in 2014 and, most recently, the profiles available in Semantic Scholar, a promising new academic search engine developed by the Allen Institute for Artificial Intelligence and launched in November 2015. These platforms have already started shaping science communication, and they may very well also influence future academic impact assessment. Their current number of users (Table 1) bears witness to their widespread acceptance among the global community of researchers.

Table 1
Multidisciplinary academic platforms: documents and profiles

Each of these platforms, while providing common synchronous and asynchronous social networking features, also specializes to fit its users' interests. One of the most common features of academic social networks is enabling users to upload and share their academic contributions, whether or not they have been formally published and regardless of their typology and source. Additionally, they also make a range of social tools available to their users (Figure 1), such as personalized alerts, open peer reviews, social networking through contacts, the possibility of asking and answering academic questions, public and private messages, and, last but not least, access to a comprehensive monitoring and technological surveillance system. In short, these platforms are a new way to communicate academic activities. They also speed up the dissemination of results, facilitate research data sharing, and encourage widespread scientific collaboration, all at no cost to the user (so far).

Figure 1
New communication features in the new academic platforms (example of some ResearchGate features)

Most academic profile services and social networks offer a wide battery of author-level metrics (Figure 2), which they usually showcase prominently on their interfaces. These may be divided into five categories: bibliometrics (publications and citations), usage, participation, rating, and social connectivity (3). All user interactions (views, downloads, reads, links, shares, mentions, reviews, embeds, labels, discussions, bookmarks, votes, follows, ratings, citations, etc.) are tracked by the platform and transformed into a variety of indicators, from which researchers can get an idea of the impact their work is having on the scientific and professional communities, and on the media at large, nearly in real time. The impact reflected, of course, depends on the degree to which users engage with the platform, and on the variety of metrics available. Authors may have a different reflection in each of the platforms. Thus, each platform may be considered an "academic mirror".
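
As a rough illustration of how such tracking works, the sketch below aggregates a stream of interaction events into per-author counts grouped by the five categories above. It is a minimal, hypothetical example: the event names, the category mapping, and the data layout are assumptions made for illustration, not any platform's actual schema or scoring method.

```python
from collections import Counter, defaultdict

# Illustrative mapping of tracked interaction events to the five
# author-level metric categories described above. Event names and
# grouping are hypothetical, not any platform's actual schema.
CATEGORY_BY_EVENT = {
    "publication": "bibliometrics", "citation": "bibliometrics",
    "view": "usage", "download": "usage", "read": "usage",
    "question": "participation", "answer": "participation", "review": "participation",
    "rating": "rating", "recommendation": "rating",
    "follow": "social connectivity", "mention": "social connectivity",
}

def author_level_metrics(events):
    """Aggregate (author_id, event_type) pairs into per-author counts
    grouped by metric category."""
    profiles = defaultdict(Counter)
    for author_id, event_type in events:
        category = CATEGORY_BY_EVENT.get(event_type)
        if category is not None:  # ignore untracked event types
            profiles[author_id][category] += 1
    return profiles

# Example with a tiny synthetic event log
log = [("A1", "view"), ("A1", "download"), ("A1", "citation"),
       ("A2", "follow"), ("A2", "answer"), ("A1", "view")]
print(dict(author_level_metrics(log)))
# {'A1': Counter({'usage': 3, 'bibliometrics': 1}),
#  'A2': Counter({'social connectivity': 1, 'participation': 1})}
```

Real platforms would, of course, weight, normalize, and combine such raw counts into composite indicators rather than simply counting events.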

Figure 2
Author metric display in the most important academic profiles (Mendeley, ResearchGate, ResearcherID, and Google Scholar Citations)

Policy makers, in their eagerness to find objective quantitative measures that relieve them of the responsibility for their decisions, may be tempted to use and endorse these metrics (4). We already know how sensitive scientists are to evaluation policies inasmuch as they affect promotion and reward systems, essential cogwheels in the clockwork of science (5). The consecration of bibliometric indexes, foremost among them the Journal Impact Factor, as the preferred criteria in the evaluation of scientists in Spain since 1989 constitutes a good case in point regarding the behavioural changes induced by these policies (5,6).

The compulsive obsession with using the Journal Impact Factor as a unique and indisputable measure of the quality of a scientific work quickly spread through many countries, giving rise to a veritable new disease, impactitis (7), which recently led both to a declaration against its use (DORA, the San Francisco Declaration on Research Assessment) (8) and to a manifesto (9) setting out best practices for the fair use of bibliometric indicators.

THE NEW "BIBLIOMETRIC DRUGS"

Although these "mirrors" come loaded with metrics for nearly everything, they might also bring about negative effects. They are highly addictive tools that might be abused as if they were drugs.

The first recognizable "bibliometric drug", as we understand it today, was the Journal Impact Factor. Other metric-based drugs such as the h-index and all its derivatives came later, and now a new generation of synthetic, designer drugs has sprung from academic social networks. These new narcotics, like their predecessors, thrive on satisfying their users' egotistical needs by continuously activating their internal reward mechanisms, as any other addictive drug would do. However, the substance has evolved from a single metric into an entire entertaining and immersive environment, similar to a videogame.
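
Of the metrics named above, the h-index at least has a simple public definition: an author has index h if h of their papers have received at least h citations each. A minimal sketch of that computation, using made-up citation counts, is shown below.

```python
def h_index(citations):
    """h-index: the largest h such that the author has at least
    h papers with h or more citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Example with made-up citation counts for six papers
print(h_index([25, 8, 5, 3, 3, 1]))  # -> 3 (three papers with at least 3 citations each)
```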

By gamifying researchers' impact through persuasive videogame techniques (scores, achievements, competition, unlocked features and, coming soon, stages, enemies, and extra lives), these platforms intend to get users hooked on scoring reputation "points", competing against one another and against themselves (10).

Addicted scholars will not only be more willing to game the system (11), but will also find themselves in such a state of dependence and self-absorption that their creativity and productivity (the professional dimension) and their social relations (the personal dimension) might be severely affected. Users may even experience episodes of depression if they feel their metrics are not as good as expected, or when a rival surpasses them or achieves a particularly high score.

The appearance of a new mental disorder (similar to those detected in young people hooked on social networks) should not be ruled out. Researchers compulsively accessing academic social platforms anytime and anywhere, expecting new downloads, citations, or likes, is a clear symptom that a new academic illness has been born: a scholar-ache.

Faced with this scenario, should we warn scientists about the (ab)use of these platforms, or even try to prevent it? Should research institutions learn how to treat this new social disease?

References

  • 1
    Van-Noorden R. Online collaboration: Scientists and the social network. Nature. 2015;512(7513):126-9.
  • 2
    Kramer B, Bosman J. 101 Innovations in Scholarly Communication: the Changing Research Workflow [online]. (Accessed 11-09-2016). Available at: https://101innovations.wordpress.com
  • 3
    Orduna-Malea E, Martín-Martín A, Delgado López-Cózar E. The next bibliometrics: ALMetrics (Author Level Metrics) and the multiple faces of author impact. El profesional de la información. 2016;25(3):485-96.
  • 4
    Jiménez-Contreras E, Moya Anegón F, Delgado López-Cózar E. The evolution of research activity in Spain: The impact of the National Commission for the Evaluation of Research Activity (CNEAI). Res Policy. 2003;32(1):123-42.
  • 5
    Jiménez-Contreras E, Delgado López-Cózar E, Ruiz-Pérez R, Fernández VM. Impact-factor rewards affect Spanish research. Nature. 2002;417(6892):898.
  • 6
    Delgado-Lopez-Cozar E, Ruiz-Pérez R, Jiménez-Contreras E. Impact of the impact factor in Spain. BMJ. 2007;334.
  • 7
    Camí J. Impactolatria, diagnóstico y tratamiento. Med Clín (Barc). 1997;109:515-24.
  • 8
    American Society for Cell Biology. San Francisco declaration on research assessment [online]. (Accessed 15-09-2016). Available at: http://www.ascb.org/dora/
  • 9
    Hicks D, Wouters P, Waltman L, Rijcke SD, Rafols I. Bibliometrics: The Leiden Manifesto for research metrics. Nature. 2015;520:429-31.
  • 10
    Hammarfelt B, de Rijcke SD, Rushforth AD. Quantified academic selves: the gamification of research through social networking services. Information Research. 2016;21(2). (Accessed 11-09-2016). Available at: http://InformationR.net/ir/21-2/SM1.html
  • 11
    Delgado López-Cózar E, Robinson-García N, Torres-Salinas D. The Google Scholar experiment: How to index false papers and manipulate bibliometric indicators. Journal of the Association for Information Science and Technology. 2014;65(3):446-54.

  • Suggested citation:

    Orduna-Malea E, Martín-Martín A, Delgado López-Cózar E. Metrics in academic profiles: a new addictive game for researchers? Rev Esp Salud Publica. 2016 Sep 22;90:e1-5.

Publication Dates

  • Publication in this collection
    20 Mar 2017

History

  • Received
    21 Sept 2016
  • Accepted
    21 Sept 2016