Melero: Altmetrics – a complement to conventional metrics

Introduction

If we look back just a few decades, to the birth of the Internet, the publishing world underwent a revolution comparable to the invention of printing (1), both in its effects on communication and in its ability to connect environments in ways inconceivable in the print era. The Internet and new technologies accelerate scientific communication, facilitate collaboration between academic working groups and allow new ways of assessing scholarly outputs. The creation of service providers that harvest and capture different information sources, the implementation of the semantic web and the protocols that promote interoperability between systems are results of this revolution, from which there can be no return. From the point of view of scholarly publishing, journals have also undergone a change in their modes of dissemination and distribution. The shift from print to digital has given journals a different character: from indivisible units, they have become products composed of items (articles) that acquire their own identity. The breakdown of journals into articles as individual units facilitates traceability through the web, monitoring and individual searching.

Traditional systems for assessing the impact of a publication are based on tracking citations to a journal; this is how the Journal Citation Reports impact factor (IF) was conceived. Similarly, the SCImago Journal & Country Rank (2) has created the SCImago Journal Rank (SJR), an indicator for journals contained in the Scopus database. Both indicators are based on citations, but their calculation differs. The IF is calculated by dividing the number of citations in the JCR year by the total number of articles published in the two previous years (3). The SJR is also based on citations, with a time window of three years, and was developed from the Google PageRank algorithm. However, the aim of this paper is not to analyze conventional metrics based on citations to journals, but the new metrics arising from the semantic web, where tracking focuses not on journals but on articles and their authors. The Thomson Reuters IF has been questioned many times, yet it is still used to evaluate researchers’ activity. In fact, the misuse of the journal IF as the sole indicator for assessing research performance was discussed during the Annual Meeting of The American Society for Cell Biology (ASCB) in San Francisco, CA, on December 16, 2012. As a result, a group of editors and publishers of scholarly journals developed a set of recommendations, referred to as the San Francisco Declaration on Research Assessment (DORA) (4). The statement calls for the Thomson Reuters journal impact factor not to be used to assess research outputs; its general recommendation reads:

“Do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions, or in hiring, promotion, or funding decisions.”

Among recommendations for researchers, DORA advises:

“Use a range of article metrics and indicators on personal/supporting statements, as evidence of the impact of individual published articles and other research outputs.”

The DORA declaration also encourages funding agencies, institutions, publishers and researchers to adopt assessment practices that consider the value of all research outputs, not only papers, and recommends stating clearly the criteria used in research assessment exercises and in tenure and promotion decisions. It has been signed by more than 500 organizations and more than 12,000 individuals and remains open for new signatories.

In this regard, Nature has recently published the Leiden Manifesto for Research Metrics (5), named after the Science and Technology Indicators (STI) Conference held in Leiden in 2014, at which it arose.

The statement consists of 10 principles to guide research evaluation. It responds to concerns about the evaluation of research performance and the misuse of quantitative methods based on a single indicator. The ten principles, which are developed at length in the article, are:

  1. Quantitative evaluation should support qualitative, expert assessment.

  2. Measure performance against the research missions of the institution, group or researcher.

  3. Protect excellence in locally relevant research.

  4. Keep data collection and analytical processes open, transparent and simple.

  5. Allow those evaluated to verify data and analysis.

  6. Account for variation by field in publication and citation practices.

  7. Base assessment of individual researchers on a qualitative judgment of their portfolio.

  8. Avoid misplaced concreteness and false precision.

  9. Recognize the systemic effects of assessment and indicators.

  10. Scrutinize indicators regularly and update them.

The authors of the Manifesto concluded with a reflection about next steps:

“Abiding by these ten principles, research evaluation can play an important part in the development of science and its interactions with society. Research metrics can provide crucial information that would be difficult to gather or understand by means of individual expertise. But this quantitative information must not be allowed to morph from an instrument into the goal.

The best decisions are taken by combining robust statistics with sensitivity to the aim and nature of the research that is evaluated. Both quantitative and qualitative evidence are needed; each is objective in its own way. Decision-making about science must be based on high-quality processes that are informed by the highest quality data.”

The European Commission’s public consultation on ‘Science 2.0: Science in Transition’ (6) also included some questions about the use of new metrics (altmetrics). The consultation closed in September 2014, and its results were discussed in several workshops; however, the draft policy brief has still to be validated (7). There appears to have been no consensus on the extent to which research evaluations should take new metrics (such as altmetrics) into account, and further debate and effort are needed before introducing altmetrics, to clarify their meaning and how to use them appropriately.

The National Information Standards Organization (NISO) Alternative Assessment Metrics (Altmetrics) Initiative is a project funded by the Alfred P. Sloan Foundation to undertake a study of the development and adoption of new assessment metrics (8), including usage-based metrics, social media references, and network behavioral analysis. The project will conclude with a public consultation on the final document, prepared by several working groups, which will address the following topics:

  • Development of specific definitions for alternative assessment metrics.

  • Definitions for appropriate metrics and calculation methodologies for specific output types.

  • Development of strategies to improve data quality through source data providers.

  • Promotion and facilitation of use of persistent identifiers in scholarly communications.

  • Descriptions of how the main use cases apply to and are valuable to the different stakeholder groups.

As these statements and initiatives show, there is genuine concern about the use of new tools that could contribute to research assessment based on more diverse criteria.

Article-Level Metrics (ALM)

In 2010, Priem and his colleagues launched a manifesto that stated the principles of altmetrics (9):

“No one can read everything. We rely on filters to make sense of the scholarly literature, but the narrow, traditional filters are being swamped. However, the growth of new, online scholarly tools allows us to make new filters; these altmetrics reflect the broad, rapid impact of scholarship in this burgeoning ecosystem. We call for more tools and research based on altmetrics.”

They defined altmetrics as the creation and study of new metrics based on the social web for analyzing and informing scholarship. Altmetrics are not synonymous with alternative metrics: these emerging metrics operate at the article level and do not exclude traditional metrics based on citations to the journal, but complement them. Both can be employed together to offer a richer picture of an article’s use, from the immediate to the long term. Altmetrics are usually available early and allow the social impact of scholarly outputs to be assessed almost in real time.

Article-level metrics aggregate a variety of data sources that, taken together, quantify the impact of an article in terms of social immediacy and visibility (9). Immediacy is important because dissemination of scientific outputs is faster than before and occurs across more channels (blogs, social network tools, etc.) than in the print age. Socialization, or social visibility, matters because interactions, comments and mentions on Twitter, Facebook or LinkedIn can potentially reach a broader audience. ALMs are more granular than traditional models because they trace the impact of individual articles as standalone entities. They are also more immediate, because they track posts or comments on blogs, Facebook and Twitter, among other social media, in real time (Figure 1).

Figure 1

How impact is measured in terms of traditional metrics and altmetrics (9). Measurements at the personal and article levels increase granularity (disaggregation from journals as entities). Altmetrics provide impact data in real time (immediately), whereas traditional methods need longer to produce impact data.


Sources used for the aggregation or compilation of data can be broken down into five categories (10, 11), as illustrated in the sketch after this list:

  • Usage: views and downloads, from the journal site or from a third party - e.g., PubMed Central;

  • Captures: bookmarks in CiteULike, items shared within Mendeley or Delicious;

  • Mentions: blog posts, Wikipedia articles, comments, reviews;

  • Social media: Tweets, Google+, Facebook likes, shares and ratings;

  • Citations: Web of Science, Scopus, CrossRef, PubMed Central.
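As an illustration of how these categories are used in practice, the following minimal Python sketch aggregates hypothetical per-article counts into the five categories above. The source names, category mapping and numbers are illustrative assumptions, not the schema of any particular ALM provider.

```python
# Minimal sketch: grouping hypothetical per-article indicator counts into the
# five ALM source categories described above. All source names and numbers are
# illustrative, not real data from any provider.
from collections import defaultdict

ALM_CATEGORIES = {
    "usage":        ["journal_views", "pmc_downloads"],
    "captures":     ["citeulike_bookmarks", "mendeley_readers"],
    "mentions":     ["blog_posts", "wikipedia_articles"],
    "social_media": ["tweets", "facebook_shares"],
    "citations":    ["scopus", "crossref"],
}

def summarize(raw_counts: dict) -> dict:
    """Sum individual source counts into one total per ALM category."""
    totals = defaultdict(int)
    for category, sources in ALM_CATEGORIES.items():
        for source in sources:
            totals[category] += raw_counts.get(source, 0)
    return dict(totals)

# Hypothetical counts for a single article.
article = {"tweets": 42, "mendeley_readers": 17, "scopus": 3, "pmc_downloads": 250}
print(summarize(article))
# {'usage': 250, 'captures': 17, 'mentions': 0, 'social_media': 42, 'citations': 3}
```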

This paper aims to give a brief overview of the existing ALM tools, together with some examples of academic networks that offer statistics which complement traditional research impact analysis. Among those networks, the most popular are ResearchGate.net and Academia.edu; both allow different types of scholarly outputs (papers, scientific communications, lectures, etc.) to be uploaded. ResearchGate.net is a network dedicated to science and research through which users can connect, collaborate and discover scientific publications, jobs and conferences, all for free (www.ResearchGate.net). Academia.edu is a platform for academics to share research papers, with the aim of accelerating research worldwide (https://www.academia.edu/). Academics and researchers use Academia.edu not only to share their research but also to follow the impact of their own work and to track the research of academics they follow, through statistics provided by the service, also for free. The only restriction concerns copyrighted works that authors are not permitted to upload. Both platforms share common aims and differ mainly in some of the services they provide: Academia.edu provides analytics for uploaded documents in terms of views and downloads, whereas ResearchGate.net offers a tool for asking research questions and sharing expertise.

Article-Level Metrics tools

Table 1 presents a summary of the sources used for the aggregation of data by the altmetrics tools described below.

Table 1

Sources used for the aggregation of data by the four Article-Level Metrics (ALM) tools described in this article (ALM-PLoS, Altmetric, ImpactStory, Plum Analytics). Sources can be broken down into five categories: usage, captures, mentions, social media, and citations (10).

For each tool, the table lists its coverage and the sources in the main categories of aggregation (usage, citations, captures, social media).

ALM-PLoS
Coverage: papers from PLOS journals.
Usage: PLOS and PubMed Central.
Citations: PubMed Central, Scopus, ISI Web of Science, and CrossRef.
Captures: CiteULike, Mendeley, Reddit, Google+, StumbleUpon, Connotea.
Social media: Twitter, Facebook, Google Blogs, Researchblogging.org, Nature Blogs.

Altmetric
Coverage: scholarly articles in PubMed, arXiv, or pages containing a DOI.
Usage: not covered.
Citations: Scopus, Web of Science, CrossRef.
Captures: CiteULike, Mendeley.
Social media: Twitter, Facebook, blogs, YouTube, Google+, Pinterest, Wikipedia, Weibo users, Redditors.

ImpactStory
Coverage: all research products (journal articles, blog posts, datasets, software, etc.).
Usage: PLOS, PubMed, arXiv, SlideShare, Vimeo, YouTube, Dryad package views, figshare views, webpages (from Impactstory), ScienceSeeker, ORCID.
Citations: Scopus, Web of Knowledge, HighWire, Google Scholar Citations, PubMed.
Captures: CiteULike, Mendeley, CrossRef, Vimeo, figshare, GitHub, SlideShare, YouTube, Delicious.
Social media: Twitter, Facebook, blogs, figshare, Wikipedia, Vimeo, YouTube, SlideShare, Delicious, GitHub.

Plum Analytics
Coverage: journal articles, books, videos, presentations, conference proceedings, datasets, source code.
Usage: EBSCO, PLOS, bit.ly, Facebook, GitHub, Dryad, figshare, SlideShare, institutional repositories, WorldCat.
Citations: CrossRef, PubMed Central, Scopus, USPTO.
Captures: CiteULike, Delicious, SlideShare, YouTube, GitHub, Goodreads, Mendeley, Vimeo.
Social media: Facebook, Reddit, SlideShare, Vimeo, YouTube, GitHub, StackExchange, Wikipedia, SourceForge, Research Blogging, Science Seeker, Amazon, Google Plus, Twitter (via DataSift).

Public Library of Science – Article-Level Metrics (PLoS-ALM)

Public Library of Science – Article-Level Metrics (PLoS-ALM) was launched in 2009 to provide article-level metrics for every article across all PLoS journals, with updated data falling into the following categories: viewed, cited, saved, discussed and recommended (http://www.plosone.org/static/almInfo/#static-content-wrap). Every article has its own metrics based on the above categories, including citations in such conventional sources as Scopus, Web of Science, PubMed Central and CrossRef. If the article has been saved in CiteULike or Mendeley, PLoS-ALM indicates how many times it has been bookmarked in those portals and links to the corresponding record with all the services provided by those platforms. PLoS-ALM also uses social networks to show where articles have been discussed, specifically on Twitter, Facebook and blogs, along with the comments received on the publishing platform. All this information is displayed in a record that also includes visualizations of article usage and citations as a function of age, and of article usage and Mendeley bookmarks as a function of time. In summary, PLoS-ALM provides citations from recognized citation indexes, but also captures data from social networks and platforms where the article has been referenced and uploaded. Visualizing article usage as a function of time makes it possible to see where, when and for how long an article has been used and cited. The application programming interface (API) for ALM is freely and publicly available.
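As a concrete illustration, the sketch below queries the PLoS-ALM API for a single article. It assumes the Lagotto-style v5 endpoint (alm.plos.org/api/v5/articles) and the response shape documented by PLOS at the time of writing; the endpoint, parameters, key requirements and field names may have changed, so the current PLOS ALM documentation should be consulted before relying on them.

```python
# Hedged sketch: fetching article-level metrics for one DOI from the PLoS-ALM
# (Lagotto) API. Endpoint, parameters and response fields are assumptions based
# on the v5 API as documented at the time; verify against current documentation.
import requests

DOI = "10.1371/journal.pone.0127830"   # example: the PLOS ONE article cited as reference 20
url = "http://alm.plos.org/api/v5/articles"

resp = requests.get(url, params={"ids": DOI, "info": "summary"}, timeout=30)
resp.raise_for_status()
article = resp.json()["data"][0]       # assumed shape: results returned as a list under "data"

# The summary mirrors the PLoS-ALM categories: viewed, saved, discussed, cited.
for category in ("viewed", "saved", "discussed", "cited"):
    print(category, article.get(category))
```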

Altmetric

Altmetric (www.altmetric.com) is supported by Digital Science, a Macmillan company focused on technology to aid scientific research. It aggregates information from three main sources: social media such as Twitter, Facebook, Google+, Pinterest and blogs; traditional media, both mainstream (The Guardian, New York Times) and science-specific (New Scientist, Scientific American); and online reference managers such as Mendeley and CiteULike. It calculates a score for an article based on its mentions in those sources. This is a quantitative measure of the quality and quantity of attention that the article has received, calculated by an algorithm (12). The higher the score, the more “popular” the article, since it has been mentioned more often and in more sources. Besides the score, Altmetric creates a circle with different colors, each color representing a different source. For instance, blue represents how widely the article has been tweeted. Altmetric also has a third-party web application for Scopus, installed by default for Scopus users (13). Altmetric has been adopted by publishers such as Springer, Nature Publishing Group and BioMed Central, among others (Figure 2). Altmetric also supports repositories; see, for example, the institutional repository of Queensland University of Technology (Figure 3). Altmetric can also serve to demonstrate the impact of a research project to the corresponding funder (14), providing information about how the research products derived from the project are disseminated and discussed among colleagues. In summary, this tool provides an insight into mentions in non-conventional sources that complements the classical citation-based metrics. It has not been demonstrated that more citations lead to greater altmetric impact or vice versa, partly because this depends on the time of publication, subject, discipline, habits of researchers and users, and the type of access to the article; however, manuscripts published in an open access journal or freely available on the web seem to attract stronger altmetrics responses (15). Altmetric is a for-profit service: it provides a commercial API for customers covering the whole range of application services, and a free non-commercial license that allows basic altmetrics data about articles to be retrieved.
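For the free non-commercial tier, a minimal lookup by DOI can be sketched as follows. The api.altmetric.com/v1/doi endpoint and the field names shown are taken from Altmetric’s public documentation; the free license has rate limits and usage terms, and field names may change over time.

```python
# Hedged sketch: retrieving basic Altmetric data for one DOI via the free,
# non-commercial v1 API. A 404 response simply means no attention data has
# been recorded for that DOI.
import requests

DOI = "10.1038/520429a"  # example: the Leiden Manifesto article (reference 5)
resp = requests.get(f"https://api.altmetric.com/v1/doi/{DOI}", timeout=30)

if resp.status_code == 404:
    print("No Altmetric attention data recorded for this DOI.")
else:
    resp.raise_for_status()
    record = resp.json()
    # The score is a weighted summary of mentions; per-source counts sit alongside it.
    print("Altmetric score:", record.get("score"))
    print("Tweeters:", record.get("cited_by_tweeters_count"))
    print("Facebook walls:", record.get("cited_by_fbwalls_count"))
```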

Figure 2

Screenshot showing two examples of papers that received high attention online according to Altmetric data.

Figure 3

Screenshot of an item deposited in the institutional repository of Queensland University of Technology, showing a summary of the article statistics, including altmetrics (bottom left corner of the screenshot).


Impactstory

Impactstory (https://impactstory.org/) is a web service supported by the Alfred P. Sloan Foundation, the National Science Foundation and the Joint Information Systems Committee (JISC). It is a not-for-profit service, but last year it introduced a new subscription program that currently charges users a fee of USD 60 per year. Users create their CVs by uploading their works (articles, slide presentations, code, datasets, posters and web pages). For each item, Impactstory collects where the article has been cited (from the Scopus database), where the work has been viewed and read (from Mendeley), how much it has been discussed (measured by the number of tweets and comments on blogs), and the number of Impactstory views. For software products deposited in GitHub, Impactstory provides links to comments and recommendations made in that repository. Figure 4 shows the features of an Impactstory profile. Besides the statistics, Impactstory also provides information about how to reference each item, including its DOI and PubMed ID where available, and allows CVs to be downloaded in comma-separated value (.csv) format.

Figure 4

Fictional example of an Impactstory profile. Users create their CVs by uploading their works (articles, slide presentations, code, datasets, posters and web pages), and Impactstory provides various statistics, including information about how to reference each item, its DOI and PubMed ID, and allows CVs to be downloaded in comma-separated value (.csv) format.


Plum Analytics

Plum Analytics (http://www.plumanalytics.com) was founded in 2011 by Andrea Michalek and Mike Buschman and was acquired by EBSCO in January 2014. Plum Analytics aims to provide more accurate ways of assessing research by analyzing five categories of metrics: usage, captures, mentions, social media and citations. According to the information provided on its portal, metrics are gathered around what Plum Analytics calls artifacts, which include articles, blog posts, book chapters, books, cases, clinical trials, conference papers, datasets, figures, grants, interviews, letters, media, patents, posters, presentations, source code, theses/dissertations, videos and webpages. Plum Analytics harvests the data from numerous providers, including blogs, social networks, EBSCO databases, Scopus, figshare, GitHub, Vimeo and Dryad, among others. The information collected is presented in a variety of ways, including data visualizations, dashboards and widgets. PlumX offers a new way to summarize and visualize the influence not only of researchers but also of groups and institutions, as in the case of the University of Pittsburgh (16), which has also embedded PlumX widgets in its institutional repository (17).

The Plum print is the visualization tool that displays the research impact of a researcher’s works in the five categories: usage, captures, mentions, social media and citations. It replaces the previous visualization tool, Sunburst, which showed the relative research impact by document type.

No pricing information was available on the portal at the time of writing this article.

Final remarks

Altmetrics do not represent, at least currently, an alternative to traditional methods of measuring the impact of research outputs; rather, they complement them. Altmetrics can offer a very fast view of the social impact of science. They provide a broad picture of the immediate visibility of publications in social networks, which are characterized by very fast dissemination on the web. However, they still have some obstacles to overcome (18): theoretical (understanding their meaning), methodological (data sources) and technical (normalization of the sources referenced). Another limitation of altmetric data is that behavior patterns differ between disciplines (as they do in traditional metrics), as does the social media that different disciplines adopt (19).

There is no clear evidence that metrics based on social networks correlate with traditional citation-based metrics (20); however, some association exists, and papers that are highly cited or downloaded also tend to be highly tweeted (18, 21).
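To make explicit how such an association is usually examined, the sketch below computes a rank correlation between citation counts and tweet counts. The data are randomly generated for illustration only, and the choice of a Spearman correlation is an assumption about method rather than a description of the cited studies.

```python
# Hedged sketch: testing for an association between citation counts and tweet
# counts with a rank correlation. The data are synthetic; no conclusion about
# real articles should be drawn from this example.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
citations = rng.poisson(lam=10, size=200)          # synthetic citation counts per article
tweets = rng.poisson(lam=0.5 * citations + 1)      # loosely coupled synthetic tweet counts

rho, p_value = spearmanr(citations, tweets)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3g}")
```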

Altmetrics can provide a measure of an article’s mentions and discussion on websites. The fact that an article is discussed enthusiastically does not mean that it is of high or low quality, only that it is of interest to readers. Social media can also contribute to the swift dissemination of open access outputs, not only because they are open but also because the announcement of publication spreads across a very wide community; if the work is freely accessible, it can be downloaded, and potentially cited, even sooner. See, for instance, the effect of Twitter (22) on downloads of a paper deposited in a repository (Figure 5).

Figure 5

Effect of social networks (Twitter) on the impact and downloads of an open access paper deposited in a repository (22).


Notes

Conflicts of interest: None declared.

References

1. Harnad S. Post-Gutenberg Galaxy: The Fourth Revolution in the Means of Production of Knowledge. Publ Access Comput Syst Rev. 1991;2:39–53.

2. SCImago Journal Rank. Available at: http://www.scimagojr.com/journalrank.php. Accessed April 28, 2015.

3. Journal Impact Factor. Available at: http://admin-apps.webofknowledge.com/JCR/help/h_impfact.htm. Accessed April 28, 2015.

4. San Francisco Declaration on Research Assessment. Available at: http://am.ascb.org/dora/. Accessed April 28, 2015.

5. Hicks D, Wouters P, Waltman L, Rijcke S, Rafols I. The Leiden Manifesto for research metrics. Nature. 2015;520:429–31. https://doi.org/10.1038/520429a

6. European Commission’s public consultation on ‘Science 2.0: Science in Transition’. Available at: http://ec.europa.eu/research/consultations/science-2.0/consultation_en.htm. Accessed April 28, 2015.

7. Validating the ‘Science 2.0’ consultation. Available at: http://scienceintransition.eu/home/validating/. Accessed April 28, 2015.

8. NISO Alternative Assessment Metrics (Altmetrics) Initiative. Available at: http://www.niso.org/topics/tl/altmetrics_initiative/#phase2. Accessed April 28, 2015.

9. Priem J, Taraborelli D, Groth P, Neylon C. Altmetrics: A manifesto, 26 October 2010. Available at: http://altmetrics.org/manifesto. Accessed April 28, 2015.

10. Tananbaum G. Article-Level Metrics: A SPARC Primer. Available at: http://www.sparc.arl.org/sites/default/files/sparc-alm-primer.pdf. Accessed April 28, 2015.

11. Cave R. Overview of the altmetrics landscape. Available at: http://www.slideshare.net/rcave/overview-of-the-altmetrics-landscape. Accessed April 28, 2015.

12.

13. Altmetric for Scopus. Available at: http://support.altmetric.com/knowledgebase/articles/83246-altmetric-for-scopus. Accessed April 28, 2015.

14. Piwowar H. Altmetrics: Value all research products. Nature. 2013;493:159.

15. Mounce R. Open access and altmetrics: Distinct but complementary. Bull Am Soc Inf Sci Technol. 2013;39:14–7. https://doi.org/10.1002/bult.2013.1720390406

16. The University of Pittsburgh’s PlumX. Available at: https://plu.mx/pitt/g/. Accessed April 28, 2015.

17. D-Scholarship@Pitt. Available at: http://d-scholarship.pitt.edu/. Accessed April 28, 2015.

18. Torres-Salinas D, Cabezas-Clavijo Á, Jiménez-Contreras E. [Altmetrics: new indicators for scientific communication in Web 2.0]. Comunicar. 2013;41. (in Spanish)

19. Brigham TJ. An Introduction to Altmetrics. Med Ref Serv Q. 2014;33:438–47. https://doi.org/10.1080/02763869.2014.957093

20. Haustein S, Costas R, Larivière V. Characterizing Social Media Metrics of Scholarly Papers: The Effect of Document Properties and Collaboration Patterns. PLoS One. 2015;10:e0127830. https://doi.org/10.1371/journal.pone.0127830

21. Eysenbach G. Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter and Correlation with Traditional Metrics of Scientific Impact. J Med Internet Res. 2011;13:e123. https://doi.org/10.2196/jmir.2012

22. Terras M. Using social media to disseminate research outputs: a personal tale. Available at: http://www.rsp.ac.uk/documents/get-uploaded-file/?file=SocialMedia_MTerras.pptx. Accessed April 28, 2015.