New ways of evaluating research – Open Access leads the way


By: Emma Palmqvist Wojda and Andreas Hedström Mace

What do we mean when we talk about impact? As demand for ways to evaluate research grows, the assessment of impact is increasingly requested as a measure of quality and visibility. Traditionally, the metrics developed to assess impact rest on the assumption that most research outcomes will sooner or later be published as articles in scholarly journals, which other researchers read and later cite in their own articles. The number of citations then determines the impact. However, other means of evaluating research and new ways of assessing impact are currently being developed as a set of alternative metrics, commonly referred to as Altmetrics.

Traditional bibliometrics, dominated by the frequently questioned Impact Factor from Thomson Reuters, measure citations at an aggregate level, where impact is typically assigned to a journal or a researcher. Altmetrics instead focus on article-level metrics, where more types of impact are traced and examined: for example, how many times an article has been downloaded, how often it is mentioned on social media, or how many times it has been bookmarked in reference management software.
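To make the contrast concrete, here is a minimal, purely illustrative sketch in Python (all numbers and DOIs below are made up for the example) of the difference in granularity between a journal-level aggregate and article-level indicators:

```python
# Purely illustrative example with made-up numbers: a journal-level metric
# collapses everything into one figure, while article-level metrics keep
# several kinds of impact for each individual article.

articles = [
    {"doi": "10.1234/example.1", "citations": 12, "downloads": 850, "tweets": 40, "bookmarks": 25},
    {"doi": "10.1234/example.2", "citations": 0, "downloads": 2300, "tweets": 310, "bookmarks": 90},
]

# Journal-level view: a single aggregate, here the mean number of citations.
mean_citations = sum(a["citations"] for a in articles) / len(articles)
print(f"Journal-level: {mean_citations:.1f} citations per article on average")

# Article-level view: each article keeps its own profile of indicators.
for a in articles:
    print(f"{a['doi']}: {a['citations']} citations, {a['downloads']} downloads, "
          f"{a['tweets']} tweets, {a['bookmarks']} bookmarks")
```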

Open Access is interlinked with this broadening of the term impact, as evolving publication patterns within academia, with increasing accessibility, change how and by whom scholarly material is used. Studies have shown that increased online accessibility through Open Access leads to a significant increase in downloads of a scholarly article. This is, however, not always reflected in citations, which points to a gap in evaluating impact where Altmetrics can be useful.

This focus on metrics at the article level can also be traced to developments within the Open Access field, with the creation of so-called mega journals such as PLOS ONE, PeerJ and SAGE Open. These are Open Access publications that publish a vast number of articles, making journal-level measurements difficult. For this very reason, PLOS has developed article-level metrics for its content.

Altmetrics is, however, not exempt from criticism, especially since several of the alternative metrics use sources not usually associated with scientific work, such as Twitter or Facebook. There is an inherent uncertainty about what these new metrics actually mean – what does it say about the quality of the research when an article is mentioned a hundred times on Twitter? At the same time, a connection has been found between how much an article is saved and discussed on Mendeley and its later citations – a relevant form of impact that Altmetrics can pick up much earlier than traditional bibliometrics.

Speed is also one of the main advantages of Altmetrics. Evaluating impact based on citations has an intrinsic inertia – it can take years before the first citations are registered. With a wider definition of impact, it can instead be measured within a matter of days or weeks. For a new PhD student, or for a researcher in a new field with untraditional publication patterns, this can make a significant difference.

There are several providers of Altmetrics; Altmetric, ImpactStory and Plum Analytics are among the more prominent. The first two offer free services, while Plum Analytics is a purely commercial endeavor. Both Altmetric and Plum Analytics also offer commercial tools that allow deeper evaluation and comparison. The best way to get started with Altmetrics is to be active and try it out for yourself! Most of the services mentioned here are easy to use and you can get started quickly – a small programmatic example is sketched below. But remember that Altmetrics is still in its infancy, under constant development and discussion.
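If you want to experiment programmatically, Altmetric offers a free public API that returns attention data for a given DOI as JSON. The sketch below is only a starting point, written in Python and assuming the third-party requests library; the exact response fields may differ from those shown here, and the DOI is a placeholder to replace with one of your own:

```python
# A minimal sketch of querying Altmetric's free public API for one article.
# Assumes the third-party "requests" library is installed; the response field
# names may vary, so inspect data.keys() to see what is actually available.
import requests

doi = "10.1234/example-doi"  # placeholder: replace with the DOI of an article you are interested in
resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)

if resp.status_code == 404:
    print("No altmetrics recorded for this DOI (yet).")
else:
    resp.raise_for_status()
    data = resp.json()
    # A few commonly reported indicators (names assumed, verify against the response).
    print("Altmetric score:", data.get("score"))
    print("Tweets:", data.get("cited_by_tweeters_count"))
    print("Facebook posts:", data.get("cited_by_fbwalls_count"))
    print("Mendeley readers:", data.get("readers", {}).get("mendeley"))
```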

We at the library are very interested in continuing to work with Altmetrics in order to evaluate which metrics are valuable for researchers at Stockholm University – but we need your help! If this is something you would be interested in taking a closer look at, or if you are perhaps already doing so, we would like to get in touch with you!

Please contact Camilla Hertil Lindelöw or Thomas Neidenmark.

Useful links:

R. Mounce, 2013, “Open Access and Altmetrics: Distinct but Complementary”, Bulletin of the Association for Information Science and Technology.

P. Loria, 2013, “Altmetrics and open access: a measure of public interest”, Australian Open Access Support Group.

P. Davis et al., 2008, “Open access publishing, article downloads, and citations: randomised controlled trial”, British Medical Journal.

