PLOS ONE – Measuring Article Impact

[Graph: citation counts for PLOS ONE papers published in 2010]

A common misconception about PLOS ONE is that because we don’t consider perceived impact or novelty when deciding what to publish, we don’t care about the impact of the articles we publish. Of course we understand that some papers are more impactful than others. That’s why we’re committed to developing new tools that evaluate, realistically and without bias, how our papers shape their fields.

The number of citations an article collects offers one perspective on how the work has influenced its field, and is one of the many diverse measures that PLOS Article-Level Metrics provide to help the community measure article impact (others include usage and social sharing).

We recently plotted all citations to every PLOS ONE paper published in 2010 (thanks to our ALM guru Martin Fenner, and to Scopus for the data in the graph above).

The graph tells an interesting story about the range of papers published in PLOS ONE: from ground-breaking, highly cited research to small studies that appeal to niche audiences, the journal really is for all of science. Another important finding from this analysis was how much of the variability in citations comes from the range of subjects we publish. Fields like cell biology are huge and well funded, with thousands of research groups around the world publishing tens of thousands of papers, while others, such as ophthalmology, are quite small, with only a few groups actively publishing research. All those extra cell biology papers mean many extra citations for the whole field, so papers in this area receive far more citations overall than papers in ophthalmology, where only a few hundred papers are published each year.

The catch-all nature of journal metrics, such as the Impact Factor, means that PLOS ONE is considered a ‘top journal’ in the field of ophthalmology, as its Impact Factor is higher than any specialist journal in that field, whereas in the cell biology world we are ‘mid-level’. To address this discrepancy between fields, PLOS now includes relative metrics on all our papers, so readers can see the activity around a paper (just page views so far) relative to others in its field. As a result, you can see at the article level the impact of specific research on its field.
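The post doesn’t spell out how the relative metric is computed, but a simple way to picture it is as a percentile rank: where a paper’s page views fall among the papers in its own field. Here’s a minimal sketch with made-up numbers (the field lists and view counts are hypothetical, purely for illustration):

```python
def relative_rank(value, field_values):
    """Fraction of papers in the field with page views at or below `value`."""
    below = sum(1 for v in field_values if v <= value)
    return below / len(field_values)

# Hypothetical page-view counts for papers in two fields of very
# different sizes and citation cultures.
cell_biology = [900, 1200, 3000, 4500, 8000, 15000]
ophthalmology = [200, 350, 500, 800, 1100]

# The same absolute count of 1000 views reads very differently
# depending on which field it is compared against.
print(relative_rank(1000, cell_biology))   # ~0.17: below average for the field
print(relative_rank(1000, ophthalmology))  # 0.8: near the top of the field
```

This is why a field-relative view matters: an absolute number that looks modest in a large field can place a paper near the top of a small one.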

My feeling is that PLOS ONE has a wider citation distribution than most other journals, although I haven’t seen their data to say for certain (I would love for more journals to start displaying their full citation data!). But while it’s great to see a good number of PLOS ONE papers receiving very high numbers of citations, I think the more notable achievement is that we really are publishing all kinds of research, regardless of its estimated impact, and letting the community decide what is worthy of citation. With the usual flurry of Impact Factor announcements due to start any day now, it’s a good time to remember that it is the papers, not the journals they’re published in, that make the impact.

Graph: This is a kernel density estimation of the citation distribution rather than a histogram of the actual counts, hence the appearance that some papers have received fewer than zero citations (credit: Martin Fenner).
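The negative-looking tail in the graph is an artifact of the smoothing itself, and it’s easy to reproduce. A Gaussian kernel density estimate places a smooth bump over each data point, so the bumps around zero-citation papers spill some density below zero even though no paper can have negative citations. A minimal sketch (the citation counts here are made up for illustration):

```python
import math

def gaussian_kde(data, bandwidth):
    """Return a density function built as a sum of Gaussian bumps,
    one centred on each data point."""
    n = len(data)
    norm = n * bandwidth * math.sqrt(2 * math.pi)
    def density(x):
        return sum(
            math.exp(-0.5 * ((x - d) / bandwidth) ** 2)
            for d in data
        ) / norm
    return density

# Citation counts are non-negative integers, with many papers near zero.
citations = [0, 0, 1, 1, 2, 3, 5, 8, 13, 40]
density = gaussian_kde(citations, bandwidth=1.0)

# The bumps centred on the zero-citation papers leak probability mass
# below zero -- exactly the effect the graph caption describes.
print(density(-0.5) > 0)  # True: nonzero density at an impossible value
```

Real plotting libraries mitigate this with boundary corrections or by estimating on a log scale, but the plain Gaussian KDE used for a quick visualization behaves as above.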


This entry was posted in article-level metrics, Functionality, Open Access.

6 Responses to PLOS ONE – Measuring Article Impact

  1. Anders Eklund says:

    It looks like some papers have a negative number of citations…

  2. Pingback: New record: 66 journals banned for boosting impact factor with self-citations : Nature News Blog

  3. Pingback: Somewhere else, part 62 | Freakonometrics

  4. Pingback: “it is the papers, not the journals they´re published in, that make the impact.” | lab ant

  5. Pingback: Open Access Publishing | Science Reverie

  6. Pingback: Twitter Open Access Report – 17 Jun 2013 |
