Opens Roundup (March 25)

In this issue, the US FIRST Act meets opposition on all fronts, David Wiley (champion of open education) on the 5Rs of openness, Wellcome releases its report on how to promote an effective market for APCs, PeerJ one year on, ten tips for navigating publishing ethics, the power of post-publication review, altmetrics for developing countries, the pros and cons of publishing in arXiv, a fantastic crowdsourcing project in the humanities, John Ioannidis on the look-out for a journal near you, and more.

With thanks to Allison Hawxhurst and Katie Fleeman for tips and links.

POLICY DEVELOPMENTS

US: Why FIRST is a Trojan Horse

March 11: If you haven’t read PLOS’s reaction to the introduction of the FIRST Bill then here is another opportunity to read Cameron’s post (main link), as well as the EFF’s post published the same day and SPARC’s a couple of days before that. But it wasn’t just open access advocates objecting to the public access language in Section 303 (which could see embargo lengths increase to three years after publication). Science Insider also lists the growing opposition from scholarly societies and various research and university groups, noting that the overwhelming feeling is that the bill represents a missed opportunity for the US to maintain its scientific lead, in particular because of constraints on the amount of funding for the NSF and the restrictions the bill imposes on the type of research it would fund. The bill, introduced by Republicans, has been particularly divisive and at odds with the original America COMPETES Act introduced by Democrats. The ensuing markup of the FIRST Bill on the 12th was lively, but an impassioned plea by Rep. Zoe Lofgren to strike the public access language in Section 303 was narrowly defeated 9-8 along party lines. The bill still has a long way to go.

AND IN OTHER NEWS

Legal Dispute between a Professional Association and Press

March 16: On H-Net, a listserv for the humanities, Phil Brown from H-Asia calls attention to an article in the Chronicle of Higher Education (available to subscribers only) discussing a dispute between the Social Science History Association and Duke University Press over control of the Association’s journal, Social Science History. Here’s the nub of what he says:

(After notifying Duke of its intent to solicit bids for a new publication contract):

In June 2012, not having gotten a bid from Duke, the association sent the press a letter saying it would be moving on and ending the contract; Duke disputed its right to do that. According to the association, Duke interprets that phrase “participation in the journal” to mean that the association has only two choices: “continue with Duke as the sole publisher in perpetuity or give up its ownership of the journal altogether and allow Duke to continue to publish the journal,” even though the university “has no ownership or intellectual-property rights” in it.

The full details of the suit are available.

Clarifying the 5th R

Image by Sean MacEntee (CC-BY 2.0)

March 15: David Wiley, a proponent of open educational resources (OER), introduced the concept of the 4Rs as a way to think about openness in 2007. He described the four main activities enabled by open content as Reuse, Revise, Remix and Redistribute, which is what the Creative Commons Attribution license permits. The 4Rs have been influential as a framework for educational resources and for what being open means more generally. On March 4 he added a 5th R – Retain – to the mix: the right to make, own and control a copy of the work. This addition was in response to the fact that while many OER publishers permit access to their content, they “do not make it easy to grab a copy of their OER that you can own and control forever”. “How many OER publishers enable access but go out of their way to frustrate copying? How many more OER publishers seem to have never given a second thought to users wanting to own copies, seeing no need to offer anything beyond access?”

Some (e.g. David Draper and Mike Caulfield) have subsequently questioned the need for a 5th R – surely being able to own and control a copy of the work is implicit in any license that permits the 4Rs in the first place? Wiley agrees, but believes that ownership per se has never been explicitly addressed in the discussion of open content (main link), and this has direct consequences: “If the right to Retain is a fundamental right, we should be building systems that specifically enable it. When you specifically add ‘Enable users to easily download copies of the OER in our system’ to your feature list, you make different kinds of design choices” – ones that most designers of OER systems have failed to think about.

The implications of the 5th R are fundamentally important because they change the context of what we’re doing – it’s no longer enough to state that others can reuse, remix or redistribute your work. It suggests that those involved in providing open content also need to help build or facilitate the infrastructure that enables the content to be easily reused by others. In other words, we have a responsibility to reduce the friction around the content we make open (see, e.g., Cameron’s article in PLOS Biology about this). And this applies not just to libraries and institutions with an educational remit but to funders and publishers as well.
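To make the point concrete, here is a minimal, hypothetical sketch (Python, standard library only) of the kind of design choice a “Retain”-friendly platform might make: rather than only rendering a resource in an online viewer, offer a one-click export that bundles the content with its license metadata so a user can keep a copy they own and control. The resource fields and file names are illustrative, not any particular OER platform’s schema.

```python
import io
import json
import zipfile

def export_oer_bundle(resource: dict) -> bytes:
    """Package an open resource plus its license metadata into a zip the user can retain."""
    buffer = io.BytesIO()
    with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as bundle:
        # The content itself, in a portable format
        bundle.writestr("content.html", resource["content_html"])
        # The license travels with the copy, so reuse rights stay clear offline
        bundle.writestr("metadata.json", json.dumps({
            "title": resource["title"],
            "author": resource["author"],
            "license": resource["license"],        # e.g. "CC BY 4.0"
            "license_url": resource["license_url"],
        }, indent=2))
    return buffer.getvalue()

# Example: a hypothetical resource a user wants to keep locally.
bundle_bytes = export_oer_bundle({
    "title": "Introduction to Open Licensing",
    "author": "A. Example",
    "license": "CC BY 4.0",
    "license_url": "https://creativecommons.org/licenses/by/4.0/",
    "content_html": "<h1>Introduction to Open Licensing</h1><p>...</p>",
})

with open("oer_bundle.zip", "wb") as f:
    f.write(bundle_bytes)
```

The point is simply that “users can download and keep a complete, license-labelled copy” becomes an explicit feature rather than an afterthought.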

Metaphysicians: Sloppy researchers beware. A new institute has you in its sights

March 15: An article in The Economist about a new institute at Stanford, the Meta-Research Innovation Center (METRICS), to be run by John Ioannidis (author of the famous ‘Why Most Published Research Findings Are False’ paper in PLOS Medicine, currently viewed almost 980,000 times). The new centre aims to find out whether attempts to reproduce studies actually work and to guide policy on how to improve the validation of research more generally. Among the initiatives is “a ‘journal watch’ to monitor scientific publishers’ work and to shame laggards into better behaviour”. They will also spread the message to policymakers, governments and other interested parties “in an effort to stop them making decisions on the basis of flaky studies”.

PeerJ’s $99 open access model one year on

March 13: A brief review in Times Higher Education of how PeerJ is doing. “PeerJ co-founder Jason Hoyt said that the journal had fulfilled its year-one aims of ‘staying alive’, starting to grow and ‘laying the groundwork to show this business model could be sustainable’. It is on track to be self-sustaining by early 2015, and there are no plans to raise prices. ‘If anything we would like to lower them if we can figure out some other revenue stream,’ Dr Hoyt said.” PeerJ was also explicitly referenced in the Wellcome Trust report on an effective APC market by Björk and Solomon mentioned below. That report concludes that “PeerJ has published around 220 articles in the first nine months of operation. It is not clear at this point if the journal will scale up to a level that will make it financially sustainable but it offers an innovative funding model”.

Institutional repositories provide an ideal medium for scholars to move beyond the journal article

Academic Commons Use-per-item graph (CC BY 3.0)

March 12: Leyla Williams, Kathryn Pope, and Brian Luna Lucero from Columbia University make the case for institutional repositories to collect all the work of their scholars rather than focusing only on peer-reviewed journal articles or monographs. They discuss how their IR, Academic Commons, hosts conference videos, presentations, technical reports and other “grey literature” (as the figure shows). “IRs are crucial for authors whose work may not fit within the scope of any one scholarly journal,” they note. “They are also vital for researchers with data that lies outside the parameters of disciplinary data repositories, for dissertation authors who want to make supplemental materials available, and for undergraduates.” They discuss examples where deposit in their repository has helped young researchers find a job, and how the lively Twitter feed around what’s deposited helps disseminate the work.

Top 10 tips for navigating ethical challenges in scholarly publishing

Image by Brett Jordan (CC BY)

March 12: Jackie Jones, Executive Journals Editor at Wiley, provides her top tips on publication ethics to coincide with the 2nd edition of Wiley’s “Best Practice Guidelines on Publishing Ethics: A Publisher’s Perspective”. The guidelines (available online or as a PDF) are published under a Creative Commons Non-Commercial license and have drawn on a range of expertise, including from COPE (the Committee on Publication Ethics). They form a useful addition to the more in-depth guidelines and forums that COPE already provides and signal the increasing accountability that all reputable publishers have to ensure high ethical standards in scholarly publishing. In a subsequent post, Udo Schuklenk, an editor of the journals Bioethics and Developing World Bioethics, lists some of the infringements he’s seen as editor with respect to plagiarism: “Over the years you begin to delude yourself into thinking that you have seen the full range of ethics infringements. It’s particularly ironic, I guess, when you edit bioethics journals: you would hope that your authors would be clued in to publication ethics issues.” Unfortunately no journal can escape these issues.

Fostering a transparent and competitive market for open access publishing

Image by 401(K) 2013 (CC-BY-SA 2.0)

March 12: A range of funders, led by the Wellcome Trust, has released a report by Professor Bo-Christer Björk of the Hanken School of Economics, Finland, and Professor David Solomon of Michigan State University on the structure and shaping of the APC market. It’s worth skimming the whole report: it contains useful figures and data and gives a sense of the directional thinking of funders.

The key conclusions are that the full OA market is competitive and functioning, with varied pricing depending on field and impact, whereas the market for APCs of articles in hybrid journals run by subscription publishers is dysfunctional, with relatively flat pricing (and low uptake). Analyses of the report have started to appear; look out for some comments and a summary here over the next few days.

Science self-corrects – instantly

Image by Boston Public Library (CC BY 2.0)

March 11: The blog for PubPeer, an online post-publication commenting service, discusses two papers published in Nature in January purporting to provide a revolutionary, simple technique for producing pluripotent stem cells, termed STAP cells. A post by Paul Knoepfler on his own blog expressed initial doubts, which were fuelled by comments on PubPeer exposing problems with the figures in the paper. Knoepfler then hosted a post to crowdsource attempts by researchers to replicate the apparently simple production of mouse STAP cells – so far with little success. Comments posted last week then suggested that some of the figures in one of the Nature papers were duplicated from seemingly different experiments in the lead author’s doctoral thesis. Not long after, the Wall Street Journal reported that the senior co-author from RIKEN (a leading Japanese research institute) had asked for the papers to be retracted, though a co-author at Harvard continues to support the work. Nature also reported on the press conference with RIKEN, which announced the findings of an interim investigation but has not made any decision about retracting the papers. The story rumbles on – this week, RIKEN withdrew the original press release about the paper, stating: “The reports described in the CDB news story ‘STAP cells overturn the pluripotency paradigm’ published on Jan 30, 2014 are currently under investigation. Due to a loss of confidence in the integrity of these publications, RIKEN CDB has withdrawn press release from its website.”

The question here is not just whether the papers in Nature hold up but whether commenting platforms like PubPeer and blogs provide a valid means to scrutinise bold claims. While there are cases where genuine issues are identified, there is also concern about the potential for personal attacks, particularly given the anonymity that some of these platforms provide.

This is especially important given the increasing number of times pre-publication review actually fails. In a recent and related post in Nature, Richard van Noorden discusses how to choose among the many venues now available for researchers to discuss work after publication, focusing on a discussion of the same papers by Kenneth Lee on ResearchGate. ResearchGate provided Lee with a more structured review format, which it is calling Open Review. PLOS Labs has also recently set up a similar initiative, called ‘Open Evaluation’. As PubPeer conclude at the end of the post (main link): “Science is now able to self-correct instantly. Post-publication peer review is here to stay.”

How is it possible that Elsevier are still charging for copies of open-access articles?

March 11: Mike Taylor provides a run-down of the charges that Elsevier still try to apply for reusing their ‘open access’ articles. Apparently, it’s all down to a technical problem that Elsevier has been trying to fix for a couple of years now. And this week Peter Murray-Rust published a letter he received from Elsevier’s Director of Access and Policy, Alicia Wise, about how they have now taken steps to fix the problem and compensate individuals who were mis-sold open access products.

Should we eliminate the Impact Factor?

March issue: An interesting take on the impact factor from the Editor-in-Chief of BioTechniques, Nathan Blow. He reviews the pros and cons of the impact factor versus other article-level metrics and concludes that there is an equal danger of misuse if researchers get wedded to any single alternative metric. And he thinks they will, because scientists need something on which to base their publishing decisions. What he ends up calling for is a bit more sense in the way we use metrics, and a bit less laziness: “we need to change the way scientists view such metrics: While it might be good to publish in a top tier journal with an Impact Factor of 30—if your article only gets 2 citations, what does this mean? And the opposite is also true—if the journal has an Impact Factor of 2, but your article receives 500 citations in 2 years, should you be penalized for where you publish? And fundamentally, what does it mean to get 2 versus 500 citations? The validity of any statistic or analysis tool depends on careful and appropriate application by an informed user. Maybe scientists need to look beyond sheer numbers towards the “community” impact of their studies. Here, network analysis showing the reach of an article based on a deeper citation analysis might provide stronger insights into its impact. Tenure committee members also need to look beyond the simple “30-versus-2” Impact Factor debate and use their experience and knowledge to see the true contribution that a scientist is making to their field and beyond—you cannot ask a young scientist to do something that you are not willing to do yourself! In the end, measures such as the Impact Factor are only “lazy” statistics because we make them lazy.”

While I agree with much of what he says, I think he omits another factor that scientists should consider when they choose where to publish, and that’s the service the journal (or platform) provides. Are there checks and balances that ensure your work is properly reported? What sort of peer-review service do they have? Does the publisher help ensure that the data underlying the paper’s findings are available? Is the metadata in a form that means your article, or the components within it, can be found by anyone interested, regardless of subject or geographic location? And can the content be reused easily if others do find it? Making sure your work can be validated, disseminated, searched and reused is what really counts. The metrics will follow.
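As a purely illustrative aside on the “deeper citation analysis” Blow alludes to, here is a minimal sketch (Python with networkx, on a made-up toy citation graph) of the difference between a raw citation count and a crude “reach” measure that also counts citations of citations. All article identifiers are invented for the example; real analyses would of course use a full citation database.

```python
import networkx as nx

# Toy citation graph: an edge (X, Y) means "X cites Y".
# The paper names are hypothetical, for illustration only.
citations = [
    ("paper_B", "paper_A"), ("paper_C", "paper_A"),
    ("paper_D", "paper_B"), ("paper_E", "paper_B"),
    ("paper_F", "paper_D"), ("paper_G", "paper_C"),
]

G = nx.DiGraph(citations)

def citation_reach(graph, article):
    """All papers that cite `article` directly or indirectly (citations of citations)."""
    # ancestors() returns every node with a directed path to `article`
    return nx.ancestors(graph, article)

direct = G.in_degree("paper_A")       # simple citation count: 2
reach = citation_reach(G, "paper_A")  # 6 papers: B-G all trace back to A

print(f"Direct citations of paper_A: {direct}")
print(f"Citation reach of paper_A: {len(reach)} papers: {sorted(reach)}")
```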

Is a Rational Discussion of Open Access Possible?

Image by Vaguery (CC BY 2.0)

March 10: A dedicated blog set up by Rick Anderson to host the slides and transcripts of a talk he gave at the Smithsonian Libraries. Rick is perhaps better known as a chef at the Scholarly Kitchen (where he’s posted a link to the video of his talk). Both blogs have a lively set of comments – largely supportive ones from Mike Taylor, for example, even though he was criticised in the talk, and some insights from Jan Velterop (who started BMC with Vitek Tracz).

Altmetrics could enable scholarship from developing countries to receive due recognition

March 10: A fantastic post by Juan Pablo Alperin on the potential impact of altmetrics for researchers in developing countries. One of the issues he raises is the perception that researchers in developing countries don’t produce as much research; this is largely because the research they do produce is not represented in abstracting and indexing services such as Thomson Reuters’ Web of Science. This means that the work doesn’t get cited as much and the journals don’t even gain entry into the notorious impact factor game. Resources like SciELO are trying to redress this balance but are still working with only a subset of the 5,000+ regional journals (largely from South America). He provides a striking image (below) of the world scaled by the number of papers in Web of Science by authors actually living in each country, which puts the lack of representation of these countries into stark relief.

World Scaled Image by Juan Pablo Alperin (CC BY)

But whether altmetrics can help redress this balance is open to question. The potential is huge but, to realise the promise, he argues, altmetrics (and the ALM community more generally) need to engage with scholars from developing regions. He cautions that if altmetrics are used as just another means to rank scholars then the danger is that they will evolve to cater only for those in locations where they are most heavily used (i.e. not in developing countries). However, he is part of the Public Knowledge Project, which is working with PLOS’s ALM application to provide a free altmetrics service to journals run and hosted in developing countries (via the OJS platform). “As the field begins starting to consolidate, I remain optimistically pro-altmetrics for developing regions, and I have faith in the altmetrics community to serve all scholars. Which directions altmetrics should go, how they should be used, or how the tools should be implemented is not for me to prescribe, but if we exclude (or do not seek to include) scholars from developing regions, altmetrics will become another measure from the North, for the North. And we already know that story.”

Dubiously online

March 08: An article about the need to police open access journals in India by an Indian academic, Rudrashis Datta: “Unless the higher education authorities step in to stem the rot, serious open access publishing, so important in the Indian academic context, runs the risk of dying a natural death, leaving serious researchers and academics without the advantage of having to showcase their scholarly contributions to readers across the world.”

The price of publishing with arXiv

March 05: A mathematician discusses the advantages and disadvantages of publishing in arXiv: “The advantage: I had a lot of fun. I wrote articles which contain more than one idea, or which use more than one field of research. I wrote articles on subjects which genuinely interest me, or articles which contain more questions than answers. I wrote articles which were not especially designed to solve problems, but to open ones. I changed fields, once about 3-4 years.” … “The price: I was told that I don’t have enough published articles…”

“But, let me stress this, I survived. And I still have lots of ideas, better than before, and I’m using dissemination tools (like this blog) and I am still having a lot of fun.”

Making it Free, Making it Open: Crowdsourced transcription project leads to unexpected benefits to digital research

Image by Ewan Munro (CC BY-SA)

March 03: Melissa Terras, Professor of Digital Humanities at University College London, discusses how they used crowdsourcing to transcribe the writings of the philosopher and reformer Jeremy Bentham. The ‘Transcribe Bentham’ site is hosted by UCL. It is a fantastic project and seems likely to follow the success of similar crowdsourcing initiatives in the sciences like Galaxy Zoo. As Melissa notes, “This week we hit over 7000 manuscripts transcribed via the Transcription Desk, and a few months ago we passed the 3 million words of transcribed material mark. So we now have a body of digital material with which to work, and make available, and to a certain extent play with. We’re pursuing various research aims here – from both a Digital Humanities side, and a Bentham studies side, and a Library side, and Publishing side. We’re working on making canonical versions of all images and transcribed texts available online. Students in UCL Centre for Publishing are (quite literally) cooking up plans from what has been found in the previously untranscribed Bentham material, unearthed via Transcribe Bentham. What else can we do with this material?” And there are lots of doors opening for them too – such as looking into Handwritten Text Recognition (HTR) technologies.

Experiment in open peer review for books suggests increased fairness and transparency in feedback process

Feb 28: Hazel Newton, the Head of Digital Publishing at Palgrave Macmillan, describes their current peer review pilot investigating how open feedback functions in monograph publishing, and gathers feedback from authors involved in the project. It’s great to see open peer review experiments in the humanities as well as the sciences.

Opens Roundup (March 25) by PLOS Opens, unless otherwise expressly stated, is licensed under a Creative Commons Attribution 4.0 International License.

