Over the holidays, I saw an interesting article on F1000Research
Soergel DAW. Rampant software errors undermine scientific results [v1; ref status: approved with reservations 2, http://f1000r.es/4w2] F1000Research 2014, 3:303 (doi: 10.12688/f1000research.5930.1)
and felt inspired to comment on it. The editors asked if I would use my comment as the basis for a review, which I agreed to, leading to my first open peer review:
Katz D. Referee Report For: Rampant software errors undermine scientific results [v1; ref status: approved with reservations 2, http://f1000r.es/4w2] F1000Research 2014, 3:303 (doi: 10.5256/f1000research.6338.r7096)
This made me think about the peer review process, and specifically about what is different in open peer review (where the reviewer is known) vs. traditional closed peer review (where the reviewer is anonymous).
I mentioned this in a tweet, and someone pointed me to
Aleksic J, Alexa A, Attwood TK et al. The Open Science Peer Review Oath [v1; ref status: indexed, http://f1000r.es/4ou] F1000Research 2014, 3:271 (doi: 10.12688/f1000research.5686.1)
which led me to think about this even more, leading to the following, which is also posted as a comment on the Open Science Peer Review Oath paper. Comments are welcome.
Daniel S. Katz
- xiii) I will check that the data, software code and digital object identifiers are correct, and the models presented are archived, referenced, and accessible
- xiv) I will comment on how well you have achieved transparency, in terms of materials and methodology, data and code access, versioning, algorithms, software parameters and standards, so that your experiments can be repeated independently
The idea of an open review here seems to be that, in addition to ensuring integrity in the review process, there is also extra work being done beyond a standard review, and the person who does such work should be credited for doing so.
I am uncertain about where elements xv and xvi, as written:
- xv) I will encourage deposition with long-term unrestricted access to the data that underpin the published concept, towards transparency and re-use;
- xvi) I will encourage central long-term unrestricted access to any software code and support documentation that underpin the published concept, both for reproducibility of results and software availability
would fit. These are neither active nor passive; as written, they don't match the review function but are even more active, more a collaboration than a review. I suggest that they be rephrased as:
- xv) I will check that the data that underpin the published concept are made available in a manner that provides long-term unrestricted access, towards transparency and re-use;
- xvi) I will check that any software code and support documentation that underpin the published concept are made available in a manner that provides long-term unrestricted access, both for reproducibility of results and software availability
so that they could be part of an Active review.
Of course, this specific remedy is just a suggestion, but the overall point I want to make is that the added work to be done by the reviewer, beyond what is now standard, needs to be explicitly considered in both the oath itself and in the description of the oath.