nuest opened this issue 2 years ago
oo, great find!
On Thu, Feb 24 2022, Daniel Nüst wrote:
https://www.nature.com/articles/s41597-022-01143-6
Peer review: a flawed process at the heart of science and journals
The Peer-Review Crisis, by Colleen Flaherty, June 13, 2022.
The peer-review system, which relies on unpaid volunteers, has long been stressed. COVID-19 is making it worse—a lot worse. Possible solutions include paying reviewers or limiting revise-and-resubmits, but are these just Band-Aids on bigger structural problems?
Reproducibility standards for machine learning in the life sciences - Benjamin J. Heil, Michael M. Hoffman, Florian Markowetz, Su-In Lee, Casey S. Greene & Stephanie C. Hicks - Nature Methods (2021)
To make machine-learning analyses in the life sciences more computationally reproducible, we propose standards based on data, model and code publication, programming best practices and workflow automation. By meeting these standards, the community of researchers applying machine-learning methods in the life sciences can ensure that their analyses are worthy of trust.
https://www.nature.com/articles/s41592-021-01256-7
Comment on this by Ben M.:
The authors propose three standards (gold, silver, and bronze) based on data, code, and model availability. Looks like a useful tool to help with peer review of papers using ML.
IMO: the classification could be useful to clearly communicate what a CODECHECK did (not) check.
https://www.brainiacsjournal.org/arc/pub/Craig2022MMEEPR
Scientists who engage in science and the scientific endeavor should seek truth with conviction of morals and commitment to ethics. While the number of publications continues to increase, the number of retractions has increased at a faster rate. Journals publish fraudulent research papers despite claims of peer review and adherence to publishing ethics. Nevertheless, appropriate ethical peer review will remain a gatekeeper when selecting research manuscripts in scholarly publishing and approving research applications for grant funding. However, this peer review must become more open, fair, transparent, equitable, and just with new recommendations and guidelines for reproducible and accountable reviews that support and promote fair citation and citational justice.

We should engineer this new peer-review process with modern informatics technology and information science to provide and defend better safeguards for truth and integrity, to clarify and maintain the provenance of information and ideas, and to rebuild and restore trust in scholarly research institutions. Indeed, this new approach will be necessary in the current post-truth era to counter the ease and speed with which mis-information, dis-information, anti-information, caco-information, and mal-information spread through the internet, web, news, and social media.

The most important question for application of new peer-review methods to these information wars should be 'Who does what when?' in support of reproducible and accountable reviews. Who refers to the authors, reviewers, editors, and publishers as participants in the review process. What refers to disclosure of the participants' identities, the material content of author manuscripts and reviewer commentaries, and other communications between authors and reviewers. When refers to tracking the sequential points in time for which disclosure of whose identity, which content, and which communication at which step of the peer-review process for which audience of readers and reviewers.

We believe that quality peer review, and peer review of peer review, must be motivated and maintained by elevating their status and prestige to an art and a science. Both peer review itself and peer-review analyses of peer reviews should be incentivised by publishing peer reviews as citable references separately from the research report reviewed, while cross-referenced and cross-linked to the report reviewed.
https://www.biorxiv.org/content/10.1101/2022.10.11.511695v2.full
A workflow reproducibility scale for automatic validation of biological interpretation results Hirotaka Suetake, Tsukasa Fukusato, Takeo Igarashi, Tazro Ohta https://doi.org/10.1101/2022.10.11.511695
https://researchonresearch.org/tpost/4vn3fxo0d1-the-future-of-peer-review-is-being-pulle
https://doi.org/10.31235/osf.io/8hdxu
Kaltenbrunner, W., Pinfield, S., Waltman, L., Woods, H. B., & Brumberg, J. (2022, January 22). Innovating peer review, reconfiguring scholarly communication: An analytical overview of ongoing peer review innovation activities. https://doi.org/10.31235/osf.io/8hdxu
Another key finding was differences in levels of acceptance of a mandatory code-sharing policy between research fields and career stages (as determined by the number of publications respondents had). Medical researchers reported being less likely to submit to the journal if it had a mandatory code-sharing policy, as did researchers with more than 100 publications, whereas researchers with fewer than 20 published papers responded more positively towards submitting to a journal that implemented a code-sharing policy. Other studies have found greater affinity for open research amongst early-career researchers, including a 2021 peer-reviewed survey of Early Career Researchers (ECRs) within the Max Planck Society, which concluded that ECRs seem to hold a generally positive view of open research practices.
When should data and code be made available? Rachel Heyard, Leonhard Held https://doi.org/10.1111/1740-9713.01623
Crüwell, S., Apthorp, D., Baker, B. J., Colling, L., Elson, M., Geiger, S. J., Lobentanzer, S., Monéger, J., Patterson, A., Schwarzkopf, D. S., Zaneva, M., & Brown, N. J. L. (2023). What’s in a Badge? A Computational Reproducibility Investigation of the Open Data Badge Policy in One Issue of Psychological Science. Psychological Science, 0(0). https://doi.org/10.1177/09567976221140828
https://doi.org/10.1162/dint_a_00133
Limor Peer, Claudia Biniossek, Dirk Betz, Thu-Mai Christian; Reproducible Research Publication Workflow: A Canonical Workflow Framework and FAIR Digital Object Approach to Quality Research Output. Data Intelligence 2022; 4 (2): 306–319. doi: https://doi.org/10.1162/dint_a_00133