annakrystalli opened 5 years ago
As reproducibility increases, the time and effort needed to reproduce a paper might become negligible (e.g. the click of a binder button) and ideally automated. This opens the door to deeper treatment of the materials, potentially moving towards replications.
In any case, there are currently opportunities for more formalised and useful outputs from ReproHack participants. Some outputs can actually be publishable. This acts as an incentive and, more importantly, recognises the value of the practice and the efforts of participants.
There are currently three options, across two platforms, for directing formal outputs of a ReproHack:
ReScience C is an open-access, peer-reviewed journal that targets computational research and encourages the explicit replication of already published research, promoting new open-source implementations in order to ensure that the original research is reproducible. What they consider a replication:
SciGen.Report is a community-based platform with one mission: to foster communication on reproducibility for the sound development of scientific knowledge. It's a portal that allows reporting the results of attempts to reproduce research and viewing a reproducibility summary for individual papers.
A ReScience C Replication involves publishing the new replication source code alongside a report detailing the procedure and any findings. According to their site, a replication involves:
Replicating an analysis using different software involves a complete rewrite of the analysis. Depending on the complexity of the analysis, this might be prohibitive within a single day, but:
Currently, ReScience C focuses on publishing replications. However:
A replication attempt is most useful if reproducibility has already been verified.
Given this, and given the time constraints that might preclude a full replication of results, ReScience got in touch to propose a "Reproducibility Report". What such a report should contain is still up for discussion. Current items on the list are:
This somewhat reflects the feedback we ask for in the author feedback form, which focuses on reproducibility, reusability and transparency. Perhaps the forms could inform additional considerations to include in the reports.
Because Reproducibility Reports are likely to be much more standardised than Replication reports, templates could perhaps be developed that both guide participants and make it quicker and easier to put a report together in a day.
SciGen.Report is by far the simplest and easiest output to integrate. The submission form is brief and simple to complete, and much of it is already covered by our feedback form. All participants currently need to do is sign up. Additionally, the data it collects seem really useful for meta-research on the state of reproducibility of published work.
More info in the intro slides.