rht opened this issue 7 years ago
Sure, I can upload the plotting scripts. They've bit-rotted a bit, so I'll have to fix them up first. This whole thing was originally an IPython notebook, and I slowly pulled chunks out into separate files.
I just pushed some stuff reorganizing most of the code into a cliqs folder, with run_mindep.py on the outside. Sorry not to go through the pull request process; I'm not quite a GitHub pro yet, so I wasn't sure how to make a pull request on my own repo.
OK, I put in the analysis scripts.
I see, I can almost run mindep_plots here: https://github.com/rht/cliqs/blob/jupyter/mindep_plots.ipynb (or https://nbviewer.jupyter.org/github/rht/cliqs/blob/jupyter/mindep_plots.ipynb). Each plot is chunked into a separate cell. The line

stat_smooth(method="auto", mapping=aes(colour=real)) +

is commented out because otherwise the plots can't be rendered; with it enabled I get:

Error in `$<-.data.frame`(`*tmp*`, "p.less.than", value = "< .001"): replacement has 1 row, data has 0

I wonder if there is a dependency package that needs to be downloaded.
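For what it's worth, that R error usually means a single value is being assigned as a column of a data frame that has zero rows, i.e. the summary table behind the annotation came back empty. A hedged sketch of the same diagnosis in Python/pandas (the names here are mine, not from the repo; pandas itself would not raise this error):

```python
import pandas as pd

# The R message "replacement has 1 row, data has 0" fires when a
# length-1 value is assigned into a zero-row data frame. The likely
# root cause: upstream grouping/filtering returned no rows, so the
# table feeding the plot annotation is empty.
tmp = pd.DataFrame({"estimate": pd.Series(dtype=float)})  # empty summary table
print(len(tmp))  # 0 rows survived the upstream processing

# Guard before annotating, instead of assigning unconditionally:
if len(tmp) > 0:
    tmp["p.less.than"] = "< .001"
else:
    print("empty summary table; check the data feeding stat_smooth")
```

So the first thing to check is probably whether the data frame passed to the plotting call is empty, rather than a missing dependency package.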
(You have to activate the Travis build at https://travis-ci.org/Futrell/cliqs.)
If I were to use one of the classifications of reproducibility in http://ropensci.github.io/reproducibility-guide/sections/introduction/:
[1] https://gigascience.biomedcentral.com/articles/10.1186/s13742-016-0135-4
[2] http://www.sciencedirect.com/science/article/pii/S0167739X16000029
@Futrell, WDYT of publishing the code (maybe in a Jupyter notebook) that was used to create the plots summarizing the post-processed output? What if there were a "reproducibility number" for any paper, incremented whenever a peer validates its result? I haven't fully fleshed out what the sufficient criterion for validating a result should be, or whether there are stages/hierarchies of criteria (perhaps it sits somewhere between verifying a result and falsifying it). At minimum, this should be about checking for systematic bugs, as opposed to attesting whether a discovery is 5-sigma certain. It could complement a rough measure of scientific consensus, e.g. (citation count / size of a field). ...(In short: this is a request for the code for the fancy plots!)