Closed whedon closed 4 years ago
Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @hayesla, @mattpitkin it looks like you're currently assigned to review this paper :tada:.
:warning: JOSS reduced service mode :warning:
Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.
:star: Important :star:
If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that with GitHub's default behaviour you will receive notifications (emails) for all reviews 😿
To fix this do the following two things:
For a list of things I can do to help you, just type:
@whedon commands
For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:
@whedon generate pdf
@ivalaginja, @hayesla, @mattpitkin : this is the review thread for the paper. All of our communications will happen here from now on.
Both reviewers have checklists at the top of this thread with the JOSS requirements. As you go over the submission, please check any items that you feel have been satisfied. There are also links to the JOSS reviewer guidelines.
The JOSS review is different from most other journals. Our goal is to work with the authors to help them meet our criteria instead of merely passing judgment on the submission. As such, the reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention openjournals/joss-reviews#2281
so that a link is created to this thread (and I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread. In my experience, it is better to post comments/questions/suggestions as you come across them instead of waiting until you've reviewed the entire package.
We aim for reviews to be completed within about 2-4 weeks, but that's not a hard deadline. Please let me know if any of you require more time. We can also use Whedon (our bot) to set automatic reminders if you'll be away for a known period of time.
Please feel free to ping me (@xuanxu) if you have any questions/concerns.
@whedon check references
Reference check summary:
OK DOIs
- 10.1038/nature16068 is OK
- 10.1088/0004-637X/814/1/66 is OK
- 10.3847/1538-3881/ab7b78 is OK
- 10.3847/1538-3881/aa9e4e is OK
- 10.1126/science.aah4668 is OK
- 10.1038/s41586-018-0067-5 is OK
- 10.3847/0004-637X/819/1/10 is OK
- 10.1093/mnras/stu1975 is OK
- 10.1086/345520 is OK
- 10.1051/0004-6361/200913675 is OK
- 10.1051/0004-6361/201423804 is OK
- 10.5281/zenodo.2573885 is OK
- 10.1117/12.447161 is OK
- 10.1109/MCSE.2007.55 is OK
- 10.5281/zenodo.2893252 is OK
- 10.5281/zenodo.3644238 is OK
- 10.1109/MCSE.2011.37 is OK
- 10.1051/0004-6361/201322068 is OK
- 10.3847/1538-3881/aabc4f is OK
- 10.5281/zenodo.3378022 is OK
- 10.1088/1538-3873/aa65b0 is OK
- 10.5281/zenodo.157363 is OK
- 10.5281/zenodo.35265 is OK
- 10.1086/683602 is OK
- 10.1088/1538-3873/aaf5ad is OK
MISSING DOIs
- https://doi.org/10.25080/majora-92bf1922-00a may be missing for title: Data structures for statistical computing in python
INVALID DOIs
- None
@ivalaginja The installation instructions work well. I'd just recommend explicitly adding how to activate the conda environment before running setup.py:
```shell
conda activate exoticism
python setup.py develop
```
This isn't a requirement, but are there plans to have continuous integration (e.g., through Travis) that can run your test suite? Do you know how much of the main functionality of the code the current test suite covers?
For some references the journal doesn't seem to render. I think this is due to the paper.bib file containing the AAS macros (e.g., \mnras, \aap) for several papers; these macros should be replaced with the full or abbreviated journal name.
There are also a couple that need the DOI added:
The functionality of the code, and required inputs in the configuration file, are well documented in the tutorial notebook, so I have ticked the Functionality documentation check box. There is no specific documentation for the API, so in the future I would recommend there is an autogenerated doc page that does contain the API, but this is not required for this review.
Once the references are fixed I'm happy to sign-off the review.
Just to summarise and slightly add to what I've written above: I have a couple of general recommendations, but these are not requirements for the review:
One very minor typo fix for the paper: in the first sentence change "has" to "have".
I should just note that I ran the example on a remote machine (my laptop's running low on space, so I had to ssh into my work desktop to clone the repo there!). When I tried running the example it performed the first stage of fitting, but hung when it got to the second. I got things to work by setting plotting = False in the config file, after which it ran through everything OK.
Thanks for all your comments @mattpitkin! To address your questions:

- Our test suite runs with pytest. Current test coverage is very thin, probably in the lower single digits percent-wise. However, we aim to improve on that in the future, which is why we already set up CI, so that all that's left is actually to write more meaningful tests. We have an open issue for this here.
- I will replace the AAS journal macros in paper.bib with the full journal names.
- I have added the missing DOI for the pandas paper, but simply cannot find the DOI for the Claret 2000 paper. I have reached out to an author who recently cited it to ask whether they know the DOI, but I am not sure this will help. I will ask @hrwakeford (co-author) if she knows more. What would be our alternative if we can't find the DOI? I just saw that Doe et al. also doesn't have a DOI, so I'm facing the same problem there.
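Since CI is already set up, a minimal Travis-style configuration for running the pytest suite could look like the sketch below. This is illustrative only: the Python version, the `pip install -e .` step, and the dependency handling are assumptions, not the project's actual configuration.

```yaml
# Illustrative .travis.yml sketch (not the project's actual config):
# install the package in development mode and run the pytest suite.
language: python
python:
  - "3.7"
install:
  - pip install -e .
  - pip install pytest
script:
  - pytest
```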
Thanks for your detailed responses and for opening the issue.
I think you're right that there just aren't DOIs for Claret or Doe et al., so the references are fine without them. I'll tick off that last box for my review.
@whedon check references
Reference check summary:
OK DOIs
- 10.1038/nature16068 is OK
- 10.1088/0004-637X/814/1/66 is OK
- 10.3847/1538-3881/ab7b78 is OK
- 10.3847/1538-3881/aa9e4e is OK
- 10.1126/science.aah4668 is OK
- 10.1038/s41586-018-0067-5 is OK
- 10.3847/0004-637X/819/1/10 is OK
- 10.1093/mnras/stu1975 is OK
- 10.1086/345520 is OK
- 10.1051/0004-6361/200913675 is OK
- 10.1051/0004-6361/201423804 is OK
- 10.5281/zenodo.2573885 is OK
- 10.1117/12.447161 is OK
- 10.1109/MCSE.2007.55 is OK
- 10.5281/zenodo.2893252 is OK
- 10.25080/Majora-92bf1922-00a is OK
- 10.5281/zenodo.3644238 is OK
- 10.1109/MCSE.2011.37 is OK
- 10.1051/0004-6361/201322068 is OK
- 10.3847/1538-3881/aabc4f is OK
- 10.5281/zenodo.3378022 is OK
- 10.1088/1538-3873/aa65b0 is OK
- 10.5281/zenodo.157363 is OK
- 10.5281/zenodo.35265 is OK
- 10.1086/683602 is OK
- 10.1088/1538-3873/aaf5ad is OK
MISSING DOIs
- None
INVALID DOIs
- None
The reference to Claret 2000 has two sources: an A&A paper and a catalog in VizieR.
Paper: https://ui.adsabs.harvard.edu/abs/2000A%26A...363.1081C/abstract
VizieR: http://vizier.u-strasbg.fr/viz-bin/VizieR?-source=J/A+A/363/1081
The Sherpa package lists the following link as having the correct DOI for citation, but I struggled to navigate to exactly where that might be; this is where to start: https://zenodo.org/record/3631574#.XvC_bi-ZNhE
You're right about Sherpa; however, that is only for the package itself, for which we already added the DOI. What's missing is the DOI for the paper they want us to cite.
As for Claret, it really seems like there's no DOI to be found :P
I've checked my last tick box, so my sign off is complete.
my apologies for my delay - I will work on this today!
@whedon generate pdf
I think the pandas 2020 reference should be:

```bibtex
@software{jeff_reback_2020_3644238,
  author    = {Jeff Reback and
               Wes McKinney and
               jbrockmendel and
               Joris Van den Bossche and
               Tom Augspurger and
               Phillip Cloud and
               gfyoung and
               Sinhrks and
               Adam Klein and
               Matthew Roeschke and
               Jeff Tratner and
               Chang She and
               Simon Hawkins and
               William Ayd and
               Terji Petersen and
               Jeremy Schendel and
               Andy Hayden and
               Marc Garcia and
               MomIsBestFriend and
               Vytautas Jancauskas and
               Pietro Battiston and
               Skipper Seabold and
               chris-b1 and
               h-vetinari and
               Stephan Hoyer and
               Wouter Overmeire and
               alimcmaster1 and
               Mortada Mehyar and
               Kaiqi Dong and
               Christopher Whelan},
  title     = {pandas-dev/pandas: Pandas 1.0.1},
  month     = feb,
  year      = 2020,
  publisher = {Zenodo},
  version   = {v1.0.1},
  doi       = {10.5281/zenodo.3644238},
  url       = {https://doi.org/10.5281/zenodo.3644238}
}
```
Overall looks good to me. The Quickstart guide worked well, and I was able to go through the example notebook easily after this and play around with the codebase.
Once the reference (above) is fixed I have nothing blocking, just some suggested comments:
I agree with @mattpitkin: a specific description of the API, or even just an overview in the README, would be good to make it clear what is available. And of course something like automated documentation of the API, maybe via https://readthedocs.org/, is something to consider for the future.
For the Community Guidelines I would suggest adding a few sentences to the README that explicitly mention where users can report issues or problems and seek support, maybe under a separate heading?
The tutorial notebook is great and is well documented to describe the functionality of the code and apply it to some nice examples. I would certainly suggest moving this to some documentation somewhere else in the future also.
Future continuous integration functionality would be great to see.
@hayesla thank you for your comments! I fully agree with you on all you said, and we will work in the near future to get these things set up; issues for this already exist in the repository.
As for the reference, thanks for pointing it out. I will certainly update the author list; however, the DOI and the version wouldn't work for our paper, since we used pandas version v0.24.2, for which there is no explicit Zenodo entry. My idea was to use the concept DOI instead, which always points to the most recent release, since there is no DOI for the particular version we use. I could then adjust the version number in the BibTeX manually.
Would that be an acceptable solution?
I'm not too sure what you mean - the DOI listed in the paper (team, T. pandas development. (2020). Pandas-dev/pandas: Pandas 1.0.1. Zenodo. doi:10.5281/zenodo.3644238) points to https://zenodo.org/record/3644238#.XvNnWGpKhTY, which has a BibTeX entry in the bottom left-hand corner.
Maybe I am misunderstanding?
If you look on the right side of that very page, in the box labelled "Versions", you will see a list of all the published versions of pandas. Each version has a unique DOI that should be cited depending on the version you use. We are using a version that was not separately published (0.24.2), so it has no unique DOI. At the bottom of this box the "concept DOI" is shown, with additional information on how that is handled. Since there is no DOI for the version we use, I thought it would be better to cite the concept DOI rather than wrongly point to a version we are not actually using?
Yes - so I guess the issue is that the DOI currently in the paper is doi:10.5281/zenodo.3644238, whereas the "cite all versions" DOI from the box you are referring to is https://doi.org/10.5281/zenodo.3509134, which is different?
Sorry about my confusion with this.
Maybe @xuanxu can clarify?
The “concept” DOI is 10.5281/zenodo.3509134. I think that's the one you want to use here. In the environment.yml file there is no version specified for Pandas, so it makes sense to use the DOI for “all versions”.
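For illustration, the updated BibTeX entry might then look roughly like the following. The citation key and the collective author form are placeholder choices; only the concept DOI (10.5281/zenodo.3509134) and the version actually used (v0.24.2) come from this thread.

```bibtex
% Sketch only: key and author form are placeholders.
@software{pandas_concept_doi,
  author    = {{The pandas development team}},
  title     = {pandas-dev/pandas},
  publisher = {Zenodo},
  year      = {2020},
  version   = {v0.24.2},
  doi       = {10.5281/zenodo.3509134},
  url       = {https://doi.org/10.5281/zenodo.3509134}
}
```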
ok great thanks @xuanxu - once this is updated my sign off is complete 🚀
Thanks a lot to both - I will update asap and let you know when it's ready.
@hayesla I have updated the reference, thanks for pointing out the inconsistency!
@whedon generate pdf
@whedon check references
Reference check summary:
OK DOIs
- 10.1038/nature16068 is OK
- 10.1088/0004-637X/814/1/66 is OK
- 10.3847/1538-3881/ab7b78 is OK
- 10.3847/1538-3881/aa9e4e is OK
- 10.1126/science.aah4668 is OK
- 10.1038/s41586-018-0067-5 is OK
- 10.3847/0004-637X/819/1/10 is OK
- 10.1093/mnras/stu1975 is OK
- 10.1086/345520 is OK
- 10.1051/0004-6361/200913675 is OK
- 10.1051/0004-6361/201423804 is OK
- 10.5281/zenodo.2573885 is OK
- 10.1117/12.447161 is OK
- 10.1109/MCSE.2007.55 is OK
- 10.5281/zenodo.2893252 is OK
- 10.25080/Majora-92bf1922-00a is OK
- 10.5281/zenodo.3509134 is OK
- 10.1109/MCSE.2011.37 is OK
- 10.1051/0004-6361/201322068 is OK
- 10.3847/1538-3881/aabc4f is OK
- 10.5281/zenodo.3378022 is OK
- 10.1088/1538-3873/aa65b0 is OK
- 10.5281/zenodo.157363 is OK
- 10.5281/zenodo.35265 is OK
- 10.1086/683602 is OK
- 10.1088/1538-3873/aaf5ad is OK
MISSING DOIs
- None
INVALID DOIs
- None
@ivalaginja I have opened a couple of pull requests in the ExoTiC-ISM repo with minor changes. Please take a look at them and consider merging them if you think they are OK.
@xuanxu I had a look at them, double-checked and merged.
OK @ivalaginja, everything looks good now, here are the next steps:
Once you do that please report here the version number and archive DOI
v2.0.0 10.5281/zenodo.3923986
@whedon set v2.0.0 as version
OK. v2.0.0 is the version.
@whedon set 10.5281/zenodo.3923986 as archive
OK. 10.5281/zenodo.3923986 is the archive.
Thanks @hayesla and @mattpitkin for your reviews!
@whedon accept
Attempting dry run of processing paper acceptance...
:wave: @openjournals/joss-eics, this paper is ready to be accepted and published.
Check final proof :point_right: https://github.com/openjournals/joss-papers/pull/1522
If the paper PDF and Crossref deposit XML look good in https://github.com/openjournals/joss-papers/pull/1522, then you can now move forward with accepting the submission by compiling again with the flag deposit=true
e.g.
@whedon accept deposit=true
👋 @ivalaginja - here are some suggested changes for the paper:
In addition, please fix the casing in the references, for example "Astronomical data analysis software and systems xvi, Astronomical society of the pacific conference series". You will need to do this by protecting the case-sensitive words (with {}s) in the .bib file. A number of journals also need to be fixed, as we don't know or expand standard astro abbreviations, such as aap, apj, apjl, mnras, etc.
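For example, protecting case-sensitive words with braces and writing out a journal macro might look like this (a hypothetical entry for illustration only; the actual fields come from the paper's .bib file):

```bibtex
% Hypothetical entry, for illustration only.
% {ExoTiC-ISM} and {HST} are brace-protected so their casing survives;
% the \mnras macro is replaced by the journal's full name.
@article{doe_hypothetical,
  author  = {Doe, J.},
  title   = {{ExoTiC-ISM} fits to {HST} transit light curves},
  journal = {Monthly Notices of the Royal Astronomical Society},
  year    = {2020}
}
```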
Once you've done this, please issue @whedon generate pdf (and iterate if needed), then let me know when it's ready to go.
@ivalaginja - this is just a reminder that we are waiting on the changes to the paper and bib as requested in the issue above, then your paper can be published
Thanks for the reminder! I did not forget; as a matter of fact it's at the top of my work todo list, I was just relocating this week so I am a little short on time. I should be able to get to it this weekend or Monday!
Submitting author: @ivalaginja (Iva Laginja) Repository: https://github.com/hrwakeford/ExoTiC-ISM Version: v2.0.0 Editor: @xuanxu Reviewer: @hayesla, @mattpitkin Archive: 10.5281/zenodo.3923986
Status
Status badge code:
Reviewers and authors:
Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)
Reviewer instructions & questions
@hayesla & @mattpitkin, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:
The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @xuanxu know.
✨ Please try and complete your review in the next six weeks ✨
Review checklist for @hayesla
Conflict of interest
Code of Conduct
General checks
Functionality
Documentation
Software paper
Review checklist for @mattpitkin
Conflict of interest
Code of Conduct
General checks
Functionality
Documentation
Software paper