Closed editorialbot closed 11 months ago
Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.
For a list of things I can do to help you, just type:
@editorialbot commands
For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:
@editorialbot generate pdf
Software report:
github.com/AlDanial/cloc v 1.88 T=0.11 s (775.3 files/s, 281992.0 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
JavaScript                      15           2433           2497           9214
HTML                            19           1540             54           7508
SVG                              1              0              0           2671
Python                          19            600            783           1206
CSS                              4            185             35            762
XML                              1              0              2            711
TeX                              1             18              0            350
reStructuredText                12            161            106            293
YAML                             9             31             47            228
Markdown                         5             58              0            145
TOML                             1              0              1              3
-------------------------------------------------------------------------------
SUM:                            87           5026           3525          23091
-------------------------------------------------------------------------------
gitinspector failed to run statistical information for the repository
Wordcount for paper.md is 1061
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):
OK DOIs
- 10.1016/B978-1-55860-247-2.50037-1 is OK
- 10.1007/3-540-57868-4_57 is OK
- 10.1016/J.CJCA.2021.09.004 is OK
- 10.48550/arxiv.1602.04938 is OK
- 10.48550/arxiv.2010.07389 is OK
- 10.1613/JAIR.1.12228 is OK
- 10.1109/ACCESS.2020.2976199 is OK
- 10.1145/3351095.3375624 is OK
- 10.3389/FDATA.2021.688969 is OK
- 10.1109/iccv.2017.74 is OK
- 10.5281/ZENODO.6344451 is OK
- 10.1109/MCSE.2007.55 is OK
- 10.1038/s41586-020-2649-2 is OK
MISSING DOIs
- 10.1109/cvpr.2017.354 may be a valid DOI for title: Network Dissection: Quantifying Interpretability of Deep Visual Representations
INVALID DOIs
- None
@hbaniecki, @aksholokhov – This is the review thread for the paper. All of our communications will happen here from now on.
Please read the "Reviewer instructions & questions" in the first comment above. Please create your checklist by typing:
@editorialbot generate my checklist
As you go over the submission, please check any items that you feel have been satisfied. There are also links to the JOSS reviewer guidelines.
The JOSS review is different from most other journals. Our goal is to work with the authors to help them meet our criteria instead of merely passing judgment on the submission. As such, the reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention https://github.com/openjournals/joss-reviews/issues/5873
so that a link is created to this thread (and I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread. In my experience, it is better to post comments/questions/suggestions as you come across them instead of waiting until you've reviewed the entire package.
We aim for the review process to be completed within about 4-6 weeks, but please make a start well ahead of this, as JOSS reviews are by their nature iterative, and any early feedback you may be able to provide to the author will be very helpful in meeting this schedule.
Five most similar historical JOSS papers:
Yellowbrick: Visualizing the Scikit-Learn Model Selection Process
Reviewers: @mnarayan
Similarity score: 0.7650
Feature-engine: A Python package for feature engineering for machine learning
Reviewers: @Jose-Augusto-C-M, @papachristoumarios, @bobturneruk
Similarity score: 0.7590
pysr3: A Python Package for Sparse Relaxed Regularized Regression
Reviewers: @blakeaw, @mhu48
Similarity score: 0.7583
Sensie: Probing the sensitivity of neural networks
Reviewers: @ejhigson, @omshinde
Similarity score: 0.7542
High-performance neural population dynamics modeling enabled by scalable computational infrastructure
Reviewers: @richford, @tachukao
Similarity score: 0.7530
⚠️ Note to editors: If these papers look like they might be a good match, click through to the review issue for that paper and invite one or more of the authors before considering asking the reviewers of these papers to review again for JOSS.
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
Apart from minor issues with documentation and examples, I have the following major concerns about this contribution:
I am open to discussion and hope the software paper can be improved to clearly state the motivation and effort.
References (non-exhaustive list)
Quality of writing
@arfon my full review is here.
Thanks for your reviews @hbaniecki and @aksholokhov. @enricgrau – please take a look at the feedback from both reviewers and share your responses here. Of particular focus should be a response to @hbaniecki's feedback here: https://github.com/openjournals/joss-reviews/issues/5873#issuecomment-1761368406
Thanks to @hbaniecki and @aksholokhov for the impeccable reviews. We have been working on all of your comments and concerns over the past several days, and we hope to address and respond to all the raised points within the next couple of weeks. Thank you @arfon for your attention to this review. We are excited to show how much the article and documentation have improved once we finish with the corrections.
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
@arfon We have modified the paper.md and asked the editorialbot to re-generate the pdf, but it is rendering the same old version. Can you help us with this? Shall we wait longer before re-generating? Thank you!
Edit: Could this be due to the version change from v0.3.0 to v0.3.2?
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
@editorialbot set v0.3.2 as version
I'm sorry @enricgrau, I'm afraid I can't do that. That's something only editors are allowed to do.
@arfon We have responded to @aksholokhov and @hbaniecki in issues https://github.com/pudu-py/pudu/issues/4 and https://github.com/pudu-py/pudu/issues/3. For now, we ask you to check the preview of the new, revised paper.md found in the repository here. The new pdf should be generated from this same file, so we hope this causes no problems in the review process. Thank you all again for your valuable time and have a great weekend! 😄
@arfon We have modified the paper.md and asked the editorialbot to re-generate the pdf but it is rendering the same old version. Can you help us with this? Shall we wait more time to re-generate? Thank you!
I think this might be happening because you now have two paper.md files in the repository. @editorialbot will simply compile the first one it finds. Could you delete the one you do not want to be compiled?
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
@arfon We have modified the paper.md and asked the editorialbot to re-generate the pdf but it is rendering the same old version. Can you help us with this? Shall we wait more time to re-generate? Thank you!
I think this might be happening because you now have two paper.md files in the repository. @editorialbot will simply compile the first one it finds. Could you delete the one you do not want to be compiled?
That did the trick. Thank you!
Hi, I believe the authors did a good job of improving the software/paper. My remaining comments are minor (see https://github.com/pudu-py/pudu/issues/4#issuecomment-1794775137).
I can recommend acceptance of the {pudu} software paper to JOSS.
@arfon The authors addressed my feedback in full and I can recommend the acceptance of the {pudu} paper to JOSS as well.
@enricgrau – looks like we're very close to being done here. I will circle back here next week, but in the meantime, please give your own paper a final read to check for any potential typos etc.
After that, could you make a new release of this software that includes the changes that have resulted from this review? Then, please make an archive of the software in Zenodo/figshare/other service and update this thread with the DOI of the archive. For the Zenodo/figshare archive, please make sure that:
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
@arfon I have made a final revision and created the Zenodo archive with the final version. I changed the title to match the paper and added all the authors. The DOI is 10.5281/zenodo.10161346
@arfon Just friendly reminder. Thank you!
@enricgrau – my apologies, somehow I lost track of this one!
@editorialbot set 10.5281/zenodo.10161346 as archive
Done! archive is now 10.5281/zenodo.10161346
@editorialbot recommend-accept
Attempting dry run of processing paper acceptance...
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):
OK DOIs
- 10.1016/S0924-2031(03)00045-6 is OK
- 10.3390/analytica3030020 is OK
- 10.1038/srep19414 is OK
- 10.1201/9781003328513-9 is OK
- 10.1016/J.ECOENV.2022.114405 is OK
- 10.1002/9781119763406.CH8 is OK
- 10.1038/s41524-022-00884-7 is OK
- 10.1016/B978-1-55860-247-2.50037-1 is OK
- 10.1007/3-540-57868-4_57 is OK
- 10.1016/J.CJCA.2021.09.004 is OK
- 10.48550/arxiv.1602.04938 is OK
- 10.48550/arxiv.2010.07389 is OK
- 10.1613/JAIR.1.12228 is OK
- 10.1109/ACCESS.2020.2976199 is OK
- 10.1145/3351095.3375624 is OK
- 10.3389/FDATA.2021.688969 is OK
- 10.1109/iccv.2017.74 is OK
- 10.5281/ZENODO.6344451 is OK
- 10.1109/cvpr.2017.354 is OK
- 10.1109/MCSE.2007.55 is OK
- 10.1038/s41586-020-2649-2 is OK
- 10.1145/3313831.3376219 is OK
- 10.21105/JOSS.05220 is OK
- 10.1145/3351095.3375624 is OK
- 10.3389/FDATA.2021.688969 is OK
- 10.1002/AENM.202103163 is OK
- 10.1039/d1ta01299a is OK
- 10.5281/ZENODO.4743323 is OK
MISSING DOIs
- 10.21203/rs.3.rs-2963888/v1 may be a valid DOI for title: The Disagreement Problem in Explainable Machine Learning: A Practitioner’s Perspective
INVALID DOIs
- 10.1116/1.5140587/247679 is INVALID
- 10.1103/REVMODPHYS.79.353/FIGURES/62/MEDIUM is INVALID
:warning: Error preparing paper acceptance. The generated XML metadata file is invalid.
ID ref-Bhatt2020 already defined
ID ref-Belle2021 already defined
@enricgrau – could you check the references in your BibTeX file, please? It looks like there are duplicate entries for Bhatt2020 and Belle2021.
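For reference, a duplicate-key situation like the one reported ("ID ref-Bhatt2020 already defined") typically looks like the sketch below in the .bib file. The entry types and fields here are illustrative placeholders, not the actual references from the paper:

```bibtex
% Two entries sharing the key "Bhatt2020": the metadata XML generated
% at acceptance time then defines ref-Bhatt2020 twice and is rejected.
@article{Bhatt2020,
  title  = {...},
  author = {...},
  year   = {2020},
}

@inproceedings{Bhatt2020,   % duplicate key: delete this entry or rename its key
  title  = {...},
  author = {...},
  year   = {2020},
}
```

Keeping exactly one entry per key (and doing the same for Belle2021) resolves the "already defined" error.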
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
@arfon No problem :) I fixed the DOIs and also deleted the duplicate entries. Thank you!
@editorialbot recommend-accept
Attempting dry run of processing paper acceptance...
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):
OK DOIs
- 10.1016/S0924-2031(03)00045-6 is OK
- 10.3390/analytica3030020 is OK
- 10.1116/1.5140587 is OK
- 10.1038/srep19414 is OK
- 10.1201/9781003328513-9 is OK
- 10.1016/J.ECOENV.2022.114405 is OK
- 10.1002/9781119763406.CH8 is OK
- 10.1103/REVMODPHYS.79.353 is OK
- 10.1038/s41524-022-00884-7 is OK
- 10.1016/B978-1-55860-247-2.50037-1 is OK
- 10.1007/3-540-57868-4_57 is OK
- 10.1016/J.CJCA.2021.09.004 is OK
- 10.48550/arxiv.1602.04938 is OK
- 10.48550/arxiv.2010.07389 is OK
- 10.1613/JAIR.1.12228 is OK
- 10.1109/ACCESS.2020.2976199 is OK
- 10.3389/FDATA.2021.688969 is OK
- 10.1109/iccv.2017.74 is OK
- 10.5281/ZENODO.6344451 is OK
- 10.1109/cvpr.2017.354 is OK
- 10.1109/MCSE.2007.55 is OK
- 10.1038/s41586-020-2649-2 is OK
- 10.1145/3313831.3376219 is OK
- 10.21105/JOSS.05220 is OK
- 10.1145/3351095.3375624 is OK
- 10.1002/AENM.202103163 is OK
- 10.1039/d1ta01299a is OK
- 10.5281/ZENODO.4743323 is OK
MISSING DOIs
- 10.21203/rs.3.rs-2963888/v1 may be a valid DOI for title: The Disagreement Problem in Explainable Machine Learning: A Practitioner’s Perspective
INVALID DOIs
- None
:wave: @openjournals/dsais-eics, this paper is ready to be accepted and published.
Check final proof :point_right::page_facing_up: Download article
If the paper PDF and the deposit XML files look good in https://github.com/openjournals/joss-papers/pull/4826, then you can now move forward with accepting the submission by compiling again with the command @editorialbot accept
Submitting author: @enricgrau (Enric Grau-Luque)
Repository: https://github.com/pudu-py/pudu
Branch with paper.md (empty if default branch): main
Version: 0.3.0
Editor: @arfon
Reviewers: @hbaniecki, @aksholokhov
Archive: 10.5281/zenodo.10161346
Status
Reviewers and authors:
Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)
Reviewer instructions & questions
@hbaniecki & @aksholokhov, your review will be checklist based. Each of you will have a separate checklist that you should update when carrying out your review. First of all you need to run this command in a separate comment to create the checklist:
The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @arfon know.
✨ Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest ✨
Checklists
📝 Checklist for @hbaniecki
📝 Checklist for @aksholokhov