Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.
For a list of things I can do to help you, just type:
@editorialbot commands
For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:
@editorialbot generate pdf
Software report:
github.com/AlDanial/cloc v 1.88 T=0.13 s (889.4 files/s, 155588.0 lines/s)
-------------------------------------------------------------------------------
Language                      files          blank        comment           code
-------------------------------------------------------------------------------
Python                           71           2222           2170           6117
Jupyter Notebook                 16              0           5074           1628
Cython                            6            249            204            656
Markdown                          9            150              0            515
TeX                               1             33              0            391
YAML                              6             35             47            170
make                              2             11              8             38
Bourne Shell                      1             14             14             16
TOML                              1              0              0              5
-------------------------------------------------------------------------------
SUM:                            113           2714           7517           9536
-------------------------------------------------------------------------------
gitinspector failed to run statistical information for the repository
Wordcount for paper.md is 1147
@editorialbot generate pdf
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):
OK DOIs
- 10.1051/0004-6361/201322068 is OK
- 10.5281/zenodo.21703 is OK
- 10.1088/0067-0049/192/1/9 is OK
- 10.1145/2929908.2929916 is OK
- 10.21105/joss.02430 is OK
- 10.1145/2833157.2833162 is OK
- 10.48550/arXiv.2307.06345 is OK
- 10.1146/annurev-astro-082812-140951 is OK
- 10.1111/j.1365-2966.2009.15715.x is OK
- 10.1088/0067-0049/192/1/9 is OK
- 10.1109/MCSE.2011.37 is OK
- 10.1038/s41586-020-2649-2 is OK
- 10.1093/mnras/stt428 is OK
- 10.1038/nature03597 is OK
- 10.3847/1538-4365/ab908c is OK
- 10.1109/MCSE.2010.118 is OK
- 10.1038/s41592-019-0686-2 is OK
- 10.1109/MCSE.2007.55 is OK
- 10.25080/Majora-4af1f417-011 is OK
- 10.1109/MCSE.2007.53 is OK
- 10.1038/s42254-019-0127-2 is OK
MISSING DOIs
- 10.1093/mnras/stac1882 may be a valid DOI for title: Hydromagnetic waves in an expanding universe - cosmological MHD code tests using analytic solutions
INVALID DOIs
- None
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
Hi @tberlok, it looks like there is still an issue with a missing DOI for the paper. Also, it may be worth me mentioning that the latest information on how to cite the SWIFT code is available here: https://swift.strw.leidenuniv.nl/docs/CitingSWIFT/index.html.
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
Hi @ttricco and @kyleaoman, I hope the review is progressing nicely. Please, if you can, try to wrap up the review in the next couple of weeks; let me know if you have any questions!
I've completed most of my review and have left a number of github issues open for the authors. My checklist above reflects which areas I feel need a little bit of work (but overall the package is in a good state). The exception is the "Performance" item - I should still try to check the functionality of the GPU-accelerated features. My laptop doesn't have a suitable GPU, but I should be able to test this on our local compute cluster. Will try to do this asap.
Alright, I've added one last issue around the interactive GPU features. My review is complete, so I leave it with the authors to address the GitHub issues that I've opened to their satisfaction. Happy to iterate on things as needed; just let me know if you (authors/editors) need anything from me in the meantime.
Hi @kyleaoman, thank you so much! Is it possible for you to reference this review thread in those issues, or do that here, so they are linked together automatically by GitHub? This way we can keep track of progress.
https://github.com/tberlok/paicos/issues/47
https://github.com/tberlok/paicos/issues/48
https://github.com/tberlok/paicos/issues/49
https://github.com/tberlok/paicos/issues/50
https://github.com/tberlok/paicos/issues/51
https://github.com/tberlok/paicos/issues/52
https://github.com/tberlok/paicos/issues/53
https://github.com/tberlok/paicos/issues/54
https://github.com/tberlok/paicos/issues/55
https://github.com/tberlok/paicos/issues/56
@JBorrow that works I think?
Yes, thank you!
Just to update on my progress, I have worked through most of the review items, and tested the non-GPU code (various ways to install, unit tests, core functionality, etc). The biggest outstanding items for me are around testing the functionality and performance of the GPU versions of the code, which I'll get to in the next couple of days.
Fantastic, thanks both. @tberlok, do you have an estimate of the timescale on which you will be able to address the reviewers' comments?
I have had a look at the reviewers' comments, which I will start to address in detail tomorrow. I expect/hope to be done at some point next week.
Hi @kyleaoman, @ttricco, @JBorrow
First of all, many thanks to Kyle and Terrence for the detailed comments on Paicos! This was just the sort of feedback I was hoping to get from submitting to JOSS. I believe I have now addressed all the issues that you raised on GitHub, but please let me know if you agree, so that I can close them.
A remaining issue is that you both wanted to test the GPU functionality. @kyleaoman ran into the problem that setting up a notebook + GPU environment is more difficult than simply running a Python script from the terminal. For this reason, I have created a Python script, https://github.com/tberlok/paicos/blob/main/examples/gpu_sph_speed_test_example.py, which I hope will be more straightforward to run. It also now includes instructions for downloading a large data set, without which you can't test the speed of the implementation (the GPU becomes under-utilized for small data sets).
When I ran this script myself, I found that the GPU SPH code is no longer as fast as it used to be. On an A100, it now takes around 0.75 seconds to deposit 93 million particles onto a 4096^2 grid. I will therefore change the wording in the paper from "it takes 0.15 seconds to project 93 mio. particles onto an image plane with $2048^2$ pixels." to "it takes less than a second to project 100 mio. particles onto an image plane with $4096^2$ pixels." I believe it's possible to make this faster and will probably try to optimise it at some point, but for now I would like to go ahead with this version of the code.
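To illustrate the under-utilization point, here is a rough sketch (not the Paicos implementation or its API, just a generic CuPy example with made-up particle data) of how the time for a simple grid deposit scales with particle count; small inputs are dominated by kernel-launch overhead, so the GPU only pays off for large data sets.

```python
# Illustrative only -- NOT the Paicos API. A minimal CuPy sketch of a
# nearest-grid-point deposit, to show how GPU throughput depends on the
# number of particles (small inputs under-utilize the device).
import time

import cupy as cp
import cupyx


def time_deposit(n_particles, n_pixels=4096):
    # Random 2D positions in [0, 1) and unit weights stand in for SPH particles.
    pos = cp.random.random_sample((n_particles, 2), dtype=cp.float32)
    weights = cp.ones(n_particles, dtype=cp.float32)

    # Map positions to pixel indices.
    ix = cp.clip((pos[:, 0] * n_pixels).astype(cp.int64), 0, n_pixels - 1)
    iy = cp.clip((pos[:, 1] * n_pixels).astype(cp.int64), 0, n_pixels - 1)

    image = cp.zeros((n_pixels, n_pixels), dtype=cp.float32)
    cp.cuda.Stream.null.synchronize()  # finish setup before timing
    t0 = time.perf_counter()
    cupyx.scatter_add(image, (iy, ix), weights)  # atomic scatter-add on the GPU
    cp.cuda.Stream.null.synchronize()  # wait for the kernel to complete
    return time.perf_counter() - t0


# Needs a CUDA-capable GPU with a few GB of memory for the largest case.
for n in (10**6, 10**7, 10**8):
    print(f"{n:>11d} particles: {time_deposit(n):.3f} s")
```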
Best, Thomas
Thanks all! Looks like this is coming along nicely.
I will be on vacation until the 24th, so please do not expect responses before then.
Hi folks, just checking in to see if there is anything left to do. @kyleaoman and @ttricco, were you able to verify the requested changes?
Thanks for the check-in @JBorrow. @kyleaoman, @ttricco : I am looking forward to hearing back from you.
Hi all,
I finally managed to try out the interactive GPU visualization widget; it works fine in a Jupyter notebook (not in JupyterLab). I'm satisfied with the changes to the package at this point and have ticked off the rest of my checklist.
@tberlok one remaining comment that I don't think should block publication. Installation is still a bit of a tricky process. Today I discovered that gcc (including recent versions) seems to complain about some of the warning flags and then exits; I eventually got it to work with an Intel compiler.
Many thanks @kyleaoman!
> @tberlok one remaining comment that I don't think should block publication. Installation is still a bit of a tricky process. Today I discovered that gcc (including recent versions) seems to complain about some of the warning flags and then exits; I eventually got it to work with an Intel compiler.
Thanks for letting me know about this; I should definitely have a closer look. It compiles without issues on macOS with gcc-13 from Homebrew. If you still have the error message somewhere, it would be great if you could create an issue. If not, no worries!
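In case it helps anyone who hits the same compiler problem, below is a hypothetical workaround sketch (not an official Paicos instruction, and untested against its build scripts): many setuptools/Cython builds pick up the `CC` environment variable, so pointing it at a different compiler before installing from a local checkout may sidestep the offending warning flags.

```python
# Hypothetical workaround sketch, not an official Paicos instruction.
# Select a different C compiler via the CC environment variable and install
# from a local checkout; "icx" here is only an example compiler name.
import os
import subprocess
import sys

env = os.environ.copy()
env["CC"] = "icx"  # adjust to whatever compiler is available on your system

# Run from the root of the cloned repository.
subprocess.run(
    [sys.executable, "-m", "pip", "install", "."],
    env=env,
    check=True,
)
```

Whether this takes effect depends on how the package's build scripts choose the compiler and flags, so treat it as a starting point rather than a guaranteed fix.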
Hi @JBorrow, are we sure that @ttricco has GitHub notifications enabled? Perhaps you can send him a direct message to make sure? Thanks.
@JBorrow, @ttricco: As I understand it, @kyleaoman has given the green light for the submission, and @ttricco has ticked 19 out of 23 boxes. @ttricco indicated further up that he just needs to test the GPU code, which @kyleaoman has already successfully tested. Please let me know the timeline for the remainder of the review process.
Hi @dfm, sorry for pinging you, but it appears that the review of my paper has gone stale. That is, I have not heard from @ttricco since Feb 28, and it has been three weeks since @JBorrow sent the last reminder. As I understand the JOSS guidelines, editors will normally ask reviewers to complete their review in 4-6 weeks and check in on this 1-2 times per week. So I am worried about what is going on and whether my reminders above have been read. It will also soon be three months since submission (which I understand is the target duration for a JOSS review), and I am finding the silence and uncertainty about timescales difficult. Can you help me get this back on track and/or assure me that my submission has not been forgotten? Thanks!
Hi @tberlok, apologies for my lack of communication here. I have been trying to communicate with the reviewers off-GitHub and have not received a reply; I did this last week and I am waiting until the end of this week before we move on with an alternative strategy. Again, sorry that this was not communicated to you directly, but please rest assured that we are moving things forward in the background.
Hi @JBorrow, thank you very much for this update, that is reassuring to hear.
@JBorrow This week I've managed to circle back and finish my review. I've finished testing the software on two different Linux environments / hardware setups and am satisfied that everything works as described.
Thanks for your patience @tberlok and apologies for the delay on finalizing my review. The end of the teaching term was more overwhelming than I expected.
Fantastic, thank you @ttricco!
@kyleaoman, there is one outstanding item in your checklist that I believe has been addressed. Once you indicate that you are fully happy, we can move on to the post-review stage.
@JBorrow ah the license one? Yes that's been addressed, box now ticked.
@editorialbot set <DOI here> as archive
@editorialbot set <version here> as version
@editorialbot generate pdf
@editorialbot check references and ask author(s) to update as needed
@editorialbot recommend-accept
@tberlok we're now happy with the package, thank you for your patience!
At this point could you:
- make a tagged release of the software and report the version tag here,
- create an archive of the reviewed version (e.g. on Zenodo), check that the deposit metadata (title, author list, ORCIDs) matches the paper, and
- report the archive DOI here?
I can then move forward with recommending acceptance of the submission.
One final comment on the paper: you have used "i.e." on line 26, where I think you may mean "e.g.".
Wow, that went very fast all of a sudden! Thanks everyone, for reviewing and editing the code and paper!
I have created a Zenodo archive here: 10.5281/zenodo.10994256. There I have uploaded a tagged version of Paicos (tag v0.1.14): https://github.com/tberlok/paicos/releases/tag/v0.1.14
I have checked the Zenodo metadata and added all our ORCIDs.
One last thing from my side today: I have just added "We thank the JOSS referees, Kyle Oman and Terrence Tricco, and the editor, Josh Borrow, for constructive feedback that greatly helped improve Paicos." to the paper.
> One final comment on the paper: you have used "i.e." on line 26, where I think you may mean "e.g.".
@JBorrow Also, thanks for pointing this out; I have made the correction. As I understand it, this means that I have no more tasks left, but do let me know if you need anything else from me or if I need to make other final corrections.
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
@editorialbot check references
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):
OK DOIs
- 10.1051/0004-6361/201322068 is OK
- 10.5281/zenodo.21703 is OK
- 10.48550/arXiv.2305.13380 is OK
- 10.1088/0067-0049/192/1/9 is OK
- 10.1145/2929908.2929916 is OK
- 10.21105/joss.02430 is OK
- 10.1145/2833157.2833162 is OK
- 10.48550/arXiv.2307.06345 is OK
- 10.1146/annurev-astro-082812-140951 is OK
- 10.1111/j.1365-2966.2009.15715.x is OK
- 10.1088/0067-0049/192/1/9 is OK
- 10.1109/MCSE.2011.37 is OK
- 10.1038/s41586-020-2649-2 is OK
- 10.1093/mnras/stt428 is OK
- 10.1038/nature03597 is OK
- 10.3847/1538-4365/ab908c is OK
- 10.1109/MCSE.2010.118 is OK
- 10.1038/s41592-019-0686-2 is OK
- 10.1109/MCSE.2007.55 is OK
- 10.25080/Majora-4af1f417-011 is OK
- 10.1109/MCSE.2007.53 is OK
- 10.1038/s42254-019-0127-2 is OK
- 10.1093/mnras/stac1882 is OK
MISSING DOIs
- No DOI given, and none found for title: CuPy: A NumPy-Compatible Library for NVIDIA GPU Ca...
- No DOI given, and none found for title: Maximizing Parallelism in the Construction of BVHs...
- No DOI given, and none found for title: Hierarchical Data Format, version 10
- No DOI given, and none found for title: Python and HDF5
- No DOI given, and none found for title: pytest 7.4
- No DOI given, and none found for title: Jupyter Notebooks ? a publishing format for reprod...
INVALID DOIs
- None
There are a couple of missing DOIs in the references, though I have looked and it seems that indeed you follow their requested citation strategy.
The only odd thing there @tberlok is the question mark in the Jupyter Notebooks title.
@editorialbot set 10.5281/zenodo.10994256 as archive
That doesn't look like a valid DOI value
Submitting author: @tberlok (Thomas Berlok)
Repository: https://github.com/tberlok/paicos
Branch with paper.md (empty if default branch): paper
Version: v0.1.14
Editor: @JBorrow
Reviewers: @ttricco, @kyleaoman
Archive: 10.5281/zenodo.10994256
Reviewers and authors:
Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)
Reviewer instructions & questions
@ttricco & @kyleaoman, your review will be checklist based. Each of you will have a separate checklist that you should update when carrying out your review. First of all, you need to run this command in a separate comment to create your checklist:
@editorialbot generate my checklist
The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @JBorrow know.
✨ Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest ✨
Checklists
📝 Checklist for @ttricco
📝 Checklist for @kyleaoman