Closed by editorialbot 1 month ago
Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.
For a list of things I can do to help you, just type:
@editorialbot commands
For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:
@editorialbot generate pdf
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):
OK DOIs
- 10.21105/joss.05340 is OK
- 10.48550/arXiv.2402.08725 is OK
- 10.1051/0004-6361/201629272 is OK
- 10.3847/25c2cfeb.d8909f28 is OK
- 10.1051/0004-6361/201322068 is OK
- 10.1088/0004-6256/139/5/1782 is OK
- 10.5281/ZENODO.1482018 is OK
- 10.1086/323894 is OK
- 10.21105/joss.00058 is OK
- 10.1051/aas:1996164 is OK
- 10.3847/1538-3881/aafc33 is OK
- 10.3847/1538-4357/ac7c74 is OK
- 10.1086/305187 is OK
- 10.1093/mnras/sty911 is OK
- 10.5281/ZENODO.5597138 is OK
MISSING DOIs
- No DOI given, and none found for title: legacypipe: Image reduction pipeline for the DESI ...
- No DOI given, and none found for title: COSMOULINE
- No DOI given, and none found for title: HDF: The hierarchical data format
- No DOI given, and none found for title: PyEphem: Astronomical Ephemeris for Python
INVALID DOIs
- None
Software report:
```
github.com/AlDanial/cloc v 1.90  T=0.01 s (673.8 files/s, 71756.1 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
TeX                              1             23              0            253
CSV                              2              0              0            158
Markdown                         1             20              0             94
Python                           1             16              0             49
YAML                             1              1              4             21
-------------------------------------------------------------------------------
SUM:                             6             60              4            575
-------------------------------------------------------------------------------
```
Commit count by author:
119 Frédéric Dux
Paper file info:
📄 Wordcount for paper.md is 1101
✅ The paper includes a Statement of need section
License info:
🔴 Failed to discover a valid open source license
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
@Onoddil, @robertfwilson, thank you for agreeing to review this submission! We would like to get reviews in ~4 weeks max, so the goal for the initial reviews is ~June 21. Please let me know if you encounter any issues, here or via email.
@editorialbot check repository
Software report:
```
github.com/AlDanial/cloc v 1.90  T=0.01 s (671.5 files/s, 77112.7 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
TeX                              1             23              0            253
CSV                              2              0              0            207
Markdown                         1             20              0             94
Python                           1             16              1             49
YAML                             1              1              4             21
-------------------------------------------------------------------------------
SUM:                             6             60              5            624
-------------------------------------------------------------------------------
```
Commit count by author:
121 Frédéric Dux
Paper file info:
📄 Wordcount for paper.md is 1101
✅ The paper includes a Statement of need section
License info:
🔴 Failed to discover a valid open source license
@duxfrederic the check above fails because the paper branch does not contain the license. Could you please make sure that the paper branch also has all the contents of the main branch?
@editorialbot check repository
@editorialbot generate pdf
Software report:
```
github.com/AlDanial/cloc v 1.90  T=0.01 s (673.4 files/s, 77329.4 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
TeX                              1             23              0            253
CSV                              2              0              0            207
Markdown                         1             20              0             94
Python                           1             16              1             49
YAML                             1              1              4             21
-------------------------------------------------------------------------------
SUM:                             6             60              5            624
-------------------------------------------------------------------------------
```
Commit count by author:
122 Frédéric Dux
Paper file info:
📄 Wordcount for paper.md is 1101
✅ The paper includes a Statement of need section
License info:
🟡 License found: GNU General Public License v3.0 (check here for OSI approval)
Hi @ivastar, thanks for checking! I ended up completely synchronizing the paper branch with what's in the main one at the moment.
@editorialbot check repository
Software report:
```
github.com/AlDanial/cloc v 1.90  T=0.08 s (871.0 files/s, 117024.8 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                          46            831           1436           3138
Markdown                         6            148              0            597
SVG                              2              1             23            415
YAML                             7             88            177            275
TeX                              2             23              1            268
Jupyter Notebook                 1              0           1082            236
CSV                              2              0              0            207
TOML                             1              4              0             52
-------------------------------------------------------------------------------
SUM:                            67           1095           2719           5188
-------------------------------------------------------------------------------
```
Commit count by author:
185 Frédéric Dux
Paper file info:
📄 Wordcount for paper.md is 1101
✅ The paper includes a Statement of need section
License info:
🟡 License found: GNU General Public License v3.0 (check here for OSI approval)
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
Hey @Onoddil, thanks for starting your work on this!
Regarding the Reproducibility check item, this is the directory structure produced by the pipeline to make the example result shown in the paper. It includes:

- regions.h5
- prepared_cutouts/cutouts_195_PSJ0030-1525.h5

together with a jupyter notebook producing the light curves and deconvolved image from these cutouts. It does not include the calibrated images themselves, which are quite voluminous (39 GB). If you feel you need them to check the item, I'll find a way to send them over! From this, however, all the steps can be tested except those involving the calibrated images themselves (importation, plate solving, cutout making).
Figuring it makes more sense to leave this here, it being JOSS paper-related rather than codebase-related (and hence not split out into an issue over on the main repo): there are just a few missing references that should ideally be included, if you have the room. These can presumably all go in a single dense sentence/paragraph of "this software makes use of X \cite{}, Y \cite{}, Z \cite{}, ..." in the appropriate place.
Thanks for the dataset @duxfrederic!
That all checked out; it reproduces everything I was able to test (without needing to download 40 GB!), so apart from the references above I think that's it for me :)
Hey @Onoddil, thanks a lot for the expeditious review and useful suggestions!! I would never have caught the numpy 2.0 issue.
I've added a paragraph at the end with the additional references! Thanks for providing the bib entries in your comment above...
@editorialbot generate pdf
Glad it worked out without the 40 GB!! For anyone wanting to reproduce the example from scratch in the future: the data is incrementally entering the public domain and can be downloaded from the ESO archive (object name in the FITS header: PSJ0030-1525; select WFI/LaSilla under imaging).
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
You've accidentally cited the 2022 astropy paper twice instead of the 2013 one, but apart from that this all LGTM :)
Oh, right, I didn't see that! I think the compiler doesn't like the `:` character in bibtex keys. I changed it to `_` and the citations are now correct. Thanks again!
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
Hi @ivastar
apologies for bothering you about this, but am I allowed to gently ping our second reviewer about the review? :) I was hoping the codebase could be published relatively soon so that I could cite it in upcoming papers, which are on hold at the moment.
@duxfrederic I've just done that. I think they were not seeing the GitHub notifications, but should be done soon.
Hi @ivastar
thanks a bunch for contacting our second reviewer last month, have you had news from them? :)
@duxfrederic apologies for the delay! 😿 I am looking for a new reviewer. I'm open to suggestions.
@ivastar No worries at all, I know JOSS is run by volunteers with other things going on!
Regarding suggestions, going through the list of reviewers:
thanks!
@editorialbot remove @robertfwilson from reviewers
@robertfwilson removed from the reviewers list!
@editorialbot add @benjaminrose as reviewer
@benjaminrose added to the reviewers list!
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
I am getting an error from multiprocessing. I have tried Python 3.12 and 3.10, installing with pip and installing from source. I hit the same error on `initialize_database()` and `WorkflowManager().run()`. The full traceback is below. I am not sure what is happening or where exactly the problem is, but currently I am unable to test the functionality of lightcurver.
Hi @benjaminrose
Many thanks for starting this review so fast, and for trying different environments and python versions.
I just got access to a mac computer and I can reproduce your error! The automated tests should run smoothly on your end, but the error is due to the way mac and windows computers spawn processes: on these platforms, you need to wrap any code that runs multiprocessing in an `if __name__ == "__main__":` block.

I'll update the documentation to reflect this; in the meantime you have 3 options:

1. Wrap each step in an `if __name__ == "__main__":` block:

```python
import os
# replace with your actual path:
os.environ['LIGHTCURVER_CONFIG'] = "/scratch/lightcurver_tutorial/config.yaml"

from lightcurver.structure.user_config import get_user_config
from lightcurver.structure.database import initialize_database
from lightcurver.pipeline.task_wrappers import read_convert_skysub_character_catalog

if __name__ == "__main__":
    initialize_database()
    read_convert_skysub_character_catalog()
```

2. Or, to run everything at once:

```python
import os
os.environ['LIGHTCURVER_CONFIG'] = "/path/to/config.yaml"

from lightcurver.pipeline.workflow_manager import WorkflowManager

if __name__ == "__main__":
    wf_manager = WorkflowManager()
    wf_manager.run()
```

3. Or use the command-line entry point to run the entire pipeline:

```
lc_run /path/to/your/config.yaml
```

Sorry about this, I should have tested on mac computers more carefully! Let me know if this fixes things on your side ...
Hi @benjaminrose
Did the above work? If it is more convenient for you to test the steps individually, as in the tutorial, I have put them together in the scripts attached here: lightcurver_scripts_for_review.zip

First:

```
export REVIEW_WORKDIR='/tmp/workdir'
export LIGHTCURVER_CONFIG='/tmp/workdir/config.yaml'  # the config has to be in the workdir in this specific case
```

Then executing the scripts above one after the other worked on the mac I have at hand (M3 Pro, Sonoma 14.5). But the simpler option is to just run `lc_run config.yaml`.
Hi @benjaminrose
any news about the above? Hopefully it shouldn't take you more than 20 minutes to run the code and finish the review :pray: Apologies for messaging again, by the way, but things are getting tight here, schedule-wise ...
thanks!
edit: note that VizieR is down at the moment, or has changed its database schema. I have changed the default Gaia provider to the Gaia archive in the config file, just in case your tests stop running today because of an HTTP 500 error.
Submitting author: @duxfrederic (Frédéric Dux)
Repository: https://github.com/duxfrederic/lightcurver
Branch with paper.md (empty if default branch): paper
Version: 1.1.0
Editor: @ivastar
Reviewers: @Onoddil, @benjaminrose
Archive: 10.5281/zenodo.13883045
Status
Reviewers and authors:
Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)
Reviewer instructions & questions
@Onoddil & @robertfwilson, your review will be checklist-based. Each of you will have a separate checklist that you should update as you carry out your review. First of all, you need to run this command in a separate comment to create the checklist:
The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @ivastar know.
✨ Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest ✨
Checklists
📝 Checklist for @Onoddil
📝 Checklist for @benjaminrose