Closed: whedon closed this issue 4 years ago
Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @nicholebarry, @aureliocarnero it looks like you're currently assigned to review this paper :tada:.
:warning: JOSS reduced service mode :warning:
Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.
:star: Important :star:
If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that with GitHub's default behaviour you will receive notifications (emails) for all reviews 😿
To fix this do the following two things:
For a list of things I can do to help you, just type:
@whedon commands
For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:
@whedon generate pdf
Reference check summary:
OK DOIs
- 10.1016/j.physrep.2006.08.002 is OK
- 10.1088/0034-4885/75/8/086901 is OK
- 10.1111/j.1365-2966.2012.21293.x is OK
- 10.1093/mnras/stx2539 is OK
- 10.1093/mnras/stt1341 is OK
- 10.1093/mnras/stv2679 is OK
- 10.1093/mnras/stx649 is OK
- 10.1093/mnras/stz1220 is OK
- 10.1093/mnras/stz2224 is OK
- 10.1088/1475-7516/2019/02/058 is OK
- 10.1093/mnras/sty1786 is OK
- 10.1093/mnras/stv571 is OK
- 10.1007/s10686-013-9334-5 is OK
- 10.1051/0004-6361/201220873 is OK
- 10.1109/JPROC.2009.2017564 is OK
MISSING DOIs
- None
INVALID DOIs
- None
👋🏼 @sambit-giri @nicholebarry @aureliocarnero this is the review thread for the paper. All of our communications will happen here from now on.
Both reviewers have checklists at the top of this thread with the JOSS requirements. As you go over the submission, please check any items that you feel have been satisfied. There are also links to the JOSS reviewer guidelines.
The JOSS review is different from most other journals. Our goal is to work with the authors to help them meet our criteria instead of merely passing judgment on the submission. As such, the reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention openjournals/joss-reviews#2363
so that a link is created to this thread (and I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread. In my experience, it is better to post comments/questions/suggestions as you come across them instead of waiting until you've reviewed the entire package.
We are currently operating in reduced service mode and aim for reviews to be completed within about 6 weeks. Please let me know if any of you require some more time. We can also use Whedon (our bot) to set automatic reminders if you know you'll be away for a known period of time.
Please feel free to ping me (@pibion) if you have any questions/concerns.
@whedon add @ziotom78 as reviewer
OK, @ziotom78 is now a reviewer
@aureliocarnero apologies, I needed to edit the checklist to add an additional reviewer and managed to delete the checks you've put so far.
Could you take a look and re-check?
I won't need to edit this issue again so they will stay this time!
@pibion sure no problem
@aureliocarnero thank you, and apologies again!
Dear @sambit-giri, while reviewing this paper, one thing we have to check is whether the author list seems correct. Comparing the people who have contributed to the git repository with the authors of the paper, the list seems reasonable, but when you go to https://tools21cm.readthedocs.io/authors.html you can see Hannes Jensen listed as an author as well. Should he not also be an author of the paper? Or is he perhaps no longer an author? In any case, if he contributed in the past, you should consider adding his name as an author of the manuscript. Cheers
Dear @aureliocarnero, Hannes Jensen had a package named c2raytools (https://github.com/hjens/c2raytools). It was not maintained after he left academia. We modified that package and merged it into tools21cm. We contacted Hannes about being a co-author, but he does not have a proper academic affiliation/address to put on the paper. We were not sure how JOSS handles this. If it is a problem, Hannes is also fine with being acknowledged in the paper. What do you suggest?
@pibion Can you add the second author (https://github.com/garrelt) to this review process?
@sambit-giri I believe @garrelt can join this review process by subscribing to this thread.
@sambit-giri I'll look into this and get back to you shortly!
@sambit-giri, @aureliocarnero on the authorship issue: JOSS does need an affiliation for every author, but this is only required to make the latex template happy. So for someone without an academic affiliation, they can put whatever seems most appropriate. For example,
authors:
  - name: Jane Doe
    affiliation: 1
affiliations:
  - name: None
    index: 1
Would show "None" as the affiliation.
The author can put whatever seems most appropriate here, though. In addition to the "None" option, another suggestion is "Self."
Dear @sambit-giri, please see the message above. I would recommend including Hannes as an author of the paper if he agrees. I also have some comments on the references in the paper draft. Since you cite several of your own papers from 2019, could you please label them 2019a, 2019b, etc., as you do with your 2018 papers? Also, there is one reference, Ross et al., 2017, that at one point you cite as Ross, Dixon, Iliev, & Mellema, 2017.
Also, I'm not sure when you should use "et al." versus the full list of names as a function of the number of authors. I believe the threshold is three: if there are only three authors you can list all three names, but if there are more than three you should use "et al." Please homogenize this throughout the text. @pibion, could you please confirm? When should we use "et al." and when the full list of authors?
@sambit-giri also, one of the checks I need to do related to the paper is: State of the field: Do the authors describe how this software compares to other commonly-used packages?
You currently say nothing about the state of the field. Is there any other software that is comparable? In that context, I think it is important that you state in the draft what you commented above: "We modified the c2raytools (https://github.com/hjens/c2raytools) package and merged it into tools21cm.", referencing that code appropriately. Cheers, Aurelio
@aureliocarnero I'm taking the day off today but I'll get back to you on the reference style Monday!
@sambit-giri, does the following provide sufficient information for you to add your additional collaborator as an author?
@sambit-giri, @aureliocarnero on the authorship issue: JOSS does need an affiliation for every author, but this is only required to make the latex template happy. So for someone without an academic affiliation, they can put whatever seems most appropriate. For example,
authors:
  - name: Jane Doe
    affiliation: 1
affiliations:
  - name: None
    index: 1
Would show "None" as the affiliation.
The author can put whatever seems most appropriate here, though. In addition to the "None" option, another suggestion is "Self."
@nicholebarry, @ziotom78 this is a reminder that we're asking our reviewers to aim to complete their checklist in about six weeks. If you have an idea of what your time frame is for getting started, you can let me know and I can set a reminder with whedon.
@whedon generate pdf
PDF failed to compile for issue #2363 with the following error:
/app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon/author.rb:72:in `block in build_affiliation_string': Problem with affiliations for Hannes Jensen, perhaps the affiliations index need quoting? (RuntimeError)
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon/author.rb:71:in `each'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon/author.rb:71:in `build_affiliation_string'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon/author.rb:17:in `initialize'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon.rb:201:in `new'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon.rb:201:in `block in parse_authors'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon.rb:198:in `each'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon.rb:198:in `parse_authors'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon.rb:91:in `initialize'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon/processor.rb:36:in `new'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon/processor.rb:36:in `set_paper'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/bin/whedon:55:in `prepare'
  from /app/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/command.rb:27:in `run'
  from /app/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/invocation.rb:126:in `invoke_command'
  from /app/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor.rb:387:in `dispatch'
  from /app/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/base.rb:466:in `start'
  from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/bin/whedon:119:in `<top (required)>'
  from /app/vendor/bundle/ruby/2.4.0/bin/whedon:23:in `load'
  from /app/vendor/bundle/ruby/2.4.0/bin/whedon:23:in `<main>'
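The error message itself hints at the likely fix: quoting the affiliation index of the new author entry in the YAML header of paper.md. A minimal, purely illustrative sketch (the index value and affiliation text are placeholders, not the paper's actual metadata):

authors:
  - name: Hannes Jensen
    affiliation: "2"   # quoted index, as the error message suggests
affiliations:
  - name: Self
    index: 2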
@whedon generate pdf
Dear @sambit-giri, I'm starting to test the code. One important aspect is that you need to provide automated tests with your code, but I cannot find them or instructions on how to run them. According to the checklist, I need to verify that: "Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?" I cannot verify this at the moment.
Dear @sambit-giri, I have a suggestion about the installation instructions. On the GitHub page (https://github.com/sambit-giri/tools21cm), in the installation section, I think you should first mention that the user needs to clone the repository. I know this is obvious, but it is helpful. I therefore suggest that you split the installation instructions into two options: first, cloning the repository and running python setup.py install, and second, installing directly with pip. They are two separate ways of doing it, and at the moment this is not very clear.
@sambit-giri, another question about installation. It seems to install properly with the instructions you give, but I have a question. When I install it with python setup.py install, a warning message appears: UserWarning: Unknown distribution option: 'install_requires'. I'm using Python 3; do you know if this could cause problems for other users? Please check the meaning of this warning. Installing it with pip works perfectly, though.
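For what it's worth, that warning usually means the setup script is being run through distutils' setup() rather than setuptools', since install_requires is a setuptools-only option (pip imports setuptools before running setup.py, which is why the pip route stays clean). A minimal sketch of the setuptools form, with a placeholder package name and dependency list rather than the actual tools21cm setup.py:

# Illustrative setup.py using setuptools; the name and dependencies are placeholders.
from setuptools import setup, find_packages

setup(
    name="example_package",
    version="0.1",
    packages=find_packages(),
    install_requires=["numpy", "scipy"],  # recognised by setuptools, ignored with a warning by plain distutils
)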
Dear @aureliocarnero
Although individual researchers have undoubtedly developed their own tools to perform some of the calculations included in tools21cm, we are not aware of any open source package similar to tools21cm. We searched both on Google and on GitHub, but did not find anything.
c2raytools was a set of python routines developed by Hannes Jensen and me to analyse reionization simulations performed with the C2-Ray radiative transfer code. This set of routines is no longer maintained and never had a user base beyond our nearest collaborators. It formed the basis of the tools21cm package, which is however more general, hence the name change.
I hope this answers your query.
Best wishes, @garrelt
Dear @garrelt thank you for the clarification. No need to put anything in the text, then. Cheers Aurelio
Dear @aureliocarnero In order to test the code, some simulation output is required. We have now provided some data at https://ttt.astro.su.se/~gmell/244Mpc/TestData_tools21cm/ The link contains data along with a README explaining it.
You can test the code on this data by running the tutorial given at https://tools21cm.readthedocs.io/examples/tutorials.html
@aureliocarnero We have updated the README to mention cloning the repository.
@sambit-giri,
The instrument papers for the MWA are Tingay et al. 2013 and Wayth et al. 2018 (Phase I and Phase II respectively). I suggest you cite those instead of Lonsdale et al. 2009.
@whedon generate pdf
@whedon generate pdf
@nicholebarry We have updated it.
Dear @sambit-giri, I cannot find any reference in the GitHub repository (or in the documentation) to:
Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support
So far I cannot evaluate this. I recommend you put these guidelines in the GitHub repo README and possibly on Read the Docs. To give you an example, you can see this other software I reviewed in the past, in the section "Contributing": https://github.com/noraeisner/LATTE#contributing
@aureliocarnero Thank you for the recommendation. We have added a brief "Contributing" section to README.md and directed users to the detailed description in the documentation: https://tools21cm.readthedocs.io/contributing.html
Hi @pibion, thanks for the message! Sorry for the slow reply, I was away on vacation and just came back. I have already started the review and expect to finish it in a few days. It's at the top of my to-do list, so I do not need a reminder. Thank you!
Hi @sambit-giri, I have downloaded the data and checked that the plots produced by the tutorial match mine. They do indeed look the same, but I interpret JOSS's requirement for "automated tests" in a stricter sense: it's the computer that should take care of checking the consistency of the results with their expected outcome. This ensures that patches from potential contributors do not introduce systematic effects or bugs in your codebase.
I realize that it's not feasible to include all the test data in the GitHub repository, as they are really too big; for me your solution of providing them at a separate URL is perfect. Perhaps you could produce smaller data files by sampling a smaller volume? It's not required that the results of a test are cosmologically interesting; they just need to check that modifications to the code do not introduce systematic effects. You could keep the tutorial as it is, relying on the bigger data files (the tutorial must be cosmologically interesting, after all!), and include the smaller files in the GitHub repository only for the automated tests.
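As a concrete illustration of what such an automated check could look like, here is a sketch of a regression test against a small bundled data file; the file name, loading step and reference value are placeholders, not part of the actual tools21cm test suite:

# Hypothetical regression test; data file path and expected value are placeholders.
import numpy as np
import tools21cm as t2c

def test_skewness_on_small_cube():
    # Small ionisation-fraction cube shipped with the repository (placeholder path).
    xHII = np.load("tests/data/small_xHII_cube.npy")
    value = t2c.statistics.skewness(xHII)
    # Reference value computed once with a trusted version of the code (placeholder).
    assert np.isclose(value, 0.1234, rtol=1e-6)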
Also, you could add unit tests for several functions, like t2c.statistics.skewness, without the need to load data from files. Example:
# Put this in file test/test_statistics.py
from tools21cm.statistics import skewness
import numpy as np

# Test the functionality of "skewness"
assert np.allclose([
    skewness([1.0, 2.0, 0.0]),
    skewness([1.0, 2.0, 4.0]),
], [
    0.0,
    0.20782656212951636,
])
This kind of test is very useful for documentation as well.
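One way to make such examples serve double duty as documentation is to embed them as doctests in the docstrings; a sketch under that assumption, using an illustrative stand-in rather than the actual tools21cm implementation:

# Illustrative doctest; runnable with "python -m doctest <file>" or pytest's
# --doctest-modules option. Not the actual tools21cm skewness implementation.
import numpy as np

def skewness_example(x):
    """Return the skewness of the flattened array x.

    >>> round(skewness_example([1.0, 2.0, 0.0]), 6)
    0.0
    """
    x = np.asarray(x, dtype=float).ravel()
    mu, sigma = x.mean(), x.std(ddof=1)
    return float(np.mean((x - mu) ** 3) / sigma ** 3)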
Dear @sambit-giri, I had an unexpected error running the tutorial at https://tools21cm.readthedocs.io/examples/tutorials.html#Bubble-size-distribution
When I execute cell [22]: r_spa, dn_spa = t2c.spa(xHII, boxsize=boxsize, nscales=20) it gives me the following error:
Traceback (most recent call last):
  File "", line 1, in
  File "/home/acarnero/codes/anaconda3/lib/python3.7/site-packages/tools21cm/bubble_stats.py", line 181, in spa
    rs, ni = spa_np.spa_np(data, xth=xth, binning=binning, nscales=nscales)
  File "/home/acarnero/codes/anaconda3/lib/python3.7/site-packages/tools21cm/spa_np.py", line 15, in spa_np
    for i in xrange(nscales):
NameError: name 'xrange' is not defined
Basically, xrange is not defined. I'm running Python 3. Cheers, Aurelio
Dear @aureliocarnero, I fixed this issue last week. You might have installed the package before that. Please re-install it and let me know if it still doesn't work. Best, Sambit
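For anyone hitting the same NameError with an older install: the culprit is Python 2's xrange built-in, which no longer exists in Python 3. A minimal sketch of the kind of change involved (illustrative only; the actual fix in tools21cm/spa_np.py may differ):

# Python 2 spelling that raises NameError on Python 3:
#     for i in xrange(nscales): ...
# Python 3 spelling; range() is already lazy:
nscales = 20  # matches the tutorial call t2c.spa(xHII, boxsize=boxsize, nscales=20)
for i in range(nscales):
    pass  # per-scale computation would go here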
Thanks @ziotom78 I will get back to you on this soon.
Dear @sambit-giri, you are right; now it works perfectly. So, for me, the only thing missing is the unit-test issue commented on above. Once you have those tests available, please let me know, so I can run them and then validate all the points of the review. Cheers, Aurelio
@sambit-giri I agree with @aureliocarnero, providing a separate URL for a data download is an excellent solution for larger data sets.
I would recommend putting this data on a service like Zenodo or Figshare rather than hosting it from your own website.
Dear @ziotom78 and @aureliocarnero
I have put scripts for unit testing at https://github.com/sambit-giri/tools21cm/tree/master/tests. I have not included tests for the functions that need data, such as the read/write functions and the redshift-space distortion implementation.
Please let me know if anything else is needed.
Best, Sambit
Dear @sambit-giri, this is great, but how can I execute the unit tests? You should explain this in the documentation... Cheers
@aureliocarnero I have put a README inside the same folder.
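For other readers following along: assuming the scripts are plain pytest-style test files, they can typically be run from the repository root with "pytest tests/" (after "pip install pytest"), or an individual script can be executed directly with "python tests/<script name>.py"; the exact commands and script names are whatever the README in that folder specifies.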
Dear @pibion
We have put the data up on Zenodo (https://doi.org/10.5281/zenodo.3953639). The link to the data has also been updated in the tutorial (https://tools21cm.readthedocs.io/examples/tutorials.html).
Best, Sambit
Submitting author: @sambit-giri (Sambit Kumar Giri)
Repository: https://github.com/sambit-giri/tools21cm
Version: v2.1
Editor: @pibion
Reviewers: @nicholebarry, @aureliocarnero, @ziotom78
Archive: 10.5281/zenodo.3973542
:warning: JOSS reduced service mode :warning:
Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.
Reviewers and authors:
Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)
Reviewer instructions & questions
@nicholebarry & @aureliocarnero & @ziotom78 , please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:
The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @pibion know.
✨ Please try and complete your review in the next six weeks ✨
Review checklist for @nicholebarry
Conflict of interest
Code of Conduct
General checks
Functionality
Documentation
Software paper
Review checklist for @ziotom78
Conflict of interest
Code of Conduct
General checks
Functionality
Documentation
Software paper
Review checklist for @aureliocarnero
Conflict of interest
Code of Conduct
General checks
Functionality
Documentation
Software paper