openjournals / joss-reviews

Reviews for the Journal of Open Source Software
Creative Commons Zero v1.0 Universal

[REVIEW]: PyFstat: a Python package for continuous gravitational-wave data analysis #3000

Closed whedon closed 3 years ago

whedon commented 3 years ago

Submitting author: @dbkeitel (David Keitel) Repository: https://github.com/PyFstat/PyFstat/ Version: v1.11.5 Editor: @danielskatz Reviewer: @khanx169, @RobertRosca Archive: 10.5281/zenodo.4660591

:warning: JOSS reduced service mode :warning:

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

Status

[status badge]

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/9a0336ed9451fb0852dfe662d9842355"><img src="https://joss.theoj.org/papers/9a0336ed9451fb0852dfe662d9842355/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/9a0336ed9451fb0852dfe662d9842355/status.svg)](https://joss.theoj.org/papers/9a0336ed9451fb0852dfe662d9842355)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@khanx169 & @RobertRosca, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @danielskatz know.

✨ Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest ✨

Review checklist for @khanx169

Conflict of interest

Code of Conduct

General checks

Functionality

Documentation

Software paper

Review checklist for @RobertRosca

Conflict of interest

Code of Conduct

General checks

Functionality

Documentation

Software paper

whedon commented 3 years ago

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @khanx169, @RobertRosca it looks like you're currently assigned to review this paper :tada:.

:warning: JOSS reduced service mode :warning:

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

:star: Important :star:

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which with GitHub's default behaviour means you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' at https://github.com/openjournals/joss-reviews:

[screenshot: watching settings]

  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications

[screenshot: notification settings]

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf
whedon commented 3 years ago

PDF failed to compile for issue #3000 with the following error:

Can't find any papers to compile :-(

danielskatz commented 3 years ago

@whedon generate pdf from branch joss

whedon commented 3 years ago
Attempting PDF compilation from custom branch joss. Reticulating splines etc...
whedon commented 3 years ago

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

danielskatz commented 3 years ago

@khanx169 and @RobertRosca - Thanks for agreeing to review this submission.

This is the review thread for the paper. All of our communications will happen here from now on.

Both reviewers have checklists at the top of this thread with the JOSS requirements. As you go over the submission, please check any items that you feel have been satisfied. There are also links to the JOSS reviewer guidelines.

Please read the first couple of comments in this issue carefully, so that you can accept the invitation from JOSS and be able to check items, and so that you don't get overwhelmed with notifications from other activities in JOSS.

The JOSS review is different from most other journals. Our goal is to work with the authors to help them meet our criteria instead of merely passing judgment on the submission. As such, the reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention openjournals/joss-reviews#3000 so that a link is created to this thread (and I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread. In my experience, it is better to post comments/questions/suggestions as you come across them instead of waiting until you've reviewed the entire package.

We aim for reviews to be completed within about 2-4 weeks. Please let me know if any of you require more time. We can also use Whedon (our bot) to set automatic reminders if you'll be away for a known period of time.

Please feel free to ping me (@danielskatz) if you have any questions/concerns.

danielskatz commented 3 years ago

@RobertRosca & @khanx169 - it's great to see that you have started your reviews - thanks!

dbkeitel commented 3 years ago

Thanks from my side too! There's apparently a numpy-lalsuite version incompatibility at the moment that breaks our automated tests; I'm looking into fixing it with a temporary version pin.
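For context, a temporary pin of this kind is usually expressed as an upper version bound in the package's dependency list. The sketch below is purely illustrative: the package name and the exact numpy bound are assumptions, not taken from the PyFstat repository.

```python
# Illustrative setup.py fragment only: the package name and the exact
# numpy bound are hypothetical, not the actual pin applied in PyFstat.
from setuptools import setup

setup(
    name="example-package",
    install_requires=[
        "numpy<1.20",  # temporary upper bound until the incompatibility is fixed upstream
        "lalsuite",
    ],
)
```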

whedon commented 3 years ago

:wave: @RobertRosca, please update us on how your review is going.

whedon commented 3 years ago

:wave: @khanx169, please update us on how your review is going.

danielskatz commented 3 years ago

👋 @khanx169, @RobertRosca - It looks like you all have started, which is great. I just want to check in and see how things are going. Are you blocked on anything?

khanx169 commented 3 years ago

> 👋 @khanx169, @RobertRosca - It looks like you all have started, which is great. I just want to check in and see how things are going. Are you blocked on anything?

@danielskatz - thanks for checking in. The last two weeks were incredibly busy which caused a little bit of slowdown. However, I plan to pick up the pace again tomorrow and hopefully finish by this weekend if everything goes smoothly.

RobertRosca commented 3 years ago

@danielskatz same on my side as well - it's been very busy, but I plan on finishing this weekend or by Tuesday next week.

dbkeitel commented 3 years ago

Thanks @khanx169 @RobertRosca ! Let us know if you have any questions. We've had two minor releases in the meantime and the dependency issue mentioned above is resolved.

khanx169 commented 3 years ago

@danielskatz @dbkeitel I've gone through the review checklist and everything looks good to me. I have one minor suggestion: in the GitHub repo, there's a run_all_examples.py script. The docs point to GitHub, but it might be good to also explicitly mention this script there (especially since some examples need to be executed in order).

danielskatz commented 3 years ago

Thanks @khanx169! That sounds like a good suggestion.

dbkeitel commented 3 years ago

@khanx169 Thanks for the positive feedback!

> I've gone through the review checklist and everything looks good to me. I've one minor suggestion: In the Github repo, there's a run_all_examples.py script. The docs point to GitHub, but it might be good to also explicitly mention this script there (especially since some examples need to be executed in order).

So far, this script was mostly intended for our own use - we run it as a manually-triggered action before releases to make sure the examples don't break. (But not automatically, because it takes much longer to run than the regular integration tests.) But if you found it useful, we're of course happy to document it more clearly.

To make sure: where are you suggesting to mention it?

khanx169 commented 3 years ago

> @khanx169 Thanks for the positive feedback!
>
> I've gone through the review checklist and everything looks good to me. I've one minor suggestion: In the Github repo, there's a run_all_examples.py script. The docs point to GitHub, but it might be good to also explicitly mention this script there (especially since some examples need to be executed in order).
>
> This script was so far mostly intended for our own use - we run it as a manually-triggered action before releases to make sure the examples don't break. (But not automatically, because it takes much longer to run than the regular integration tests.) But if you found it useful, we're of course happy do document it more clearly.
>
> To make sure, where are you suggesting to mention it?
>
>   • the README
>   • readthedocs
>   • both

@dbkeitel I was thinking specifically of the Examples section of readthedocs. I understand most users would probably want to look at one specific example at a time, but my thought process when I reached that section was: "It'd be nice if I could run all the examples first, and then come back and look at the scripts and outputs one by one to familiarize myself with the package." I suspect the occasional user might approach it the same way. Currently, the cumulative runtime for all examples is not terrible, but if more examples are added in the future, perhaps interactive notebooks (with the expected output for each cell already displayed) would be more useful.

I liked that the README is concise, but feel free to update both if you think that'd be more appropriate. Thanks.
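As a rough illustration of what such an all-examples runner could look like - a hypothetical sketch assuming the examples are ordinary Python scripts whose sorted filenames reflect the required execution order, not the actual run_all_examples.py from the repository:

```python
# Hypothetical sketch of an all-examples runner: discover the example
# scripts and run each one in turn, in sorted order, so that scripts
# depending on earlier outputs run after them. Paths and the ordering
# convention are assumptions, not PyFstat's actual layout.
import subprocess
import sys
from pathlib import Path

def run_examples(examples_dir="examples"):
    """Run every *.py script under examples_dir in sorted (lexical) order."""
    scripts = sorted(Path(examples_dir).glob("**/*.py"))
    for script in scripts:
        print(f"Running {script} ...")
        # check=True aborts the whole run if any example fails
        subprocess.run([sys.executable, str(script)], check=True)
    return scripts
```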

RobertRosca commented 3 years ago

Hey, I also went through the checklist, everything looks good!

I have some suggestions but none of them are major:

The article mentions that one of the aims of PyFstat is to make this kind of analysis more accessible to new researchers and students, so I think converting the examples from python scripts to notebooks and then fleshing them out with more text and accompanying plots would be nice.

It's very easy to set up and run the examples yourself, but I think that having the output saved and available in the docs makes it a bit more beginner-friendly. Reading through Asad Khan's suggestions, we're thinking along the same lines :smile:

Overall this looks great! To be clear, all the suggestions above are very minor; I'm happy with the current state of the submission and think it can be accepted now.

danielskatz commented 3 years ago

Thanks @khanx169 and @RobertRosca - you both are great!

@RobertRosca - a lot of your checkboxes in the review are unchecked. Can you check all those you are comfortable with? If any need more work before you can check them (as opposed to suggestions for improvements that you would like but don't think are required), please let us know about that work too, though I don't think that is what you are saying.

RobertRosca commented 3 years ago

@danielskatz ah yes, my bad - I forgot to actually check them off as I went through everything. That's fixed now!

danielskatz commented 3 years ago

Thanks - so over to @dbkeitel now to make the optional changes and improvements ...

Once this is done, let us all know so we can check how things look and do a final signoff; then we can do the final acceptance and publishing.

dbkeitel commented 3 years ago

Thanks a lot for the helpful suggestions! It's late here in Spain, so I'll respond in more detail tomorrow.

dbkeitel commented 3 years ago

Thanks again to both reviewers! I've now opened issues over on our repo to track the suggestions.

I think we can resolve @khanx169's suggestion about the all-examples script quickly.

For @RobertRosca's suggestions, the examples handling and codecov are things we had planned anyway (but had not yet opened issues for), and the given pointers will be very helpful for them. The suggestion about installation / requirements management also sounds very sensible. It may, however, take us some time to get around to resolving these three points.

RobertRosca commented 3 years ago

Awesome! If I have free time I may open some PRs for those issues myself, so to avoid duplicating work, could you (or others working on the project) leave a quick comment on an issue if work has already started on it?

dbkeitel commented 3 years ago

Hi @RobertRosca, that would be awesome and much appreciated! Generally, if there is no answer to one of the issues yet, nor a linked PR as in https://github.com/PyFstat/PyFstat/issues/287, then you can safely assume we haven't started on it yet.

danielskatz commented 3 years ago

👋 @dbkeitel - this progress looks good, but also be aware that it's up to you if we move to publication now or wait for these optional improvements. I'm happy to wait for them, but we also could publish as is.

dbkeitel commented 3 years ago

@danielskatz I was wondering if there's a standard way to link to a package's ASCL entry in the left sidebar of the article, or if the easiest is to include it in the acknowledgments?

danielskatz commented 3 years ago

Could you use a reference? I don't think we can use the left column for this. Maybe @arfon will have another suggestion?

arfon commented 3 years ago

> @danielskatz I was wondering if there's a standard way to link to a package's ASCL entry in the left sidebar of the article, or if the easiest is to include it in the acknowledgments?

I'm afraid we can't support custom content in the left-hand column but you're welcome to add it as an explicit callout in the text (i.e., an acknowledgment) or cite directly as @danielskatz suggests.

dbkeitel commented 3 years ago

Thanks both for checking, I'll just add it as suggested.

Also we'll be merging a few small infrastructure fixes/improvements soon (including codecov integration as suggested by @RobertRosca ), then make another release, and then would be ready for publication.

dbkeitel commented 3 years ago

@danielskatz: We've made a new v1.11.4 release of the package including the following behind-the-scenes improvements, among them the reviewer suggestions:

We'll continue improving documentation, examples and test coverage in the future, but for now we would be glad to proceed with publishing the paper. Two small final issues from our side, sorry for being difficult "customers" and thanks for the great experience so far:

  1. I noticed a small inconsistency in citation formats:

    • papers with medium-length, explicitly given author lists become et al. in the compiled pdf, for example one with a bibtex entry like author = "Keitel, David and Prix, Reinhard and Papa, Maria Alessandra and Leaci, Paola and Siddiqi, Maham",
    • but for collaboration papers where INSPIRE only puts e.g. author = "Acernese, F. and others",, these are displayed as & others in the pdf. My (limited) understanding is that the BibTeX+LaTeX combo normally knows that "and others" has to be translated to "et al.", but pandoc apparently doesn't. I tried a few things (just writing et al. in the bibtex author field -> it disappears in the pdf; pasting full author lists (1000+ authors for some) -> works, but compilation becomes extremely slow), but is there a recommended trick?
  2. We'll also need to do a final update of our funding acknowledgments before "freezing" the paper.
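For reference, the two cases from point 1 can be illustrated with minimal BibTeX entries; the keys and titles below are hypothetical placeholders, not entries from the actual paper.bib.

```bibtex
% Case 1: explicit medium-length author list - rendered as
% "Keitel et al." in the compiled pdf, as expected.
@article{example_explicit,
  author = "Keitel, David and Prix, Reinhard and Papa, Maria Alessandra and Leaci, Paola and Siddiqi, Maham",
  title  = "Placeholder title",
}

% Case 2: collaboration entry as exported by INSPIRE - the literal
% "and others" ends up as "& others" in the pandoc-compiled pdf
% instead of being translated to "et al.".
@article{example_collab,
  author = "Acernese, F. and others",
  title  = "Placeholder title",
}
```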

dbkeitel commented 3 years ago

@whedon generate pdf from branch joss

whedon commented 3 years ago
Attempting PDF compilation from custom branch joss. Reticulating splines etc...
whedon commented 3 years ago

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

danielskatz commented 3 years ago

I'll see if I can find an answer to 1, and then we can do 2. If we can't fix 1, I don't think it's a big problem though.

danielskatz commented 3 years ago

When we are ready, the next steps will be

arfon commented 3 years ago

> My (limited) understanding is that the bibtex+LaTeX combo normally knows that "and others" has to be translated to "et al.", but pandoc apparently doesn't. I tried a few things (just writing et al. in the bibtex author field -> disappears in the pdf; pasting full author lists (1000+ for some) -> works but compilation becomes extremely slow) but is there a recommended trick?

@tarleb - any suggestions here?

dbkeitel commented 3 years ago

@whedon generate pdf from branch joss

whedon commented 3 years ago
Attempting PDF compilation from custom branch joss. Reticulating splines etc...
whedon commented 3 years ago

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

dbkeitel commented 3 years ago

@danielskatz v1.11.5 of the package, with DOI 10.5281/zenodo.4659766, is ready as a reference version for the paper.

However, it doesn't have the longer title of the paper, and the order of the author list is different (original author first). Should I make a separate manual deposit off the joss branch with exactly matching metadata, or is this one fine?

dbkeitel commented 3 years ago

@whedon generate pdf from branch joss

whedon commented 3 years ago
Attempting PDF compilation from custom branch joss. Reticulating splines etc...
whedon commented 3 years ago

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

danielskatz commented 3 years ago

@dbkeitel - you should be able to go to the Zenodo page and edit the metadata

dbkeitel commented 3 years ago

> @dbkeitel - you should be able to go to the Zenodo page and edit the metadata

Of course, but then it would look odd that this one version has different metadata than all the others before and after. If JOSS needs one deposit that exactly matches the paper, then I'd prefer to make a "duplicate", if that's ok.

danielskatz commented 3 years ago

> Of course, but then it would look odd that this one version has different data than all others before and after.

As Emerson said, "A foolish consistency is the hobgoblin of little minds" - but it's up to you :)

> If JOSS needs one deposit that exactly matches the paper, then I'd prefer to do a "duplicate", if that's ok.

Sure, that's fine.

dbkeitel commented 3 years ago

@danielskatz Here's the reference deposit: https://doi.org/10.5281/zenodo.4660591