openjournals / joss-reviews

Reviews for the Journal of Open Source Software

[REVIEW]: pyprop8: A lightweight code to simulate seismic observables in a layered half-space #4217

Closed by editorialbot 2 years ago

editorialbot commented 2 years ago

Submitting author: @valentineap (Andrew Valentine)
Repository: https://github.com/valentineap/pyprop8
Branch with paper.md (empty if default branch):
Version: v1.1.2
Editor: @leouieda
Reviewers: @hemmelig, @hfmark
Archive: 10.5281/zenodo.7019949

Status

[status badge]

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/695768d664162b443c5dec5f7accb114"><img src="https://joss.theoj.org/papers/695768d664162b443c5dec5f7accb114/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/695768d664162b443c5dec5f7accb114/status.svg)](https://joss.theoj.org/papers/695768d664162b443c5dec5f7accb114)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@hemmelig & @hfmark, your review will be checklist based. Each of you will have a separate checklist that you should update when carrying out your review. First of all you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @leouieda know.

✨ Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest ✨

Checklists

πŸ“ Checklist for @hfmark

πŸ“ Checklist for @hemmelig

editorialbot commented 2 years ago

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

:warning: JOSS reduced service mode :warning:

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf
editorialbot commented 2 years ago
Software report:

github.com/AlDanial/cloc v 1.88  T=0.10 s (237.1 files/s, 43027.8 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                          11            288            650           2461
reStructuredText                 5            144             29            302
TeX                              1             13              0            108
Markdown                         2             23              0             59
DOS Batch                        1              8              1             26
YAML                             1              6              6             24
make                             1              4              7              9
TOML                             1              0              0              6
-------------------------------------------------------------------------------
SUM:                            23            486            693           2995
-------------------------------------------------------------------------------

gitinspector failed to run statistical information for the repository
editorialbot commented 2 years ago
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1111/j.1365-246X.2011.05210.x is OK
- 10.1111/j.1365-246X.2012.05608.x is OK
- 10.1029/2012GL054209 is OK
- 10.1093/gji/ggt473 is OK
- 10.3997/2214-4609.20142350 is OK
- 10.1785/0120150010 is OK
- 10.1093/gji/ggw108 is OK
- 10.1038/s41586-020-2649-2 is OK
- 10.1038/s41592-019-0686-2 is OK

MISSING DOIs

- None

INVALID DOIs

- None
editorialbot commented 2 years ago

Wordcount for paper.md is 928

editorialbot commented 2 years ago

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

leouieda commented 2 years ago

:wave: @valentineap @hemmelig @hfmark welcome to the review thread for the paper! All of our communications will happen here from now on.

Before we start: Both reviewers need to post the comment @editorialbot generate my checklist in this thread to generate their review checklist.

The JOSS review is different from most other journals:

There are links to the JOSS reviewer guidelines at the top of this issue. Please feel free to ping me here (@leouieda) or email me privately if you have any questions/concerns.

valentineap commented 2 years ago

@hemmelig @hfmark Thank you both for agreeing to do this - don't hesitate to get in touch if you have any questions!

hfmark commented 2 years ago

Review checklist for @hfmark

Conflict of interest

Code of Conduct

General checks

Functionality

Documentation

Software paper

hemmelig commented 2 years ago

Review checklist for @hemmelig

Conflict of interest

Code of Conduct

General checks

Functionality

Documentation

Software paper

leouieda commented 2 years ago

πŸ‘‹πŸ½ Hi @valentineap @hemmelig @hfmark just wanted to check in and see how the review is going. It's been a bit quiet here but I see both reviewers are progressing on the checklists. Please don't hesitate to ask if you have any questions πŸ™‚

hemmelig commented 2 years ago

Hi @valentineap! At the top, I’d just like to say that this looks like a really great tool, and I have already been thinking about ways of using the package to build simple synthetic models for my own work. In particular, I am excited to try using this as a tool to help provide some objective measures of focal sphere coverage/site value for microseismic detection! It was also nice to be able to dive into some papers that wouldn’t necessarily come up in my day-to-day research. I’ve tried to balance my review between the code as code and the package and its applications as a whole. Within each section, I’ve aimed to state the requests/questions concretely at the end.

Paper

The software paper is lucid and meets all the requirements set out in the high-level JOSS review checklist. My only comment is that, while I can envisage how one might use the package in an educational setting, I feel it might see far greater uptake if a more fleshed-out example were provided, perhaps in the form of a Jupyter notebook. These can be freely hosted online (for example, using MyBinder) and would bypass the need for any given user to install the package before they try it out. Equally, this may just reflect my relative naivety about the sorts of things one might use this for.

Is there any further information available regarding the application of prop8 to microseismic monitoring? The reference provided is a (rather short) abstract only, but it certainly sounds like an interesting application (particularly to me!).

Documentation

The documentation is easily accessible, contains clear instructions for installing pyprop8, and lots of helpful details about the various objects and their function within the package. Overall, having delved into the underlying code, I feel the documentation gives an accurate and encompassing view of the package. The important user-facing functions and objects are properly documented (i.e. all possible arguments and attributes are named and defined). The worked examples and annotated example are meticulous and thorough.

Per the checklist from JOSS, there aren't any community guidelines. While I can't envisage how the package might be extended, or even where any specific issues may arise (other than perhaps possible functional changes to the dependencies that impact pyprop8), it could be useful to have some statement addressing this.

Software examples

I ran the provided script for reproducing the figures presented in O’Toole & Woodhouse (2011) and O’Toole et al. (2012). Critically, these only run if the user installs matplotlib, which is not addressed anywhere in the documentation. Where possible, I tried to ensure I was comparing the figures at a similar scale/aspect ratio. By eye, the figures were largely identical. I note, however, some differences:

Also, it would be nice (from my perspective) to understand a bit better how to tune the frequency content of the synthetic seismograms, and how to produce synthetic traces containing P and S phase arrivals. While reviewing, I mainly limited myself to exploring the source code and the provided examples, but I will try to make some time to experiment a bit more off the book with what pyprop8 can do!
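
On the matplotlib point above, one option would be for the example scripts to guard the import so that a missing matplotlib fails with an informative message rather than a bare ImportError. A minimal sketch, assuming matplotlib remains an optional rather than core dependency (the wording of the message is my own suggestion, not anything currently in pyprop8):

```python
# Minimal sketch: make the optional plotting dependency explicit in the
# example scripts, so a missing matplotlib fails with a clear message.
try:
    import matplotlib.pyplot as plt
except ImportError as err:
    raise SystemExit(
        "This example requires matplotlib, which is an optional dependency "
        "of pyprop8. Install it with `pip install matplotlib` and re-run."
    ) from err

# ... the figure-reproduction code would follow here, using plt as usual.
```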

Software tests

These should be a lot easier to run; it would also be good to encourage users to run them, and to provide an explanation of how to do so. They also do not appear to work: specifically, the derivatives calculated for perturbations to phi are quite far off. I note as well that the delta scaling is different for phi than for r or z.

Also, is there a reason why the setup for the test event is identical to the O’Toole & Woodhouse (2011) paper example, but at 20 km rather than the 34 km used in the paper?

While I don't envisage a huge amount of extension/expansion to the package that would necessitate comprehensive unit testing, it might be helpful to benchmark the outputs of pyprop8 against the less accessible prop8 numerically, in addition to the figure comparisons.

Software

Briefly: it is clear this is very much a transliteration of the FORTRAN 77 methods, which makes it rather dense reading at times. This is compounded by the fact that the code does not comply with the advised PEP 8 style guide. Architecturally, I think it would certainly help to break things out into modules that each do a specific job within the package; for example, I would break the layered structure model out of _core into a stand-alone module. Having everything within a single _core.py module did make it (personally!) a bit of a task to construct a mental map of the code.

While this doesn’t impact the functionality of the software, I would at the very least recommend running the source code through an automatic formatter such as black. This was the first thing I did when digging into the code myself, and it significantly improved its readability. As for refactoring to make use of more pythonic solutions: this wouldn’t really change the code semantically, so I don’t think it is important enough to be worth the time investment.

Another critical issue I would like to see (and help) addressed is properly documenting all functions/methods within the package, not just the high-level interface functions. Not only would it be useful to fully describe all function arguments, but there were also a number of places where I found it difficult to map what was being calculated onto the equations set out in the original papers; in particular, the source vector functions, and especially the m2/m6 components of the propagate functions. By and large, I was able to map most things to the equation definitions and confirm everything was as expected, but there were one or two places I wanted to understand better.
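
As a concrete illustration of the level of internal documentation I have in mind, here is a sketch in NumPy docstring style. The function name, argument list, shapes and equation reference are all hypothetical, chosen for illustration only, and are not taken from pyprop8's actual internals:

```python
def source_vector(moment_tensor, force, k):
    """Assemble the source discontinuity vector at horizontal wavenumber k.

    Hypothetical example of the requested docstring style; the name,
    arguments, shapes and equation reference are illustrative only.

    Parameters
    ----------
    moment_tensor : (3, 3) numpy.ndarray
        Moment tensor in Cartesian coordinates (N m).
    force : (3,) numpy.ndarray
        Point-force components (N).
    k : float
        Horizontal wavenumber (1/km).

    Returns
    -------
    numpy.ndarray
        Entries of the source vector, ordered as in the corresponding
        equation of O'Toole & Woodhouse (2011).
    """
    ...  # implementation omitted in this sketch
```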

It might also be nice to provide the user with some base plotting utilities, but I believe what each function returns is sufficiently well documented that it is not unreasonable to expect users to handle this themselves.

I have some further notes, but I’m not 100% sure how exactly to proceed. I suspect it would be easiest if I opened a pull request and referenced it to this review – would that be correct, @leouieda? These primarily expand on some of the notes already made above.

Summary

Hopefully I've got the right idea with how I've approached this review. If you have any questions, I'm more than happy to discuss here. I'm also happy to open GitHub issues to drill down into some of the specifics of what I have outlined above.

hfmark commented 2 years ago

Hi @valentineap - here are some more comments for you! I'll try not to be too repetitive, but I've got some overlap with @hemmelig's thoughts (plus a few other small things). Overall, I think this is a really neat tool, and as someone who has spent far too much time trying to compile and run reflectivity codes, I appreciate how easy it is to install and run.

Code functionality

I'm also a bit concerned about the test failing. I wonder if the finite difference approximation for phi is missing a 1/r term somewhere? Seems more likely to be something coordinate-related like that than an actual failure of the code.
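
For what it's worth, here is a toy illustration of the kind of 1/r effect I have in mind. It is plain NumPy, nothing pyprop8-specific: if the perturbation to phi is effectively applied as an arc length r * delta rather than as an angle delta, the central difference recovers (1/r) * df/dphi instead of df/dphi.

```python
import numpy as np

def f(r, phi):
    # Arbitrary smooth test function of cylindrical coordinates.
    return r**2 * np.sin(3.0 * phi)

def dfdphi(r, phi):
    # Analytic azimuthal derivative of f.
    return 3.0 * r**2 * np.cos(3.0 * phi)

r, phi, delta = 50.0, 0.7, 1.0e-6

# Central difference in the angle itself: agrees with the analytic derivative.
d_angle = (f(r, phi + delta) - f(r, phi - delta)) / (2.0 * delta)

# Central difference where the perturbation is an arc length (so phi changes
# by delta / r): this estimates (1/r) * df/dphi, hence the apparent mismatch.
d_arc = (f(r, phi + delta / r) - f(r, phi - delta / r)) / (2.0 * delta)

print(d_angle / dfdphi(r, phi))    # ~1.0
print(r * d_arc / dfdphi(r, phi))  # ~1.0 only once the factor of r is restored
```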

Documentation

It doesn't seem like you anticipate adding much to this package in the future, but I'd also endorse adding a line to the readme inviting people to (a) contact the authors or (b) submit a pull request if they want to contribute to the package.

The example provided in the documentation under "Calculating seismic observables" places the receivers on the water layer, which the code doesn't support. I know the next doc page has a working and fully annotated example, but it can't hurt to have this one also be fully functional.

Installation is smooth and the dependencies are, as promised, minimal. Still, it would be good to have a bulleted list of those in the readme and/or the documentation, including any "optional" dependencies like tqdm and matplotlib.

The paper

The "Statement of need" strongly emphasizes pyprop8's potential as a simple, lightweight tool for teaching and testing, but the "Applications" section lists some real-world research applications of the underlying methodology. When you say that "The computational approach described by O'Toole and Woodhouse (2011) (but not the pyprop8 implementation)" has been used for these things, does that mean that pyprop8 is implementing a simplified version of the O'Toole and Woodhouse formulation, or just that the original prop8 was used instead of this particular code? I'm guessing the latter, but given how strongly the Statement of need emphasizes teaching/outreach uses rather than research, it would be good to clarify that the methodology is fully implemented and the fact that it's easy to use is a nice bonus.

A suggestion which you can take or leave:

Any thoughts on adding some sort of LayeredStructureModel.earth_flattening_transform() method? For body waves, at least, it's easy to do (e.g. Chapman and Orcutt 1985), and it could give users an option for working with slightly larger ranges.
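
To make the suggestion concrete, here is a sketch of the standard body-wave earth-flattening transform that such a method might wrap (z_flat = a ln(a/r), v_flat = (a/r) v; see e.g. Chapman & Orcutt, 1985). This is generic NumPy based on those textbook formulas; the earth_flattening_transform method name above is only my suggestion and does not exist in pyprop8:

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def earth_flatten(depths_km, velocities_km_s, a=EARTH_RADIUS_KM):
    """Map a spherical-Earth depth/velocity profile to a flat-Earth equivalent.

    Standard body-wave transform: with r = a - z,
        z_flat = a * ln(a / r),   v_flat = (a / r) * v.
    """
    depths_km = np.asarray(depths_km, dtype=float)
    velocities_km_s = np.asarray(velocities_km_s, dtype=float)
    r = a - depths_km
    z_flat = a * np.log(a / r)
    v_flat = (a / r) * velocities_km_s
    return z_flat, v_flat

# Toy usage: flatten a two-point P-velocity profile (depths in km, v in km/s).
z_f, v_f = earth_flatten([0.0, 100.0], [5.8, 8.0])
```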

Most of these comments are fairly minor, but I'll put some of the small documentation things in a pull request and open an issue for the test failure since that seems to be the preferred way for JOSS.

valentineap commented 2 years ago

@hemmelig @hfmark Thank you both for taking the time to do this, and for your detailed comments -- much appreciated. I will take a few days to digest before I respond to anything.

@hemmelig The other reference for microseismic applications would be Tom O'Toole's PhD thesis: record here. IIRC, he had an unpublished chapter looking at Preese Hall fracking. Unfortunately it does not seem to be publicly available in anything other than dead-tree format.

@leouieda How does this work on a technical level -- i.e. how do you want me to keep track of what I've done to address each comment?

leouieda commented 2 years ago

@hemmelig @hfmark thank you for the detailed reviews and suggestions! πŸŽ‰

I have some further notes, but I’m not 100% sure how exactly to proceed. I suspect it would be easiest if I opened a pull request and referenced it to this review – would that be correct, @leouieda? These primarily expand on some of the notes already made above.

Yes, you are definitely encouraged to open issues and PRs (if you're feeling generous) directly in the source repository.

How does this work on a technical level -- i.e. how do you want me to keep track of what I've done to address each comment?

@valentineap my recommendation would be for you or the reviewers to open issues for each of the tasks outlined in the reviews (fixing the failing test, expanding the example documentation, creating community guidelines, formatting the code, documenting internal functions, editing the paper, etc.).

When opening issues or PRs, please include openjournals/joss-reviews#4217 in the description so that it links back to the review and is more easily tracked.

It doesn't seem like you anticipate adding much to this package in the future

Regardless, we still require the minimum community guidelines to be present for acceptance. See https://joss.readthedocs.io/en/latest/review_criteria.html#community-guidelines. Personally, I would recommend adding 2 files:

@valentineap please keep me and the reviewers informed of the progress and feel free to ping us in issues/PRs if needed. As I said, the JOSS review is usually interactive with all of us working towards making sure the software meets the requirements.

valentineap commented 2 years ago

@hemmelig @hfmark @leouieda Many thanks again for all your comments above. Apologies for being slow to respond. Hopefully I will have a bit more time in the next couple of weeks to make some progress on this.

As you will see I have created a number of issues that I hope capture the salient points of your reviews. Please let me know if you think I have missed something (or just create additional issues!). I will add any comments in response and document the actions taken within the issue tracker.

leouieda commented 2 years ago

Thank you for the update @valentineap! Take your time and let me know if you need any guidance on any of those issues.

leouieda commented 2 years ago

πŸ‘‹πŸΎ Hi @valentineap, just checking in to see how this is progressing. I see you've closed a lot of the open issues related to the review.

Just wanted to give a heads up that I'll be off on vacation between 4-27 July and won't be available. If you feel like you can wrap this up in the meantime, I can try to arrange for another editor to finalise the review while I'm away. Otherwise, we can pick this up when I return.

valentineap commented 2 years ago

Hi @leouieda, thanks for the info.

I think the end is in sight with this one, but there are a few bits and pieces I still need to tidy up. I'm happy for it to wait until you get back if that's easiest. Hope you have a good break!

leouieda commented 2 years ago

Thank you @valentineap πŸ‘πŸΎ I'll ping you again when I'm back to see how things have progressed.

valentineap commented 2 years ago

Hi @leouieda, I think I've finally dealt with all the comments and suggestions. Please see issues tagged joss-paper-reviewer-comment on the issue tracker for my responses.

Note that I have not yet updated the PyPI package, in case any further changes are needed.

Let me know if you need anything else from me!

And just for fun:

https://user-images.githubusercontent.com/4602811/183925321-20d11157-955a-4016-88c4-2499a0ccde5e.mp4

valentineap commented 2 years ago

@editorialbot generate pdf

editorialbot commented 2 years ago

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

hemmelig commented 2 years ago

I've had a look through the changes made in response to my review and am happy to say everything has been addressed. The new tests and the Idaho video example are great! I installed a fresh copy and ran the examples and everything worked just fine for me (Ubuntu 20.04, Python 3.8).

I realise the internal documentation requires a lot of time and effort, but it has definitely made it much clearer how the different functions relate to the ideas/maths set out in the papers (particularly with the additional notes)!

Conor

hfmark commented 2 years ago

I'll echo Conor - @valentineap, it looks like you've addressed all my comments and then some. The updated paper text looks great, and the code all seems to work on my system (Ubuntu 18.04, Python 3.9) including the excellent new test suite. My checklist is all set!

leouieda commented 2 years ago

@hemmelig @hfmark thank you for your thorough and thoughtful reviews! I just managed to catch up on all the changes and I agree with the reviewers that this submission meets all our criteria.

@valentineap I'll do a final editorial read of the paper now before we proceed with acceptance. I'll post the final checklist for you shortly as well. Well done and thank you for your (immense) patience so far πŸ™‚

valentineap commented 2 years ago

@editorialbot generate pdf

editorialbot commented 2 years ago

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

leouieda commented 2 years ago

@valentineap I'm very happy with the state of the paper/software and the reviewers are satisfied as well. It's my pleasure to move on to acceptance of your paper πŸŽ‰

There are just a few things that you'll need to do:

  1. Double check author names and affiliations (including ORCIDs)
  2. Make a release of the software with the latest changes from the review (including any editorial fixes). This is the version that will be used in the JOSS paper. Post the version number here once it's available.
  3. Archive the release on Zenodo or equivalent repository and post the DOI here.
  4. Make sure that the title and author list (including ORCIDs) in the archive above match those in the JOSS paper.

Once those are done, we can move on to publication! Let me know if you need any help with these or have any questions.

valentineap commented 2 years ago

Excellent! Thanks again @leouieda, @hfmark and @hemmelig - I really appreciate your input into this.

  1. I confirm that the authors and ORCIDs in the manuscript are correct
  2. After a bit of PyPI and Zenodo silliness, I think we are at version 1.1.2, which is now available via pip
  3. DOI: 10.5281/zenodo.7019949
  4. After some more silliness... I think they match!

So -- I hope that's all done, but let me know if there are any problems.

leouieda commented 2 years ago

@editorialbot set 10.5281/zenodo.7019949 as archive

editorialbot commented 2 years ago

Done! Archive is now 10.5281/zenodo.7019949

leouieda commented 2 years ago

@editorialbot set v1.1.2 as version

editorialbot commented 2 years ago

Done! version is now v1.1.2

leouieda commented 2 years ago

@editorialbot recommend-accept

editorialbot commented 2 years ago
Attempting dry run of processing paper acceptance...
editorialbot commented 2 years ago
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1111/j.1365-246X.2011.05210.x is OK
- 10.1111/j.1365-246X.2012.05608.x is OK
- 10.1029/2012GL054209 is OK
- 10.1093/gji/ggt473 is OK
- 10.3997/2214-4609.20142350 is OK
- 10.1785/0120150010 is OK
- 10.1093/gji/ggw108 is OK
- 10.1038/s41586-020-2649-2 is OK
- 10.1038/s41592-019-0686-2 is OK
- 10.1093/gji/ggac151 is OK

MISSING DOIs

- None

INVALID DOIs

- None
editorialbot commented 2 years ago

:wave: @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof :point_right::page_facing_up: Download article

If the paper PDF and the deposit XML files look good in https://github.com/openjournals/joss-papers/pull/3466, then you can now move forward with accepting the submission by compiling again with the command @editorialbot accept

arfon commented 2 years ago

@editorialbot accept

editorialbot commented 2 years ago
Doing it live! Attempting automated processing of paper acceptance...
editorialbot commented 2 years ago

🐦🐦🐦 πŸ‘‰ Tweet for this paper πŸ‘ˆ 🐦🐦🐦

editorialbot commented 2 years ago

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited :point_right: https://github.com/openjournals/joss-papers/pull/3473
  2. Wait a couple of minutes, then verify that the paper DOI resolves https://doi.org/10.21105/joss.04217
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! πŸŽ‰πŸŒˆπŸ¦„πŸ’ƒπŸ‘»πŸ€˜

Any issues? Notify your editorial technical team...

arfon commented 2 years ago

@hemmelig, @hfmark – many thanks for your reviews here and to @leouieda for editing this submission! JOSS relies upon the volunteer effort of people like you and we simply wouldn't be able to do this without you ✨

@valentineap – your paper is now accepted and published in JOSS :zap::rocket::boom:

editorialbot commented 2 years ago

:tada::tada::tada: Congratulations on your paper acceptance! :tada::tada::tada:

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.04217/status.svg)](https://doi.org/10.21105/joss.04217)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.04217">
  <img src="https://joss.theoj.org/papers/10.21105/joss.04217/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.04217/status.svg
   :target: https://doi.org/10.21105/joss.04217

This is how it will look in your documentation:

[DOI badge]

We need your help!

The Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us please consider doing either one (or both) of the following: