openjournals / joss-reviews

Reviews for the Journal of Open Source Software
Creative Commons Zero v1.0 Universal

[REVIEW]: The Kestrel software for simulations of morphodynamic Earth-surface flows #6079

Closed editorialbot closed 8 months ago

editorialbot commented 10 months ago

Submitting author: @jakelangham (Jake Langham)
Repository: https://github.com/jakelangham/kestrel/
Branch with paper.md (empty if default branch):
Version: v1.0.0
Editor: @crvernon
Reviewers: @mdpiper, @jatkinson1000
Archive: 10.5281/zenodo.10477693

Status

status

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/36cb6c0373fcbe7d38598fcfe878ab58"><img src="https://joss.theoj.org/papers/36cb6c0373fcbe7d38598fcfe878ab58/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/36cb6c0373fcbe7d38598fcfe878ab58/status.svg)](https://joss.theoj.org/papers/36cb6c0373fcbe7d38598fcfe878ab58)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@mdpiper & @jatkinson1000, your review will be checklist-based. Each of you will have a separate checklist that you should update when carrying out your review. First of all, you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If you have any questions or concerns, please let @crvernon know.

✨ Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest ✨

Checklists

πŸ“ Checklist for @jatkinson1000

πŸ“ Checklist for @mdpiper

editorialbot commented 10 months ago

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf
editorialbot commented 10 months ago
Software report:

github.com/AlDanial/cloc v 1.88  T=0.07 s (939.0 files/s, 346252.4 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Fortran 90                      31           3102           2525          12230
m4                               9            362             16           1875
reStructuredText                12            726            105           1321
C++                              2            180            157            835
Julia                            2             95             94            501
TeX                              1             16              0            233
Markdown                         2             38              0            229
C/C++ Header                     2             52            345            144
make                             3              9              7             43
Bourne Shell                     1             10              6             41
Python                           1             14             13             29
DOS Batch                        1              8              1             26
YAML                             2              6             24             25
-------------------------------------------------------------------------------
SUM:                            69           4618           3293          17532
-------------------------------------------------------------------------------

gitinspector failed to run statistical information for the repository
editorialbot commented 10 months ago

Wordcount for paper.md is 1569

editorialbot commented 10 months ago
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.5194/gmd-2023-80 is OK
- 10.5194/egusphere-2023-1301 is OK
- 10.1016/j.jvolgeores.2004.06.014 is OK
- 10.1007/978-3-030-34356-9_10 is OK
- 10.3389/feart.2020.00275 is OK
- 10.1098/rspa.2013.0819 is OK
- 10.1098/rspa.2013.0820 is OK
- 10.1029/2019JF005204 is OK
- 10.1007/s10346-021-01733-2 is OK
- 10.1016/j.coldregions.2010.04.005 is OK
- 10.1016/j.ijdrr.2022.103338 is OK
- 10.1002/2013RG000447 is OK
- 10.1061/(ASCE)0733-9429(2004)130:7(689) is OK
- 10.1017/jfm.2021.235 is OK
- 10.1017/s0022112098002250 is OK
- 10.1007/b138657 is OK
- 10.1029/97RG00426 is OK
- 10.1016/0377-0273(90)90082-Q is OK
- 10.1002/esp.1127 is OK

MISSING DOIs

- None

INVALID DOIs

- None
editorialbot commented 10 months ago

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

crvernon commented 10 months ago

πŸ‘‹ @jakelangham, @mdpiper, and @jatkinson1000 - This is the review thread for the paper. All of our communications will happen here from now on.

Please read the "Reviewer instructions & questions" in the first comment above.

Both reviewers have checklists at the top of this thread (in that first comment) with the JOSS requirements. As you go over the submission, please check any items that you feel have been satisfied. There are also links to the JOSS reviewer guidelines.

The JOSS review is different from most other journals. Our goal is to work with the authors to help them meet our criteria instead of merely passing judgment on the submission. As such, the reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention https://github.com/openjournals/joss-reviews/issues/6079 so that a link is created to this thread (and I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread. In my experience, it is better to post comments/questions/suggestions as you come across them instead of waiting until you've reviewed the entire package.

We aim for the review process to be completed within about 4-6 weeks, but please make a start well ahead of this: JOSS reviews are by their nature iterative, and any early feedback you can provide to the author will be very helpful in meeting this schedule.

jatkinson1000 commented 10 months ago

Review checklist for @jatkinson1000

Conflict of interest

Code of Conduct

General checks

Functionality

Documentation

Software paper

mdpiper commented 10 months ago

Review checklist for @mdpiper

Conflict of interest

Code of Conduct

General checks

Functionality

Documentation

Software paper

jatkinson1000 commented 10 months ago

@jakelangham a couple of points as I work through this:

I'll now read the paper in more detail and follow up with related comments.

jakelangham commented 10 months ago

@jatkinson1000 I'll work through these in due course.

I actually didn't realise a statement of need was required for the docs, but it makes complete sense.

Regarding API documentation, it would be good to clarify this before proceeding much further and to get your opinion. We thought about this issue before submission and came to the view that independent, auto-generated API documentation was a bit too heavy-duty for our purposes. While it is a sizeable codebase, the control flow is pretty linear and the routines that are 'library-like' in their design are probably in the minority. This is my own prejudice, but I have seen a fair few research codes where the auto-generated API docs were totally unhelpful compared with just reading the source code.

I remember reading somewhere in the JOSS docs that for Fortran/C++ codes, having separate API documentation is optional (but encouraged), provided that the source code is well commented. So at the time we made the decision to concentrate on improving our source, since that seemed like a more obvious benefit to us, rather than spending time on a more formal API. We were aiming to make sure that every module, and each of the most important routines, had an explanatory header, so hopefully you should find that this is the case. Let me know whether you think this is sufficient.

jatkinson1000 commented 10 months ago

@jakelangham with regards to the paper:

Overall I think this is well-written and organised software, and it is clearly a significant piece of work. I am more than happy to recommend publication in JOSS subject to resolving the minor issues raised above, the points which I believe will strengthen the paper, and any comments from @mdpiper.

jatkinson1000 commented 10 months ago

Regarding API documentation, it would be good to clarify this before proceeding much further and to get your opinion. We thought about this issue before submission and came to the view that independent, auto-generated API documentation was a bit too heavy-duty for our purposes. While it is a sizeable codebase, the control flow is pretty linear and the routines that are 'library-like' in their design are probably in the minority. This is my own prejudice, but I have seen a fair few research codes where the auto-generated API docs were totally unhelpful compared with just reading the source code.

@jakelangham Yep, I agree, and I don't think it is necessary to add full API documentation here to satisfy review. I have taken a look at the source files and they are very well commented, in my opinion, and should allow users who want to adapt the code in detail to get started.

That said, it is indeed encouraged, useful as projects/collaborators grow (speaking from experience with large climate models...), and easier to get on top of the sooner you start! I also feel that, when done well for appropriate routines/functions, it can be a useful tool, but like many things it can be done badly! So do perhaps consider it in the future πŸ˜ƒ

jakelangham commented 10 months ago

@jatkinson1000 A few updates...

@jakelangham a couple of points as I work through this:

* [ ]  I had to do some work to run the tests and have opened issue [Test script failure on mac jakelangham/kestrel#12](https://github.com/jakelangham/kestrel/issues/12)

https://github.com/jakelangham/kestrel/commit/4a3799b9f39d25e88d01aebeb3dbb44fae982d0a fixes this I believe.

* [ ]  The web docs have no information about running the tests (only the README.md).
  Could you repeat the testing info on the web docs please?

Done, in https://github.com/jakelangham/kestrel/commit/f80a28837dc4c4428909529e25944778876851bf

* [ ]  Similarly, the contribution guidelines are only available on the web documentation.
  Consider adding a short section (even if just a link) in the GitHub README.md

Done, in https://github.com/jakelangham/kestrel/commit/7a7c83ccc9b160c9cfc914b3a067e06990b71e8f

jakelangham commented 10 months ago
* [ ]  Whilst there _is_ a statement of need, it is very brief.
  Consider expanding it slightly with more description. This should also help make the software more discoverable.
  Consider also adding tags and a website link to the GitHub 'about'.

@jatkinson1000 We have updated the github README.md now and added some tags to address this suggestion.

jakelangham commented 10 months ago

@jatkinson1000 See replies below...

@jakelangham with regards to the paper:

* [ ]  Not super important, but it is not clear to me why the project is called `kestrel`.
  It does not appear to be an acronym. Explaining this would be nice and, I feel, somewhat useful.

There isn't really a scientific reason for the name choice. But it's typical to view simulation data from a bird's-eye perspective, and kestrels spend a lot of time hovering in place when they're hunting, so the name felt appropriate to me among the options I considered. Not sure I want to include that in the paper though.

* [ ]  Line 41 there is an issue with reference `Iverson2015`

Should be fixed by https://github.com/jakelangham/kestrel/commit/ec77d54b6d064611f81c50afd1afe1342c80efaf.

* [ ]  I don't understand the date range in reference `Mergili (2014-2023)`. Is this a typo/bibtex error?

https://github.com/jakelangham/kestrel/commit/ee063490212a986c79639e39edb3b2cd381d7182 removes the 2014- part. This is one of several website references I included when citing alternative codes and to be honest, I don't know what the right convention is for the year of publication. Listing them all as 2023 doesn't feel quite right when some codes have been around for a decade or so.

* [ ]  There are several alternative codes listed, including a number which are open source.
  The principal argument for kestrel seems to be that 'others are not well documented'. I would really like to see justification beyond this - why did you choose to create a new model rather than contributing to improve the documentation of an existing open-source model?
  This is made clearer from line 91, but I suggest restructuring to make it clear when talking about other codes that kestrel has different physics (Langham et al. (2023)) to improve morphodynamic modelling and ease of use. This is, in my view, the main motivation.
  As far as I can tell this is companion software to a new modelling approach described in Langham et al. (2023), and hence not available in the other codes.
  Being better documented than others is, to me, a corollary of this being quality research software, not a primary motivation.

I've tried to address this in https://github.com/jakelangham/kestrel/commit/61aa2b3cac2fe59a2c2d83e89c365767c7dcc887. My inclination was to preserve the 'line 91' paragraph as is, since it follows the description of the equations, but highlight up front earlier that the model framework and scheme are new. It is difficult to go into much more detail without making things very technical, which I think is better suited to the companion article that we refer to.

* [ ]  The final sentence of the penultimate paragraph (starting line 98 "These include...") doesn't read right to me and seems out of place. What is "These" referring to?

It's meant to follow directly on from (2) in the last sentence. Perhaps https://github.com/jakelangham/kestrel/commit/776da5a8afb9e0992fb0043daf6d2075de959b09 makes it easier to scan?

* [ ]  When describing the 'many free parameters' perhaps refer to the online input documentation which has longer description of these?

Done - https://github.com/jakelangham/kestrel/commit/7ec1577eae91fa81d3ea8b0e39d324cbd7d3ba26

* [ ]  The paper can, at times, sound a little defensive about the fact that kestrel is simpler than other codes.
  Perhaps include a brief point about the importance of reduced-order modelling for physical understanding (you do hint at this to a degree). I have been in a similar situation with codes I have written, so am familiar with it.

I've reworded things a bit here and there to try and address this - for example in https://github.com/jakelangham/kestrel/commit/32ea8c5b464c97a23dc5bf96f09bd60058ad41a4 and elsewhere - see what you think.

The only additional comment to address is the inclusion of the schematic figure from the docs. The figure itself would need to be modified a little to align it with the paper. I'm not against this in principle, assuming it doesn't conflict with any length requirements, but would prefer to wait for @mdpiper's review before doing this.

jatkinson1000 commented 10 months ago

Thanks @jakelangham, that's looking good.

https://github.com/jakelangham/kestrel/commit/ee063490212a986c79639e39edb3b2cd381d7182 removes the 2014- part. This is one of several website references I included when citing alternative codes and to be honest, I don't know what the right convention is for the year of publication. Listing them all as 2023 doesn't feel quite right when some codes have been around for a decade or so.

Right, I follow. My understanding is that you cite the 'date accessed' for web resources. Ideally there would be a JOSS/GMD paper or repository with a DOI you could also cite, but these are not common for older (and indeed many current :cry:) codes. Here I would go with 2014 if it hasn't changed since then, or 2023 if it has. Looking at avaflow it has release numbers, so I would add that to the citation with date 2023 to be clear what you are comparing this work to.

The figure itself would need to be modified a little to align it with the paper. I'm not against this in principle, assuming it doesn't conflict with any length requirements, but would prefer to wait for @mdpiper's review before doing this.

I personally feel it would greatly improve the readability of the paper, but happy to wait for @mdpiper 's opinion.

I've tried to address this in https://github.com/jakelangham/kestrel/commit/61aa2b3cac2fe59a2c2d83e89c365767c7dcc887. My inclination was to preserve the 'line 91' paragraph as is, since it follows the description of the equations, but highlight up front earlier that the model framework and scheme are new.

I agree with not being too technical early on, but my issue at the moment is that, when reading the paper, it comes across that the principal reason for developing kestrel is better documentation and an inability to understand other models. In reality, I think the case is that kestrel models different physics to the alternatives, and this is why a new code has been written. I believe it would greatly help the paper to make this clear when introducing the other models - you can leave the technical discussion of the physics until later, but give a layperson's description, something like: "kestrel is capable of simulating a wide range of flow types, from fluid to granular, which is not possible with any single alternative code, and implements the new modelling approach defined in Langham (2023) and expanded upon below" (this may not be quite correct, as I am not overly familiar, but you get the idea). I would also put this ahead of the comments on documentation, as I think it is a more important motivation.

jakelangham commented 10 months ago

@jatkinson1000 Ok, I take the point - it should be possible to rewrite closer to what you're suggesting, with the caveat that it's difficult to compare all these codes directly. The phrase about documentation could even be removed, to be honest. Like the figure issue, I think I'll park this for now and come to it with a fresh perspective when addressing the next round of review comments.

crvernon commented 10 months ago

:wave: @jakelangham , @mdpiper , and @jatkinson1000

Great progress so far on this review! Could you provide an update here to where things stand when you have a moment? Thanks!

mdpiper commented 10 months ago

@crvernon I have just a few more items on the checklist to evaluate. I'll finish my review this week.

jatkinson1000 commented 10 months ago

@crvernon I have carried out my review and suggested a few improvements. Most of these have already been made, with a few awaiting @mdpiper's review/comments. Once these are done I will check again and we can hopefully get this published!

mdpiper commented 10 months ago

@crvernon I've completed my review.

@jatkinson1000 Thank you for your very careful and thorough review. You caught many issues before I got to them.

@jakelangham It was a pleasure to learn about Kestrel. I found the code easy to read with an economical amount of comments (I like code to speak for itself). The documentation was well-written, easy to follow, and thorough. Use of GNU autotools and text-based configuration will make the code easier to maintain over time. As a professional consumer of other people's software, I often make snap judgments based on the organization of a project's repository and the clarity of its README. Yours easily passes.

I have a few comments that don't rise to the level of issues (sorry, JOSS review rules). These comments come from the perspective of someone (like me!) considering using your model, and I hope to make it easier for them. The comments do not affect my review, and I don't require a response; I merely ask you to consider them.

  1. It feels awkward to run the executable from the `src/` directory. It might be good to remind users in the Installation section that they can use `configure --prefix`, then execute `make install` to put the `kestrel` executable in the path.
  2. Requiring GCC is a limitation. What about users who have only MSVC or Clang/Flang? Although I didn't try super hard, I didn't get Kestrel to compile on my Mac with Clang. (I didn't try Windows, because I'm slightly allergic to it.) I ended up using Docker to build, test, and run Kestrel. I mention this because often something like this is enough for me to turn away from a software package to an equivalent that does work. There are usually many options available, and I'm looking for the path of least resistance.
  3. I wish Kestrel was packaged. This would obviate my two previous comments. I don't have a great recommendation for this. I love to use conda. Others use fpm.
  4. I agree with @jatkinson1000 that using a documentation system like FORD would be good.
  5. Providing a couple example Python or Julia scripts for analyzing the output of the two example cases would be really helpful to people, I think. (I acknowledge the existence of the gnuplot example in the docs.)
  6. In your published journal articles, you describe the physics of Kestrel and the numerical techniques used. I'd hoped here that you could describe the software development choices you made in creating Kestrel. I know this is a bit vague, but maybe things like why Fortran?
  7. For your future self, set up continuous integration. I see you're already using GitHub Actions for the paper. It would be straightforward to add a workflow to build/install/test the code.

OK, I think that's it. Thank you for a great contribution.

crvernon commented 9 months ago

Thanks @mdpiper !

jakelangham commented 9 months ago

@jatkinson1000

I've made a few improvements which I think address the remainder of your unresolved comments. Specifically, https://github.com/jakelangham/kestrel/commit/299d518c6e5bbe98a1c89315583254c5b747e3e5 adds a slightly revised version of the flow diagram from the docs, https://github.com/jakelangham/kestrel/commit/3c66587314e9de0f4127de1b4cc03a8981105911 rejigs the statement of need to clarify our contribution a bit better and https://github.com/jakelangham/kestrel/commit/64585b34e81184dff76e72f15662115bac286379 adds version numbers to software references where I could find them.

jatkinson1000 commented 9 months ago

Hi @crvernon @jakelangham @mdpiper

I'm happy with the paper and software now - both are well-written and I recommend publication in the latest state, hopefully in time for the end of the year! πŸš€

Considering @mdpiper's comments, I don't see anything to prevent publication, but encourage packaging in future (fpm is a good suggestion), and agree that the option of setting the executable location somewhere other than `src/` would be a nice additional feature. Whilst wide compiler support is nice, depending on GCC is at least much less restrictive than depending on the Intel suite etc.

jakelangham commented 9 months ago

Thanks for your kind comments @mdpiper. I broadly agree with all your numbered comments, though a couple of them I'd view as future aspirations for us rather than things that will be easily addressed in the short term. I think I've addressed everything that would potentially impact the actual manuscript here (i.e. point 6); I ran out of time today to do much with point 5, so if that's a deal-breaker, let me know and I can get to it this week. Please see individual responses below...

  1. It feels awkward to run the executable from the `src/` directory. It might be good to remind users in the Installation section that they can use `configure --prefix`, then execute `make install` to put the `kestrel` executable in the path.

Agreed and addressed in https://github.com/jakelangham/kestrel/commit/d55bab54566857d79ff1269b1b5930f73dc4574d
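For anyone unfamiliar with the pattern point 1 refers to, here is a self-contained stub demonstration of why an install prefix plus `PATH` entry helps. This is a hypothetical sketch, not Kestrel itself: it fakes an installed executable, whereas the real autotools flow would be along the lines of `./configure --prefix=$HOME/.local && make && make install`.

```shell
# Stub illustration of the --prefix / make install pattern: once a
# binary sits under $prefix/bin and that directory is on PATH, the
# user can run it from anywhere. The "kestrel" below is a placeholder
# script, not the real executable.
prefix="$(mktemp -d)"                        # stand-in for e.g. $HOME/.local
mkdir -p "$prefix/bin"
printf '#!/bin/sh\necho kestrel-stub\n' > "$prefix/bin/kestrel"
chmod +x "$prefix/bin/kestrel"
PATH="$prefix/bin:$PATH"                     # what users add to their shell rc
kestrel                                      # found on PATH; prints "kestrel-stub"
```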

  2. Requiring GCC is a limitation. What about users who have only MSVC or Clang/Flang? Although I didn't try super hard, I didn't get Kestrel to compile on my Mac with Clang. (I didn't try Windows, because I'm slightly allergic to it.) I ended up using Docker to build, test, and run Kestrel. I mention this because often something like this is enough for me to turn away from a software package to an equivalent that does work. There are usually many options available, and I'm looking for the path of least resistance.

This is a fair point. However, at the time of writing I don't have access to a Windows/MSVC system to work with. If there are users in the future who want to use this platform, I'll definitely consider trying to support it. The option to compile on Mac seems more pressing. It seems that @jatkinson1000 was able to get things working on a Mac and we were able to fix some issues that arose in the process. I'm guessing this was via Homebrew gfortran (though quite possibly the C++ portions of the code ended up being compiled by Clang anyway). If @jatkinson1000 has any pertinent details to add regarding the install process, I'd happily add them to the documentation.

  3. I wish Kestrel was packaged. This would obviate my two previous comments. I don't have a great recommendation for this. I love to use conda. Others use fpm.

In fact, our plan was to prepare a proper 'release' after the paper was accepted, in case the reviews uncovered any critical issues. If you're happy with that strategy then that's what we'll do.

  4. I agree with @jatkinson1000 that using a documentation system like FORD would be good.

This is definitely on the cards for the future, especially if we are able to grow our number of contributors.

  5. Providing a couple of example Python or Julia scripts for analyzing the output of the two example cases would be really helpful to people, I think. (I acknowledge the existence of the gnuplot example in the docs.)

Agreed. I'll discuss this a bit with my co-author @markwoodhouse and think about what's best to put in from a potential user's point of view.
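As a sketch of the kind of analysis script being suggested, consider the following. This is a hypothetical example: the depth field is synthetic NumPy data standing in for Kestrel's NetCDF output (which in practice one would load with a library such as netCDF4 or xarray), and all variable names and thresholds are illustrative rather than Kestrel's actual output schema.

```python
# Illustrative post-processing sketch. A real script would replace the
# synthetic field below with data read from Kestrel's NetCDF output,
# e.g. via netCDF4.Dataset(...); names here are hypothetical.
import numpy as np

dx = 2.0                           # grid spacing in metres (illustrative)
x = np.arange(0.0, 100.0, dx)
y = np.arange(0.0, 100.0, dx)
X, Y = np.meshgrid(x, y)

# Synthetic flow-depth field: a Gaussian mound standing in for real output.
depth = 1.5 * np.exp(-((X - 50.0) ** 2 + (Y - 50.0) ** 2) / (2.0 * 15.0 ** 2))

# Two simple diagnostics a user might want: peak depth and inundated area.
wet = depth > 0.01                 # wet/dry threshold in metres (illustrative)
area = wet.sum() * dx * dx         # inundated area in square metres
print(f"peak depth: {depth.max():.2f} m")
print(f"inundated area: {area:.0f} m^2")
```

A handful of small scripts in this spirit, one per example case, would let users sanity-check their own runs against expected diagnostics.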

  6. In your published journal articles, you describe the physics of Kestrel and the numerical techniques used. I'd hoped here that you could describe the software development choices you made in creating Kestrel. I know this is a bit vague, but maybe things like why Fortran?

In terms of conscious decision making, there are a few things in this vein that we've already mentioned early on in the preprint - the modular design of key 'ingredients'; our attempt to keep the dependencies list short and the choice of NetCDF for the data format. One other thing is the memory management which a few readers might be interested in. I've added a little more material in this direction in https://github.com/jakelangham/kestrel/commit/a2cb13709d33de0375c9f0dffee4ae690befd9d5

Though I'm quite partial to Fortran, I can't think of a publishably defensible reason to favour it wholeheartedly over writing the whole code in C++. Both options would seem to have pros and cons. Of course, at this point, there's no turning back for us. ;)

  7. For your future self, set up continuous integration. I see you're already using GitHub Actions for the paper. It would be straightforward to add a workflow to build/install/test the code.

100%. In fact, I raised an issue about this the other day: https://github.com/jakelangham/kestrel/issues/16 - I just need to get my head around setting it up.
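For reference, a build/test workflow along the lines point 7 suggests might look like the following. This is an untested sketch: the file path, the apt package names, and the `make check` target are assumptions that would need adjusting to the repository's actual dependencies and test script.

```yaml
# .github/workflows/build.yml -- illustrative only; package names and
# the test target are assumptions, not taken from the Kestrel repo.
name: build-and-test
on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install dependencies
        run: sudo apt-get update && sudo apt-get install -y gfortran libnetcdf-dev libnetcdff-dev
      - name: Configure and build
        run: ./configure && make
      - name: Run tests
        run: make check   # substitute the project's actual test script
```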

jatkinson1000 commented 9 months ago

Nothing special on Mac, but yes, I used gcc installed via Homebrew (same for netcdf and other dependencies).

RE: Fortran vs C++ - as a numeric code, there could perhaps be an argument made based on Fortran's native array handling capabilities?

jakelangham commented 9 months ago

Ok thanks, I have added a comment about homebrew to the dependencies section of the docs https://github.com/jakelangham/kestrel/commit/1ef52018f331c4c71d34385279f3eb22a1092d51

jatkinson1000 commented 9 months ago

Hi @crvernon @mdpiper @jakelangham, Just doing some new year cleaning and is there anything preventing this from being published and closed? I am certainly happy with the current revised state.

crvernon commented 9 months ago

@editorialbot check references

editorialbot commented 9 months ago
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.5194/gmd-2023-80 is OK
- 10.5194/egusphere-2023-1301 is OK
- 10.1016/j.jvolgeores.2004.06.014 is OK
- 10.1007/978-3-030-34356-9_10 is OK
- 10.3389/feart.2020.00275 is OK
- 10.1098/rspa.2013.0819 is OK
- 10.1098/rspa.2013.0820 is OK
- 10.1029/2019JF005204 is OK
- 10.1007/s10346-021-01733-2 is OK
- 10.1016/j.coldregions.2010.04.005 is OK
- 10.1016/j.ijdrr.2022.103338 is OK
- 10.1002/2013RG000447 is OK
- 10.1061/(ASCE)0733-9429(2004)130:7(689) is OK
- 10.1017/jfm.2021.235 is OK
- 10.1017/s0022112098002250 is OK
- 10.1007/b138657 is OK
- 10.1029/97RG00426 is OK
- 10.1016/0377-0273(90)90082-Q is OK
- 10.1002/esp.1127 is OK

MISSING DOIs

- None

INVALID DOIs

- None
crvernon commented 9 months ago

@editorialbot generate pdf

editorialbot commented 9 months ago

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

crvernon commented 9 months ago

:wave: @jakelangham this submission is very well done! We are almost there! Next is just setting up the archive for your new release.

We want to make sure the archival has the correct metadata that JOSS requires. This includes a title that matches the paper title and a correct author list.

So here is what we have left to do:

I can then move forward with accepting the submission.

jatkinson1000 commented 9 months ago

Hi @crvernon I think these are tasks for @jakelangham (the author).

crvernon commented 9 months ago

@jatkinson1000 Ha! Yes! My apologies! I just edited my original request. Thank you!

jakelangham commented 9 months ago

@crvernon Have done this now - the DOI is 10.5281/zenodo.10477693. Let me know if you need anything more.

crvernon commented 9 months ago

@editorialbot check references

editorialbot commented 9 months ago
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.5194/gmd-2023-80 is OK
- 10.5194/egusphere-2023-1301 is OK
- 10.1016/j.jvolgeores.2004.06.014 is OK
- 10.1007/978-3-030-34356-9_10 is OK
- 10.3389/feart.2020.00275 is OK
- 10.1098/rspa.2013.0819 is OK
- 10.1098/rspa.2013.0820 is OK
- 10.1029/2019JF005204 is OK
- 10.1007/s10346-021-01733-2 is OK
- 10.1016/j.coldregions.2010.04.005 is OK
- 10.1016/j.ijdrr.2022.103338 is OK
- 10.1002/2013RG000447 is OK
- 10.1061/(ASCE)0733-9429(2004)130:7(689) is OK
- 10.1017/jfm.2021.235 is OK
- 10.1017/s0022112098002250 is OK
- 10.1007/b138657 is OK
- 10.1029/97RG00426 is OK
- 10.1016/0377-0273(90)90082-Q is OK
- 10.1002/esp.1127 is OK

MISSING DOIs

- None

INVALID DOIs

- None
crvernon commented 9 months ago

@editorialbot generate pdf

editorialbot commented 9 months ago

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

crvernon commented 9 months ago

@editorialbot set v1.0.0 as version

editorialbot commented 9 months ago

Done! version is now v1.0.0

crvernon commented 9 months ago

@editorialbot set 10.5281/zenodo.10477693 as archive

editorialbot commented 9 months ago

Done! archive is now 10.5281/zenodo.10477693

crvernon commented 9 months ago

πŸ‘‹ - @jakelangham I am recommending that this submission be accepted for publication. An EiC will review shortly and if all goes well this will go live soon! Thanks to @mdpiper and @jatkinson1000 for a timely and constructive review!

crvernon commented 9 months ago

@editorialbot recommend-accept

editorialbot commented 9 months ago
Attempting dry run of processing paper acceptance...
editorialbot commented 9 months ago
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.5194/gmd-2023-80 is OK
- 10.5194/egusphere-2023-1301 is OK
- 10.1016/j.jvolgeores.2004.06.014 is OK
- 10.1007/978-3-030-34356-9_10 is OK
- 10.3389/feart.2020.00275 is OK
- 10.1098/rspa.2013.0819 is OK
- 10.1098/rspa.2013.0820 is OK
- 10.1029/2019JF005204 is OK
- 10.1007/s10346-021-01733-2 is OK
- 10.1016/j.coldregions.2010.04.005 is OK
- 10.1016/j.ijdrr.2022.103338 is OK
- 10.1002/2013RG000447 is OK
- 10.1061/(ASCE)0733-9429(2004)130:7(689) is OK
- 10.1017/jfm.2021.235 is OK
- 10.1017/s0022112098002250 is OK
- 10.1007/b138657 is OK
- 10.1029/97RG00426 is OK
- 10.1016/0377-0273(90)90082-Q is OK
- 10.1002/esp.1127 is OK

MISSING DOIs

- None

INVALID DOIs

- None
editorialbot commented 9 months ago

:warning: Error preparing paper acceptance. The generated XML metadata file is invalid.

IDREFS attribute rid references an unknown ID "eqU003AgoverningU0020eqsU00201"
crvernon commented 9 months ago

@xuanxu would you mind taking a look at the following XML metadata error that raised on recommend-accept please?:

⚠️ Error preparing paper acceptance. The generated XML metadata file is invalid.

IDREFS attribute rid references an unknown ID "eqU003AgoverningU0020eqsU00201"