openjournals / joss-reviews

Reviews for the Journal of Open Source Software
Creative Commons Zero v1.0 Universal

[REVIEW]: einprot: flexible, easy-to-use, reproducible workflows for statistical analysis of quantitative proteomics data #5750

Closed: editorialbot closed this issue 10 months ago

editorialbot commented 12 months ago

Submitting author: @csoneson (Charlotte Soneson)
Repository: https://github.com/fmicompbio/einprot
Branch with paper.md (empty if default branch): joss
Version: v0.7.7
Editor: @fboehm
Reviewers: @AnthonyOfSeattle, @ByrumLab
Archive: 10.5281/zenodo.8298657

Status

status

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/200654a73547392769c222793680a83a"><img src="https://joss.theoj.org/papers/200654a73547392769c222793680a83a/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/200654a73547392769c222793680a83a/status.svg)](https://joss.theoj.org/papers/200654a73547392769c222793680a83a)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@AnthonyOfSeattle & @ByrumLab, your review will be checklist-based. Each of you will have a separate checklist that you should update when carrying out your review. First of all, you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If you have any questions or concerns, please let @fboehm know.

✨ Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest ✨

Checklists

πŸ“ Checklist for @ByrumLab

πŸ“ Checklist for @AnthonyOfSeattle

editorialbot commented 12 months ago

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf
editorialbot commented 12 months ago
Software report:

github.com/AlDanial/cloc v 1.88  T=0.13 s (868.7 files/s, 278485.0 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
R                               94           1573           5136          21405
Rmd                              5            666           1673           2448
XML                              2              0              0           1182
TeX                              3             60              0            764
Markdown                         4            157              0            346
YAML                             2             23              5            127
SQL                              1              0              0             19
-------------------------------------------------------------------------------
SUM:                           111           2479           6814          26291
-------------------------------------------------------------------------------

gitinspector failed to run statistical information for the repository
editorialbot commented 12 months ago

Wordcount for paper.md is 1402

editorialbot commented 12 months ago
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1038/nmeth.4256 is OK
- 10.1101/2023.06.26.546625 is OK
- 10.1093/bioadv/vbab041 is OK
- 10.1093/bioinformatics/btaa620 is OK
- 10.1021/acs.jproteome.0c00398 is OK
- 10.1021/acs.jproteome.9b00496 is OK
- 10.1371/journal.pcbi.1009148 is OK
- 10.1093/bioinformatics/btu305 is OK
- 10.1074/mcp.M113.031591 is OK
- 10.1074/mcp.M114.041012 is OK
- 10.1016/j.cell.2015.09.053 is OK
- 10.1083/jcb.200911091 is OK
- 10.1038/nprot.2009.36 is OK
- 10.1074/mcp.RA120.002105 is OK
- 10.3390/proteomes9010015 is OK
- 10.1038/nbt.1511 is OK
- 10.1186/s12864-022-09058-7 is OK
- 10.1101/416511 is OK
- 10.1038/nmeth.3901 is OK
- 10.1016/j.jprot.2020.103669 is OK
- 10.12688/f1000research.14966.1 is OK
- 10.1038/75556 is OK
- 10.1093/genetics/iyad031 is OK
- 10.1093/nar/gky973 is OK
- 10.1093/genetics/iyab222 is OK
- 10.1093/nar/gkn1005 is OK
- 10.1038/nmeth.3252 is OK
- 10.1038/s41592-019-0654-x is OK
- 10.1016/j.molcel.2023.06.001 is OK
- 10.1093/nar/gkac610 is OK
- 10.1021/pr300273g is OK
- 10.1038/nprot.2017.147 is OK
- 10.1021/acs.jproteome.2c00441 is OK
- 10.1016/j.cell.2021.01.004 is OK
- 10.1021/acs.jproteome.2c00390 is OK
- 10.1038/s41586-018-0153-8 is OK
- 10.1371/journal.pcbi.1010752 is OK
- 10.1021/acs.jproteome.2c00812 is OK

MISSING DOIs

- None

INVALID DOIs

- None
editorialbot commented 12 months ago

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

ByrumLab commented 12 months ago

Review checklist for @ByrumLab

Conflict of interest

Code of Conduct

General checks

Functionality

Documentation

Software paper

AnthonyOfSeattle commented 11 months ago

Review checklist for @AnthonyOfSeattle

Conflict of interest

Code of Conduct

General checks

Functionality

Documentation

Software paper

ByrumLab commented 11 months ago

Hi @csoneson, thank you for the nice proteomics software package. The code is easy to install, and analysis pipelines are easy to customize, which will be valuable as new mass spec methods arise. The generation of full reports for each project enhances reproducibility for data sharing. I have a couple of minor comments that may help new users.

  1. I added a minor issue in the GitHub repo to add a link to the other workflow examples to the README. https://github.com/fmicompbio/einprot/issues/12

  2. A minor question, why is the object called a SingleCellExperiment? Single cell makes me think of single-cell RNAseq or the development of single-cell proteomics technologies.

csoneson commented 11 months ago

Thanks @ByrumLab!

  1. Thanks for this suggestion - I agree, and I have added the link to the README.
  2. The SingleCellExperiment class is a standard data container from the Bioconductor project - it was designed as an extension to the SummarizedExperiment container, with single-cell data in mind. However, despite the name, it is suitable for storing any rectangular data set (and the advantage compared to the SummarizedExperiment object is that in addition to abundance values and annotations for samples and features, it can also hold low-dimensional representations, in our case obtained by PCA). Rather than defining a new data structure, we decided to make use of this well-established container, since it has everything we need and in addition makes it possible to directly apply many functions from a variety of Bioconductor packages.
ByrumLab commented 11 months ago

Thanks @csoneson for the explanation. I assumed this was the case but I appreciate the explanation.

@fboehm I have no further comments and I have completed the checklist.

csoneson commented 11 months ago

For completeness, I've added an FAQ section to the vignette, and included a note about the motivation for the use of the SingleCellExperiment - thanks @ByrumLab for raising this point!

AnthonyOfSeattle commented 11 months ago

Hi @csoneson, congrats on this well-built and complete R package. It is clear that this represents a significant effort born out of years of performing fundamental analyses in quantitative proteomics. I believe it will see use both in traditional proteomics labs, who may be looking to automate their initial analyses of data, as well as in labs taking their first steps into proteomics, who may benefit from the structured approach.

Looking into the code, I can see a GitHub action checking both PRs and merges, which is great. To confirm all dependencies are listed, I installed the package into a clean Docker image and ran tests. Everything worked just as expected. It was also good to see multiple types of user input checks at the start of functions, which give users feedback about types and errors. The package can read the output from multiple types of analysis software into the same Bioconductor object, and reports appear logically laid out and uniform in structure. Overall, I like the consistency of the design of the API. For most of the questions I had about the package, I was able to find the answers in the vignette. There seem to be a lot of customizations that individuals can specify, and you guide them through their options quite thoroughly.

I was particularly interested in diving into the statistical portion of the code. From what I can see, the code has out-of-the-box support for single-factor designs with a batch term. If a user has a multi-factor design, say {male, female}X{control, treated}, is the proper way to handle this data to have a single group column with values chosen from {male_control, male_treated, female_control, female_treated}? I see the following quote in the section The sample annotation table:

This data.frame must have at least two columns, named sample and group, but any additional columns are also supported and will be included in the final SingleCellExperiment object.

Are additional columns included in the statistical tests as well? Could a quick answer to that question be stated in the vignette, maybe with an explicit statement about the best way to handle multi-factor designs like this? I expect that some users will benefit from the protocol being explicitly laid out. In addition, I see two main packages for running statistical tests, both of which seem like good additions. However, I don't see anything in the vignette about choosing between them. I believe many individuals will come to this as their first look into proteomics, and it would be good, from a completeness standpoint, to have 1-2 sentences guiding them to a choice. Otherwise they will use the default or guess.

Beyond a couple additions to the vignette, I have no further requests.

Addendum: After another quick read-through, I noticed the parameter singleFit again. It seems like this would be most appropriate to set to TRUE when there is reason to believe that there is a true expression difference between combined groups. Do you agree with that? I believe the vignette should offer guidance on when to set that parameter, not just a technical definition of what it does. I will also make a ticket to standardize the defaults for that parameter across functions.

csoneson commented 11 months ago

@AnthonyOfSeattle Thanks for your thorough review and constructive comments! I have made additions to the vignette to address them, summarized in this PR. Below are responses to your comments:

If a user has a multi-factor design, say {male, female}X{control, treated}, is the proper way to handle this data to have a single group column with values chosen from {male_control, male_treated, female_control, female_treated}?

Yes, that would be one way (especially if the treatment effect may be gender-specific). Another option could be to consider gender as the batch effect (if treatment is the main factor of interest, and no interaction is expected).
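To make the combined-group encoding concrete, here is a minimal sketch of such a sample annotation table (sample names and group sizes are illustrative, not from the review; only the column names `sample` and `group` are the ones einprot requires):

```r
# Hypothetical sample annotation for a {male, female} x {control, treated}
# design, encoded as a single combined 'group' column
sampleAnnot <- data.frame(
  sample = paste0("s", 1:8),
  group = rep(c("male_control", "male_treated",
                "female_control", "female_treated"), each = 2)
)
# Pairwise comparisons (e.g. male_treated vs male_control) can then be
# expressed directly in terms of the combined group labels
```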

I see the following quote in the section The sample annotation table: This data.frame must have at least two columns, named sample and group, but any additional columns are also supported and will be included in the final SingleCellExperiment object. Are additional columns being included in statistical test as well? Could a quick answer to that question be stated in the vignette, maybe with an explicit statement about the best way to handle multi-factor designs like this?

No, additional columns are not included in the statistical test, but they are propagated to the final SCE. I have added a sentence about this in the vignette, as well as some tips for multi-factor and more complex designs. For the current version of einprot, we made a conscious choice to make the setup of the statistical testing easy (specifically, to not require the explicit specification of a design matrix and contrasts), while attempting to still cover a reasonable range of use cases.

I see two main packages for doing statistical test, both of which seem like good additions. However, I don't see anything in the vignette about choosing between the options.

I added an entry about this to the FAQ section at the end of the vignette.

Addendum: After another quick read-through, I noticed the parameter singleFit again. It seems like this would be most appropriate to set to TRUE when there is reason to believe that there is a true expression difference between combined groups. Do you agree with that? I believe the vignette should offer guidance on when to set that parameter, not just a technical definition of what it does.

The main advantage of fitting a single model (singleFit = TRUE) is that a larger number of samples are used to estimate parameters, which usually gives more precise estimates. For this reason, this is typically the recommended approach with limma and other inference pipelines. However, it also involves making assumptions about similarities of variances between groups, and if there are large differences, either fitting separate models or using a weighting approach (which is possible within limma) may be more suitable. I have added a discussion about this in the vignette, together with some links to posts where this is discussed (as it's a general question for any data analyzed with limma).
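The degrees-of-freedom point above can be illustrated with a small base-R sketch (simulated data, and plain `lm` rather than einprot's actual limma-based pipeline):

```r
# Simulated abundances for three groups of three samples each
set.seed(1)
abundance <- c(rnorm(3, mean = 10), rnorm(3, mean = 12), rnorm(3, mean = 11))
group <- factor(rep(c("A", "B", "C"), each = 3))

# Analogue of singleFit = TRUE: one model across all samples, so the
# residual variance is estimated using every group
fit_all <- lm(abundance ~ group)

# Analogue of singleFit = FALSE: a separate model per comparison,
# here using only groups A and B
keep <- group %in% c("A", "B")
fit_ab <- lm(abundance[keep] ~ droplevels(group[keep]))

# The pooled fit retains more residual degrees of freedom, which usually
# gives a more precise variance estimate -- provided the equal-variance
# assumption across groups is reasonable
df.residual(fit_all)  # 6
df.residual(fit_ab)   # 4
```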

I will also make a ticket to standardize the defaults for that param across functions.

Thank you - I have standardized the defaults across the different functions (it's now TRUE everywhere).

AnthonyOfSeattle commented 11 months ago

Hi @csoneson, thank you for your thorough answers to my questions. I have read through your PR and think your changes completely address my concerns.

@fboehm, I have completed my checklist and have no further comments for the authors.

fboehm commented 11 months ago

Thank you so much, @AnthonyOfSeattle and @ByrumLab, for the thorough reviews!

@csoneson - the reviewers have recommended the submission for publication. There are a few more steps before we finalize the publication.

fboehm commented 11 months ago

@editorialbot commands

editorialbot commented 11 months ago

Hello @fboehm, here are the things you can ask me to do:


# List all available commands
@editorialbot commands

# Add to this issue's reviewers list
@editorialbot add @username as reviewer

# Remove from this issue's reviewers list
@editorialbot remove @username from reviewers

# Get a list of all editors' GitHub handles
@editorialbot list editors

# Assign a user as the editor of this submission
@editorialbot assign @username as editor

# Remove the editor assigned to this submission
@editorialbot remove editor

# Remind an author, a reviewer or the editor to return to a review after a 
# certain period of time (supported units days and weeks)
@editorialbot remind @reviewer in 2 weeks

# Check the references of the paper for missing DOIs
@editorialbot check references

# Perform checks on the repository
@editorialbot check repository

# Adds a checklist for the reviewer using this command
@editorialbot generate my checklist

# Set a value for version
@editorialbot set v1.0.0 as version

# Set a value for branch
@editorialbot set joss-paper as branch

# Set a value for repository
@editorialbot set https://github.com/organization/repo as repository

# Set a value for the archive DOI
@editorialbot set 10.5281/zenodo.6861996 as archive

# Mention the EiCs for the correct track
@editorialbot ping track-eic

# Generates the pdf paper
@editorialbot generate pdf

# Recommends the submission for acceptance
@editorialbot recommend-accept

# Generates a LaTeX preprint file
@editorialbot generate preprint

# Flag submission with questionable scope
@editorialbot query scope

# Get a link to the complete list of reviewers
@editorialbot list reviewers

# Creates a post-review checklist with editor and authors tasks
@editorialbot create post-review checklist

# Open the review issue
@editorialbot start review
fboehm commented 11 months ago

Post-Review Checklist for Editor and Authors

Additional Author Tasks After Review is Complete

Editor Tasks Prior to Acceptance

fboehm commented 11 months ago

@csoneson - I plan to read the manuscript and offer suggestions by the end of tomorrow. I'll comment in this thread once I've completed the proofreading.

csoneson commented 11 months ago

Thanks @fboehm - in the meantime I created the release and archived it on Zenodo.

fboehm commented 11 months ago

Thank you, @csoneson ! Something came up for me, so it will be a few days before I can proofread the paper. I apologize for the delay. I'll comment here once I've proofread it.

csoneson commented 11 months ago

Thanks @fboehm, no worries!

fboehm commented 11 months ago

@editorialbot generate pdf

editorialbot commented 11 months ago

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

fboehm commented 11 months ago

The text of the paper.md is well written and easy to read. I have no suggestions for improvements, and I found no typos.

The Peng et al. reference has "High-Performance" with capital H and capital P. Is this as it should be? Or should these be lower-case h and p?

The Xie et al. reference has "R markdown" - should this be one word?

fboehm commented 11 months ago

@csoneson - I just have a couple quick questions about the references, which are listed in the comment above. Once you answer them, I'll continue working through the checklist. Thanks again!

csoneson commented 11 months ago

@editorialbot generate pdf

editorialbot commented 11 months ago

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

csoneson commented 11 months ago

Thanks @fboehm!

Peng et al reference has "High-Performance" with capital H and capital P. Is this as it should be? Or should these be lower case h and p?

You're right - I have made them lower-case.

Xie et al reference has "R markdown" - should this be one word?

I believe it should be two words (based on https://bookdown.org/yihui/rmarkdown/). However, "Markdown" should also be capitalized (again, based on the same webpage).

I have fixed these two points πŸ™‚

fboehm commented 11 months ago

Thanks so much, @csoneson!

fboehm commented 11 months ago

@editorialbot set 10.5281/zenodo.8298657 as archive

editorialbot commented 11 months ago

Done! archive is now 10.5281/zenodo.8298657

fboehm commented 11 months ago

@editorialbot set v0.7.7 as version

editorialbot commented 11 months ago

Done! version is now v0.7.7

fboehm commented 11 months ago

@editorialbot generate pdf

editorialbot commented 11 months ago

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

fboehm commented 11 months ago

@editorialbot check references

editorialbot commented 11 months ago
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1038/nmeth.4256 is OK
- 10.1101/2023.06.26.546625 is OK
- 10.1093/bioadv/vbab041 is OK
- 10.1093/bioinformatics/btaa620 is OK
- 10.1021/acs.jproteome.0c00398 is OK
- 10.1021/acs.jproteome.9b00496 is OK
- 10.1371/journal.pcbi.1009148 is OK
- 10.1093/bioinformatics/btu305 is OK
- 10.1074/mcp.M113.031591 is OK
- 10.1074/mcp.M114.041012 is OK
- 10.1016/j.cell.2015.09.053 is OK
- 10.1083/jcb.200911091 is OK
- 10.1038/nprot.2009.36 is OK
- 10.1074/mcp.RA120.002105 is OK
- 10.3390/proteomes9010015 is OK
- 10.1038/nbt.1511 is OK
- 10.1186/s12864-022-09058-7 is OK
- 10.1101/416511 is OK
- 10.1038/nmeth.3901 is OK
- 10.1016/j.jprot.2020.103669 is OK
- 10.12688/f1000research.14966.1 is OK
- 10.1038/75556 is OK
- 10.1093/genetics/iyad031 is OK
- 10.1093/nar/gky973 is OK
- 10.1093/genetics/iyab222 is OK
- 10.1093/nar/gkn1005 is OK
- 10.1038/nmeth.3252 is OK
- 10.1038/s41592-019-0654-x is OK
- 10.1016/j.molcel.2023.06.001 is OK
- 10.1093/nar/gkac610 is OK
- 10.1021/pr300273g is OK
- 10.1038/nprot.2017.147 is OK
- 10.1021/acs.jproteome.2c00441 is OK
- 10.1016/j.cell.2021.01.004 is OK
- 10.1021/acs.jproteome.2c00390 is OK
- 10.1038/s41586-018-0153-8 is OK
- 10.1371/journal.pcbi.1010752 is OK
- 10.1021/acs.jproteome.2c00812 is OK

MISSING DOIs

- None

INVALID DOIs

- None
fboehm commented 11 months ago

@editorialbot recommend-accept

editorialbot commented 11 months ago
Attempting dry run of processing paper acceptance...
editorialbot commented 11 months ago
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1038/nmeth.4256 is OK
- 10.1101/2023.06.26.546625 is OK
- 10.1093/bioadv/vbab041 is OK
- 10.1093/bioinformatics/btaa620 is OK
- 10.1021/acs.jproteome.0c00398 is OK
- 10.1021/acs.jproteome.9b00496 is OK
- 10.1371/journal.pcbi.1009148 is OK
- 10.1093/bioinformatics/btu305 is OK
- 10.1074/mcp.M113.031591 is OK
- 10.1074/mcp.M114.041012 is OK
- 10.1016/j.cell.2015.09.053 is OK
- 10.1083/jcb.200911091 is OK
- 10.1038/nprot.2009.36 is OK
- 10.1074/mcp.RA120.002105 is OK
- 10.3390/proteomes9010015 is OK
- 10.1038/nbt.1511 is OK
- 10.1186/s12864-022-09058-7 is OK
- 10.1101/416511 is OK
- 10.1038/nmeth.3901 is OK
- 10.1016/j.jprot.2020.103669 is OK
- 10.12688/f1000research.14966.1 is OK
- 10.1038/75556 is OK
- 10.1093/genetics/iyad031 is OK
- 10.1093/nar/gky973 is OK
- 10.1093/genetics/iyab222 is OK
- 10.1093/nar/gkn1005 is OK
- 10.1038/nmeth.3252 is OK
- 10.1038/s41592-019-0654-x is OK
- 10.1016/j.molcel.2023.06.001 is OK
- 10.1093/nar/gkac610 is OK
- 10.1021/pr300273g is OK
- 10.1038/nprot.2017.147 is OK
- 10.1021/acs.jproteome.2c00441 is OK
- 10.1016/j.cell.2021.01.004 is OK
- 10.1021/acs.jproteome.2c00390 is OK
- 10.1038/s41586-018-0153-8 is OK
- 10.1371/journal.pcbi.1010752 is OK
- 10.1021/acs.jproteome.2c00812 is OK

MISSING DOIs

- None

INVALID DOIs

- None
editorialbot commented 11 months ago

:wave: @openjournals/bcm-eics, this paper is ready to be accepted and published.

Check final proof :point_right::page_facing_up: Download article

If the paper PDF and the deposit XML files look good in https://github.com/openjournals/joss-papers/pull/4537, then you can now move forward with accepting the submission by compiling again with the command @editorialbot accept

Kevin-Mattheus-Moerman commented 10 months ago

@csoneson I am the AEiC on this track and am here to help process the last steps. I have checked this review, the paper, your repository, and also the archive. All seems in order, so I will now proceed to accept this work in JOSS.

Kevin-Mattheus-Moerman commented 10 months ago

@editorialbot accept

editorialbot commented 10 months ago
Doing it live! Attempting automated processing of paper acceptance...
editorialbot commented 10 months ago

Ensure proper citation by uploading a plain text CITATION.cff file to the default branch of your repository.

If using GitHub, a Cite this repository menu will appear in the About section, containing both APA and BibTeX formats. When exported to Zotero using a browser plugin, Zotero will automatically create an entry using the information contained in the .cff file.

You can copy the contents for your CITATION.cff file here:

CITATION.cff

```yaml
cff-version: "1.2.0"
authors:
  - family-names: Soneson
    given-names: Charlotte
    orcid: "https://orcid.org/0000-0003-3833-2169"
  - family-names: Iesmantavicius
    given-names: Vytautas
    orcid: "https://orcid.org/0000-0002-2512-9957"
  - family-names: Hess
    given-names: Daniel
    orcid: "https://orcid.org/0000-0002-1642-5404"
  - family-names: Stadler
    given-names: Michael B
    orcid: "https://orcid.org/0000-0002-2269-4934"
  - family-names: Seebacher
    given-names: Jan
    orcid: "https://orcid.org/0000-0002-7858-2720"
contact:
  - family-names: Soneson
    given-names: Charlotte
    orcid: "https://orcid.org/0000-0003-3833-2169"
doi: 10.5281/zenodo.8298657
message: If you use this software, please cite our article in the Journal of Open Source Software.
preferred-citation:
  authors:
    - family-names: Soneson
      given-names: Charlotte
      orcid: "https://orcid.org/0000-0003-3833-2169"
    - family-names: Iesmantavicius
      given-names: Vytautas
      orcid: "https://orcid.org/0000-0002-2512-9957"
    - family-names: Hess
      given-names: Daniel
      orcid: "https://orcid.org/0000-0002-1642-5404"
    - family-names: Stadler
      given-names: Michael B
      orcid: "https://orcid.org/0000-0002-2269-4934"
    - family-names: Seebacher
      given-names: Jan
      orcid: "https://orcid.org/0000-0002-7858-2720"
  date-published: 2023-09-11
  doi: 10.21105/joss.05750
  issn: 2475-9066
  issue: 89
  journal: Journal of Open Source Software
  publisher:
    name: Open Journals
  start: 5750
  title: "einprot: flexible, easy-to-use, reproducible workflows for statistical analysis of quantitative proteomics data"
  type: article
  url: "https://joss.theoj.org/papers/10.21105/joss.05750"
  volume: 8
title: "einprot: flexible, easy-to-use, reproducible workflows for statistical analysis of quantitative proteomics data"
```

If the repository is not hosted on GitHub, a .cff file can still be uploaded to set your preferred citation. Users will be able to manually copy and paste the citation.

Find more information on .cff files here and here.

editorialbot commented 10 months ago

🐘🐘🐘 πŸ‘‰ Toot for this paper πŸ‘ˆ 🐘🐘🐘

editorialbot commented 10 months ago

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited :point_right: https://github.com/openjournals/joss-papers/pull/4545
  2. Wait a couple of minutes, then verify that the paper DOI resolves https://doi.org/10.21105/joss.05750
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! πŸŽ‰πŸŒˆπŸ¦„πŸ’ƒπŸ‘»πŸ€˜

Any issues? Notify your editorial technical team...

Kevin-Mattheus-Moerman commented 10 months ago

@csoneson congratulations on this JOSS publication!

Thanks for editing @fboehm !

And a special thanks to the reviewers: @AnthonyOfSeattle, @ByrumLab !!

editorialbot commented 10 months ago

:tada::tada::tada: Congratulations on your paper acceptance! :tada::tada::tada:

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.05750/status.svg)](https://doi.org/10.21105/joss.05750)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.05750">
  <img src="https://joss.theoj.org/papers/10.21105/joss.05750/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.05750/status.svg
   :target: https://doi.org/10.21105/joss.05750

This is how it will look in your documentation:

DOI

We need your help!

The Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following: