editorialbot opened this issue 3 months ago
Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.
For a list of things I can do to help you, just type:
@editorialbot commands
For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:
@editorialbot generate pdf
Software report:
github.com/AlDanial/cloc v 1.90 T=0.36 s (495.3 files/s, 437204.4 lines/s)
---------------------------------------------------------------
Language                 files      blank    comment       code
---------------------------------------------------------------
CSV                          2          0          0     100002
JavaScript                  13       4737       4760      17564
HTML                        17       1322         51      12167
Python                     111       2662       1421       5937
Markdown                    11        699          0       1563
CSS                          5        304         69       1291
TeX                          1         21          0        177
YAML                         5         19         27        106
reStructuredText             6         94        183         97
DOS Batch                    1          8          1         26
TOML                         1          5          1         24
make                         1          4          7          9
INI                          1          0          0          2
Bourne Shell                 1          0          0          1
---------------------------------------------------------------
SUM:                       176       9875       6520     138966
---------------------------------------------------------------
Commit count by author:
294 William Song
Paper file info:
📄 Wordcount for paper.md is 1873
✅ The paper includes a `Statement of need` section
License info:
✅ License found: MIT License (Valid open source OSI approved license)
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):
OK DOIs
- 10.1007/978-3-319-31204-0_9 is OK
- 10.1093/gji/ggad446 is OK
- 10.1088/1742-6596/1719/1/012102 is OK
- 10.1109/ICEC.1997.592270 is OK
- 10.1007/s00500-016-2474-6 is OK
- 10.21105/joss.01701 is OK
MISSING DOIs
- No DOI given, and none found for title: Adaptation in Natural and Artificial Systems
- No DOI given, and none found for title: DEAP: Evolutionary Algorithms Made Easy
- 10.1093/bioinformatics/btz470 may be a valid DOI for title: Scaling tree-based automated machine learning to b...
- No DOI given, and none found for title: typing — Support for type hints
- No DOI given, and none found for title: Numpy - The fundamental package for scientific com...
- No DOI given, and none found for title: A review on genetic algorithm: past, present, and ...
- No DOI given, and none found for title: Scikit-learn: Machine Learning in Python
- No DOI given, and none found for title: API design for machine learning software: experien...
- No DOI given, and none found for title: GAMA: A General Automated Machine Learning Assista...
- No DOI given, and none found for title: Interpretable Machine Learning for Science with Py...
- No DOI given, and none found for title: Genetic Algorithm: Reviews, Implementations, and A...
- No DOI given, and none found for title: Evolutionary Optimization Algorithms: Biologically...
- No DOI given, and none found for title: neat-python
- No DOI given, and none found for title: Algebraic structure
INVALID DOIs
- None
Some comments on the manuscript. I plan to follow up shortly on the software content.
- Remove “These authors contributed equally” for one author.
- Add a space in “programming(OOP)”.
- Using `S[ ]` alongside formal mathematical notation is a bit confusing.
- The twin y-axis on the right isn’t labeled.
- The caption does not need to mention that plotting algorithms are not included in the library.
Because visualization isn’t a core feature of the library, consider rewriting the visualization section to exclude the plotting and focus on history extraction. The majority of the example code in this section is generic matplotlib boilerplate.
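For context, the patterns under discussion can be mocked up in a few lines of plain Python: an overridden `__floordiv__` on a metaclass fixes a container's default size, each individual defines a `_fitness` method, and evolution returns a plain history record that the user can plot with whatever tools they like. This is an illustrative sketch only (all class and method names here are hypothetical), not pyrimidine's actual implementation:

```python
import random

class MetaContainer(type):
    """Mock metaclass: `Container // n` fixes the default element count."""
    def __floordiv__(cls, n):
        return type(f"{cls.__name__}x{n}", (cls,), {"default_size": n})

class BinaryIndividual:
    def __init__(self, genes):
        self.genes = genes

    @classmethod
    def random(cls, length=8):
        return cls([random.randint(0, 1) for _ in range(length)])

    def _fitness(self):
        # Toy OneMax fitness: count the ones.
        return sum(self.genes)

class Population(metaclass=MetaContainer):
    element_class = BinaryIndividual
    default_size = 0

    def __init__(self, individuals):
        self.individuals = individuals

    @classmethod
    def random(cls):
        return cls([cls.element_class.random() for _ in range(cls.default_size)])

    def evolve(self, n_iter=10):
        # Record per-generation statistics; plotting this history afterwards
        # is ordinary matplotlib boilerplate and needn't live in the library.
        history = []
        for _ in range(n_iter):
            fit = [ind._fitness() for ind in self.individuals]
            history.append({"mean": sum(fit) / len(fit), "best": max(fit)})
            # (selection / crossover / mutation omitted in this sketch)
        return history

Pop20 = Population // 20          # a population holding 20 individuals
pop = Pop20.random()
hist = pop.evolve(n_iter=5)
print(len(pop.individuals), len(hist))  # → 20 5
```

The point of the sketch is that the history (a list of per-generation statistics) is the library-worthy artifact; turning it into a figure is left to the user.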
In the closing sentence, `_fitness` should be mentioned in the accompanying text. The override of the `//` operator to denote size should likewise be explicitly explained.

Hi @Freakwill,
@mmore500 has made some comments/requests about your paper ~2 weeks ago (see https://github.com/openjournals/joss-reviews/issues/6575#issuecomment-2071073010)
Would it help if I transcribed these bullet points as checklists and registered them as issues in your project repository?
Ran into an install issue, will resume my review once that is unblocked! (https://github.com/Freakwill/pyrimidine/issues/4)
For the record, I have e-mailed the project author to make sure that the communication channels (this JOSS issue as well as the project issue) with him are open, and asked him to publicly acknowledge the reviewer messages (even if the issues can't be addressed right away).
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
> Some comments on the manuscript. I plan to follow up shortly on the software content.
> - there are several broken references (search the manuscript for “??”)
I use the following cross-references according to the JOSS docs. But why does it result in `??`?
A concise comparison between `pyrimidine` and several popular frameworks provided in \autoref{frameworks}, such as ....
: Comparison of the popular genetic algorithm frameworks. []{label="frameworks"}
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
@editorialbot commands
Hello @Freakwill, here are the things you can ask me to do:
# List all available commands
@editorialbot commands
# Get a list of all editors' GitHub handles
@editorialbot list editors
# Adds a checklist for the reviewer using this command
@editorialbot generate my checklist
# Set a value for branch
@editorialbot set joss-paper as branch
# Run checks and provide information on the repository and the paper file
@editorialbot check repository
# Check the references of the paper for missing DOIs
@editorialbot check references
# Generates the pdf paper
@editorialbot generate pdf
# Generates a LaTeX preprint file
@editorialbot generate preprint
# Get a link to the complete list of reviewers
@editorialbot list reviewers
@editorialbot check repository
Software report:
github.com/AlDanial/cloc v 1.90 T=0.42 s (500.0 files/s, 411662.9 lines/s)
---------------------------------------------------------------
Language                 files      blank    comment       code
---------------------------------------------------------------
CSV                          2          0          0     100002
HTML                        34       2670        102      24377
JavaScript                  18       4853       4929      18267
Python                     115       2788       1490       6169
CSS                          7        554        114       2269
Markdown                    11        740          0       1611
reStructuredText            10        171        328        185
TeX                          1         18          0        173
YAML                         5         19         27        106
DOS Batch                    2         16          2         52
TOML                         1          5          1         24
make                         1          4          7          9
INI                          1          0          0          2
Bourne Shell                 1          0          0          1
---------------------------------------------------------------
SUM:                       209      11838       7000     153247
---------------------------------------------------------------
Commit count by author:
315 William Song
1 Matthew Andres Moreno
Paper file info:
📄 Wordcount for paper.md is 2091
✅ The paper includes a `Statement of need` section
License info:
✅ License found: MIT License (Valid open source OSI approved license)
@editorialbot generate pdf
:warning: An error happened when generating the pdf.
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
> I use the following cross-references according to the JOSS docs. But why does it result in `??`?
> A concise comparison between `pyrimidine` and several popular frameworks provided in \autoref{frameworks}, such as .... : Comparison of the popular genetic algorithm frameworks. []{label="frameworks"}
I see that you apparently still have some weird formatting in this line, I'll have a look.
AFAICT the `\autoref{eq:container}` issue should be solved if you use the syntax
\begin{equation} \label{eq:container}
s = \{a:A\}: S \quad \text{or} \quad s:S[A]
\end{equation}
`\autoref{history}` already works as expected :+1:
And with respect to `\autoref{frameworks}`, the caption

: Comparison of the popular genetic algorithm frameworks. \label{frameworks}

should appear after the table, not before. These are the three uses of `\autoref` that I can find in the paper.
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
Submitting author: @Freakwill (Congwei Song)
Repository: https://github.com/Freakwill/pyrimidine
Branch with paper.md (empty if default branch):
Version: v1.5.4
Editor: @boisgera
Reviewers: @mmore500, @sjvrijn
Archive: Pending
Reviewers and authors:
Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)
Reviewer instructions & questions
@mmore500 & @sjvrijn, your review will be checklist based. Each of you will have a separate checklist that you should update when carrying out your review. First of all, you need to run this command in a separate comment to create the checklist:
@editorialbot generate my checklist
The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @boisgera know.
✨ Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest ✨
Checklists
📝 Checklist for @mmore500
📝 Checklist for @sjvrijn