Closed editorialbot closed 1 year ago
Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.
For a list of things I can do to help you, just type:
@editorialbot commands
For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:
@editorialbot generate pdf
Software report:
github.com/AlDanial/cloc v 1.88 T=0.09 s (1302.5 files/s, 83740.1 lines/s)
--------------------------------------------------------------------------------
Language files blank comment code
--------------------------------------------------------------------------------
Java 102 800 1336 4257
SVG 2 1 52 284
Maven 1 8 6 193
TeX 1 0 0 166
Markdown 3 57 0 161
Bourne Again Shell 1 0 0 2
Bourne Shell 1 0 0 2
JSON 2 0 0 2
make 1 0 0 2
--------------------------------------------------------------------------------
SUM: 114 866 1394 5069
--------------------------------------------------------------------------------
gitinspector failed to run statistical information for the repository
Wordcount for paper.md is 1388
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):
OK DOIs
- 10.1007/978-3-319-66158-2_34 is OK
- 10.1137/1.9781611975673.15 is OK
- 10.24963/ijcai.2019/149 is OK
- 10.1016/j.artint.2011.05.002 is OK
- 10.24963/ijcai.2022/261 is OK
- 10.21105/joss.04708 is OK
- 10.1007/978-3-030-67658-2_3 is OK
- 10.1093/bioinformatics/btn490 is OK
- 10.3390/e18050164 is OK
- 10.1016/j.artint.2015.04.003 is OK
- 10.1007/978-3-319-44953-1_22 is OK
- 10.1002/widm.1207 is OK
- 10.14257/ijhit.2013.6.6.30 is OK
MISSING DOIs
- None
INVALID DOIs
- None
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
@skadio and @jgFages - Thanks for agreeing to review this submission. This is the review thread for the paper. All of our communications will happen here from now on.
As you can see above, you each should use the command @editorialbot generate my checklist
to create your review checklist. @editorialbot commands need to be the first thing in a new comment.
As you go over the submission, please check any items that you feel have been satisfied. There are also links to the JOSS reviewer guidelines.
The JOSS review is different from most other journals. Our goal is to work with the authors to help them meet our criteria instead of merely passing judgment on the submission. As such, reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention openjournals/joss-reviews#5654
so that a link is created to this thread (and I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread. In my experience, it is better to post comments/questions/suggestions as you come across them instead of waiting until you've reviewed the entire package.
We aim for reviews to be completed within about 2-4 weeks. Please let me know if either of you require some more time. We can also use editorialbot (our bot) to set automatic reminders if you know you'll be away for a known period of time.
Please feel free to ping me (@danielskatz) if you have any questions/concerns.
reminder to self: @skadio said when accepting this review "I might take some time (~8 weeks)."
Data Mining with Constraint Programming is indeed an interesting research topic. The paper is well written and well illustrated. The literature review is good and the tool is well motivated.
I have, however, a few remarks regarding both the paper and the software:
Minor: data mining to Constraint Programming -> Data Mining to Constraint Programming
@chaver congratulations on your IJCAI'22 paper and turning it into a library.
This submission is clearly relevant to JOSS and almost there in terms of publication, with a few suggestions that need to be accounted for:
typo in summary: "task ," (remove the extra space)
The README needs a major update. Luckily, you have all that information in the Wiki and the paper.
The current README is not in sync with the Wiki.
A user looking at the README would not understand that this is a "library", which was my main confusion initially. It almost seems like a standalone command-line tool, which is not the case; it is indeed a library. Worse, it first looked to me like a repo for reproducing the experiments from the IJCAI'22 paper.
By comparison, the Wiki does a much better job: https://gitlab.com/chaver/data-mining/-/wikis/home
My recommendation is to create a lean subset of the Wiki to replace the current README. You can cite the IJCAI paper at the top but push the commentary about reproducing experiments to the bottom (or better, leave it in a section of the detailed Wiki).
My major concern about the paper is that I am not sure an outsider would immediately understand what these constraints are. Currently, in the README, the constraints are listed under "Installation", which is not right. My recommendation: create a specific section in the README for this and mimic what's in the Wiki. I am almost tempted to say use the Wiki as the README.
The paper does an adequate job of describing the tasks (e.g., what skypattern mining is), but the README and Wiki don't do that.
My overall comment is:
The ultimate goal is that, once this is published in JOSS, an informed outsider (not a pattern mining expert) can leverage your library. You have all this material across the paper, README, and Wiki, but you need to refresh the README and Wiki with a lens that caters to an outsider. The example in the paper is great, so I would strongly suggest adding a "Quick Start Example" to the very top of the README.
state-of-the-field + references: please also mention the Seq2Pat library, which is immediately relevant for declarative (sequential) pattern mining, and cite its relevant references from AI Mag'23 and AAAI'22. You might also consider citing the dichotomic pattern mining framework from Frontiers'22. These are listed here: https://github.com/fidelity/seq2pat#citation
I would also argue for moving the paper folder "outside" of this library. You can keep your reproducibility repo standalone and use your own library from there to show/reproduce the results. Then you can link to that paper repo from the references here. As a "user" of the "library", I am not immediately interested in paper-reproduction scripts. Note also that, down the road, your reproducibility scripts need to stay identical, whereas the library will evolve.
Hope this helps you improve your submission.
Serdar
Happy to take another quick look once you update the library.
@editorialbot commands
Hello @ChaVer, here are the things you can ask me to do:
# List all available commands
@editorialbot commands
# Get a list of all editors' GitHub handles
@editorialbot list editors
# Check the references of the paper for missing DOIs
@editorialbot check references
# Perform checks on the repository
@editorialbot check repository
# Adds a checklist for the reviewer using this command
@editorialbot generate my checklist
# Set a value for branch
@editorialbot set joss-paper as branch
# Generates the pdf paper
@editorialbot generate pdf
# Generates a LaTeX preprint file
@editorialbot generate preprint
# Get a link to the complete list of reviewers
@editorialbot list reviewers
@editorialbot check references
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):
OK DOIs
- 10.1007/978-3-319-66158-2_34 is OK
- 10.1137/1.9781611975673.15 is OK
- 10.24963/ijcai.2019/149 is OK
- 10.1016/j.artint.2011.05.002 is OK
- 10.24963/ijcai.2022/261 is OK
- 10.21105/joss.04708 is OK
- 10.1007/978-3-030-67658-2_3 is OK
- 10.1093/bioinformatics/btn490 is OK
- 10.3390/e18050164 is OK
- 10.1016/j.artint.2015.04.003 is OK
- 10.1007/978-3-319-44953-1_22 is OK
- 10.1002/widm.1207 is OK
- 10.14257/ijhit.2013.6.6.30 is OK
MISSING DOIs
- 10.1609/aaai.v36i11.21542 may be a valid DOI for title: Seq2Pat: Sequence-to-Pattern Generation for Constraint-based Sequential Pattern Mining
- 10.3389/frai.2022.868085 may be a valid DOI for title: Dichotomic Pattern Mining Integrated with Constraint Reasoning for Digital Behaviour Analyses
INVALID DOIs
- https://doi.org/10.1002/aaai.12081 is INVALID because of 'https://doi.org/' prefix
@editorialbot check references
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):
OK DOIs
- 10.1007/978-3-319-66158-2_34 is OK
- 10.1137/1.9781611975673.15 is OK
- 10.24963/ijcai.2019/149 is OK
- 10.1016/j.artint.2011.05.002 is OK
- 10.24963/ijcai.2022/261 is OK
- 10.21105/joss.04708 is OK
- 10.1007/978-3-030-67658-2_3 is OK
- 10.1093/bioinformatics/btn490 is OK
- 10.3390/e18050164 is OK
- 10.1016/j.artint.2015.04.003 is OK
- 10.1007/978-3-319-44953-1_22 is OK
- 10.1002/widm.1207 is OK
- 10.14257/ijhit.2013.6.6.30 is OK
- 10.1002/aaai.12081 is OK
- 10.1609/aaai.v36i11.21542 is OK
- 10.3389/frai.2022.868085 is OK
MISSING DOIs
- None
INVALID DOIs
- None
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
Dear reviewers,
Thank you for your valuable comments and suggestions for improving both the paper and the library.
@jgFages :
@skadio :
@ChaVer thank you for the updates! The README is much improved and now better reflects the power of the tool to an interested user. Same for the manuscript.
My only final comment would be the order of sections in the README. Currently, it starts with the Architecture, Installation, then the Illustrative Example, and Documents. Why not bring the Example to the top? In my mind, you have the intro (as a reader I am thinking: what does this tool do?), the example (show me how it looks), installation (the example is cool; how can I use it?), and documentation (where do I go with questions and more details). A small change, but I think it would help you.
@danielskatz this is good to go from my side.
@jgFages - I look forward to your thoughts when you get a chance, and thanks to @skadio for yours
@ChaVer Thank you for your answer and for the good work.
There is still one propagator to rename: Overlap -> PropOverlap.
Also, the library is named Choco-Mining, but the project repo and artifact are named data-mining, which can be confusing. I would suggest renaming them choco-mining (maybe after the discussion with @cprudhom).
@skadio I moved the illustrative example to the top of the README as suggested. Thanks for your help!
@jgFages:
👋 @skadio and @jgFages - Any comments on this? I see that both of you have complete checklists, which might mean that you are satisfied with this to be accepted, but I would appreciate if you could confirm this, or let me know what else you think is needed.
@ChaVer - if you are changing the repo name, please tell me here
Regarding the review process, it is fine on my side. I just don't know whether we should wait for the decision regarding the repo name / artifact name (I guess we should).
Additional remark for @ChaVer: regarding the Overlap, I get your point, but I have a few suggestions:
@danielskatz I confirm that the submission is good to go from my end!
I agree with @jgFages that we should wait for the decision on the repo/artifact name before completing the review process.
Regarding the overlap constraint, I think your idea is very relevant, so I have updated the code accordingly. Now the user can simply invoke the constraint with the following block of code:
Overlap overlap = ConstraintFactory.overlap(database, x, jmax, theta);
overlap.post();
The monitor is automatically plugged into the solver when the Overlap object is created.
The user can retrieve the history of itemsets after solving the model with this line of code:
List<int[]> itemsets = overlap.getItemsetsHistory();
ok, I'll keep waiting until there's a decision on the possible name change
@ChaVer - any news on the name?
@danielskatz I got a positive answer from @cprudhom, so I renamed the repository and the project artifact to choco-mining. The new repo URL is https://gitlab.com/chaver/choco-mining.
@editorialbot set https://gitlab.com/chaver/choco-mining as repository
Done! repository is now https://gitlab.com/chaver/choco-mining
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
@ChaVer - I think this change has been made. Does anything in the paper need to change?
@danielskatz the paper does not need to change. I think we can go ahead, thank you!
@ChaVer - At this point could you:
I can then move forward with proofreading and then recommending acceptance of the submission.
@danielskatz :
@editorialbot set v1.0.2 as version
Done! version is now v1.0.2
@editorialbot set 10.5281/zenodo.8263971 as archive
That doesn't look like a valid DOI value
@ChaVer - https://doi.org/10.5281/zenodo.8263971 does seem to work. Can you share the URL to the archive so that I can take a look at it that way for now?
@danielskatz the url of the archive is https://zenodo.org/record/8263971
I guess there's something slow in zenodo/datacite today - I'll check again in a few hours
@editorialbot set 10.5281/zenodo.8263971 as archive
Done! archive is now 10.5281/zenodo.8263971
Submitting author: @ChaVer (Charles Vernerey) Repository: https://gitlab.com/chaver/choco-mining Branch with paper.md (empty if default branch): Version: v1.0.2 Editor: @danielskatz Reviewers: @skadio, @jgFages Archive: 10.5281/zenodo.8263971
Status
Status badge code:
Reviewers and authors:
Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)
Reviewer instructions & questions
@skadio & @jgFages, your review will be checklist-based. Each of you will have a separate checklist that you should update when carrying out your review. First, you need to run this command in a separate comment to create the checklist:
The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. For any questions/concerns, please let @danielskatz know.
✨ Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest ✨
Checklists
📝 Checklist for @jgFages
📝 Checklist for @skadio