Closed whedon closed 3 years ago
Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @stsievert, @torressa it looks like you're currently assigned to review this paper :tada:.
:warning: JOSS reduced service mode :warning:
Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.
:star: Important :star:
If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that with GitHub's default behaviour you will receive notifications (emails) for all reviews 😿
To fix this do the following two things:
For a list of things I can do to help you, just type:
@whedon commands
For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:
@whedon generate pdf
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):
OK DOIs
- 10.1007/0-306-48056-5_11 is OK
- 10.1109/TEVC.2007.892759 is OK
- 10.1016/j.cor.2006.02.008 is OK
- 10.1109/TEVC.2013.2239648 is OK
- 10.1007/s00158-004-0465-1 is OK
- 10.1109/ICMLA.2007.35 is OK
- 10.3390/rs10071117 is OK
MISSING DOIs
- None
INVALID DOIs
- None
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
👋🏼 @jbuisine @stsievert @torressa this is the review thread for the paper. All of our communications will happen here from now on.
Both reviewers have checklists at the top of this thread with the JOSS requirements. As you go over the submission, please check any items that you feel have been satisfied. There are also links to the JOSS reviewer guidelines.
The JOSS review is different from most other journals. Our goal is to work with the authors to help them meet our criteria instead of merely passing judgment on the submission. As such, the reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention openjournals/joss-reviews#REVIEW_NUMBER so that a link is created to this thread (and I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread. In my experience, it is better to post comments/questions/suggestions as you come across them instead of waiting until you've reviewed the entire package.
We aim for reviews to be completed within about 2-4 weeks. Please let me know if any of you require some more time (@torressa has already done so). We can also use Whedon (our bot) to set automatic reminders if you know you'll be away for a known period of time.
Please feel free to ping me (@melissawm) if you have any questions/concerns.
:wave: @stsievert, please update us on how your review is going.
:wave: @torressa, please update us on how your review is going.
I'll try to have a review in by this weekend.
@whedon generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
@whedon commands
Here are some things you can ask me to do:
# List Whedon's capabilities
@whedon commands
# List of editor GitHub usernames
@whedon list editors
# List of reviewers together with programming language preferences and domain expertise
@whedon list reviewers
EDITORIAL TASKS
# Compile the paper
@whedon generate pdf
# Compile the paper from alternative branch
@whedon generate pdf from branch custom-branch-name
# Ask Whedon to check the references for missing DOIs
@whedon check references
# Ask Whedon to check repository statistics for the submitted software
@whedon check repository
@melissawm I have a review waiting. It looks like I didn't accept the invitation to collaborate on this repo in time, so I can't edit https://github.com/openjournals/joss-reviews/issues/2812#issue-734575770. Could you resend that invitation?
Sure, no problem!
@whedon re-invite @stsievert as reviewer
OK, the reviewer has been re-invited.
@stsievert please accept the invite by clicking this link: https://github.com/openjournals/joss-reviews/invitations
My initial review: I do not believe this software project meets the JOSS review requirements. I welcome questions and comments from @jbuisine to provide clarification or to correct me on the points below.
The most significant comments/questions are below. I have also edited https://github.com/openjournals/joss-reviews/issues/2812#issue-734575770 to check various boxes* and add some notes on specific items.
The documentation has some specific problems: it's very general and not detailed, and is missing a good API description. For example, the documentation for `IteratedLocalSearch` is basically "[this class is] used to avoid local optima and increase EvE (Exploration vs Exploitation) compromise". What's "EvE", what effect does it have, and how do the various parameters balance it? What do `C` and `exp_rate` do in `UCBPolicy`? Does increasing `C` increase or decrease the importance of exploration? How does changing the value of `exp_rate` change the exploration/exploitation tradeoff?

\* A checked box (`[x]`) means "satisfied", an unchecked box (`[ ]`) means "not reviewed", and a crossed box means "not satisfied". A partially satisfied requirement is indicated with `[.]`.
Hi @jbuisine, thanks for the submission! Here's my review; I welcome responses and questions on any of the following points. Also, I may have more comments in the future.
Documentation: bad.
Testing: bad. Unit tests are missing (there is only one test, in the setup.py file)
LICENSE not completed.
Possibly not a substantial scholarly effort
Software paper:
- `macop/evaluators`: contains a single example and not an implementation.
- `macop/operators`: triple nested inheritance makes it hard to determine where functions are actually being implemented (e.g. `UCBPolicy` inherits from `Policy`, which inherits from `Operator`). For such simple functionality, e.g. defining the method `apply`, this seems a bit overkill. Additionally, inheritance is not used, e.g. `macop/operators/policies/UCBPolicy.py` could be a standalone class.
- `macop/solutions/`: same as above, inheritance doesn't do much as functions are overloaded and arguments are not passed to the parent. The purpose of these functions is really not clear to me.
- Remove the leading `_` from all input parameters; this is common practice for private attributes.
- `macop/callbacks`: they are technically callbacks, but they seem more like a custom logger.
- `macop/` should contain module imports.

Thank you both @stsievert and @torressa for the considerations. @jbuisine do you think this is something you can work on with the reviewers' comments? Please let me know in case any of you need clarifications or more information.
Thanks to @stsievert and @torressa for your remarks. @melissawm Yes, this is something I can work on. I will take each remark into consideration and update the Python package accordingly (documentation, examples, structural problems...). I will also take the time to answer each of your questions about the package during the week.
Hello, all! Any updates, @jbuisine ?
Hello @melissawm, I will work on it very soon! Sorry for the delay of the updates
Hello,
I wanted to inform you that I have been working on updating the package and all the answers for some time. Everything will be available in the next 2 days. I am really sorry for the delay.
Hello @melissawm, @stsievert, @torressa
First of all, I would like to apologise for the delay in responding to your comments. Thank you for your very good remarks; I have modified the paper, the package, and the documentation accordingly. Please find below my comments in relation to your various questions.
The answers are grouped per reviewer, in two sections. Each reviewer remark is quoted with [RV] and answered with [AA], as follows:
[RV] The tests are not sufficient. I only found a single test in setup.py, and not a single doctest as claimed. In this one test, there are many untested classes.
[AA] Updates have been made following this comment, using doctest in order to better cover the code.
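For readers unfamiliar with the doctest approach mentioned here, a minimal sketch looks like the following (a hypothetical knapsack-style evaluator, not Macop's actual API):

```python
def fitness(data, worths):
    """Knapsack-style evaluator: sum the worths of the selected items.

    The example below is executed automatically by ``python -m doctest``:

    >>> fitness([1, 0, 1], [5, 3, 2])
    7
    """
    return sum(w for bit, w in zip(data, worths) if bit)

if __name__ == "__main__":
    import doctest
    doctest.testmod()  # silent when all docstring examples pass
```

The appeal is that every docstring example doubles as a regression test, which is presumably what is meant by "better cover the code".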
[RV] Target audience and statement of need. The main documentation does not identify a target audience or a statement of need. Only one feature of the software, maximum flexibility, is described. Who would want that and why? In paper.md, I see that a need for maximum flexibility is mentioned, but why would a user want it if they're not doing "thesis work"?
[AA] The proposed Macop package is now explicitly oriented towards discrete optimization. It therefore targets anyone wishing to solve a discrete optimization problem.
Macop aims to let users quickly focus on one of the proposed features of its main interaction loop.
[RV] The documentation has some specific problems: it's very general and not detailed, is missing a good API description.
[AA] The proposed Macop package is now explicitly oriented towards discrete optimization, while remaining open to extension for continuous optimization. Its main objective is to offer a common interaction loop and to allow users of the package to focus their developments on one of the parts in question. It therefore targets anyone wishing to solve a discrete optimization problem.
All documentation has been updated. It is now possible to understand the full interaction between features and the main behaviour of the package through a complete example. A more complete API reference is also available.
[RV] Installation: Specifically, this claim is made in paper.md: "Solutions modeling continuous problems can also be created by the anyone who wants to model his own problem", but I'm not seeing a continuous optimization problem in the examples or tests.
[AA] The proposed Macop package is now explicitly oriented towards discrete optimization. It remains open to extension for continuous optimization, but that case is not yet handled.
[RV] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
[AA] Macop now targets anyone wishing to respond to a discrete optimization problem.
[RV] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
[AA] All documentation has been updated. It is now possible to understand the full interaction between features and the main behaviour of the package. A more complete API reference is also available.
[RV] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
Yes, but these tests are not sufficient; some modules are untouched. See below for details.
[AA] Updates have been made following this comment, using doctest in order to better cover the whole code.
[RV] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support
Yes, there are some guidelines for contributing code/PRs, but they're not super clear.
Yes, there are some guidelines for filing issues/bug reports, but they only say "submit an issue with label ...".
There really aren't any guidelines for seeking support beyond "do not hestitate" when contacting.
I think this could be a lot clearer; I'm leaving this requirement unchecked.
[AA] Some improvements have been added to make it clearer how to contribute to the package.
[RV] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided? Not really. It claims to solve every optimization problem; that seems a little too general. I don't know what job/data would motivate using this software.
[AA] The proposed Macop package is now specified as being oriented towards discrete optimization.
[RV] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
[AA] Details have been made in the paper.md document and also in the package documentation.
[RV] State of the field: Do the authors describe how this software compares to other commonly-used packages? Marginally – I think this requirement is partially satisfied. The entirety of the comparison is "Available libraries in the literature did not allow this kind of implementation of an evaluation function quickly"
[AA] Citations have been updated to better situate Macop.
[RV] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)? See below for specific edits.
[AA] The document has been updated with respect to the various remarks proposed.
[RV] Documentation: bad.
[AA] All documentation has been updated. It is now possible to understand the full interaction between features and the main behaviour of the package. A more complete API reference is also available.
[RV] Testing: bad. Unit tests missing (there is one test in the setup.py file).
[AA] Updates have been made following this comment, using doctest in order to better cover the whole code.
[RV] LICENSE not completed.
[AA] The license has now been properly completed.
[RV] Possibly not a substantial scholarly effort.
[AA] It is true that the package was developed in separate periods rather than continuously. Updates to the package are also included in the current latest version.
[RV]
State of the field: there is no mention of other similar software packages. The author comments on well-known general optimisation packages (which need a proper citation), but nothing related to the methods used in Macop. In a quick Google search I found a more closely related repo, Solid, which provides multiple derivative-free algorithms with user-defined objectives. The author probably knows more about this, but I suspect there must be others; either way, it should be stated in the paper.
Quality of writing: the paper needs rereading to correct minor grammatical/orthographical errors. I will provide a list once the other comments have been addressed.
[AA] Updates have been made regarding these comments in order to better situate Macop in the literature.
[RV]
Software usage: the usage of the library is not very clear. How does one extract the final solution without looking at the logs? Not the user-defined criteria like the score/fitness, but the final solution (i.e. in the knapsack example, the items in the knapsack).
Why is abstract class inheritance only sometimes used? This seems like the right type of inheritance for all the classes.
- `macop/evaluators`: contains a single example and not an implementation.
- `macop/operators`: triple nested inheritance makes it hard to determine where functions are actually being implemented (e.g. `UCBPolicy` inherits from `Policy`, which inherits from `Operator`). For such simple functionality, e.g. defining the method `apply`, this seems a bit overkill. Additionally, inheritance is not used, e.g. `macop/operators/policies/UCBPolicy.py` could be a standalone class.
- `macop/solutions/`: same as above, inheritance doesn't do much as functions are overloaded and arguments are not passed to the parent. The purpose of these functions is really not clear to me.
In general, the module-like structure is really confusing, especially given the small and simple implementations. This makes the import list extend a fair bit. I understand that this structure may have been laid out for additional functionality to be introduced. However, if no more work is to be done, then to help people understand the code and encourage contributions I suggest a major rework of the code to simplify the structure.
[AA] Many updates have been made to the code:
- `macop.evaluators`: now contains a class and not just a function. The evaluator holds the required data of the problem instance.
- `macop.operators`: `Operator` is a basic abstract class; `Mutator` and `Crossover` inherit from `Operator`, and implementations of each are available. `UCBPolicy` now only inherits from the `Policy` abstract class.
- `macop.solutions`: abstract methods are now specified to make the expected inheritance behaviour easier to understand.

[RV] Please get rid of the build files for the documentation. readthedocs can build the docs on commits to master if you set it up.
[AA] I now use the classical `gh-pages` branch to automatically update the docs using CI.
[RV] Relative imports are discouraged by PEP 328 "With the shift to absolute imports, the question arose whether relative imports should be allowed at all".
[AA] Updates have been made.
[RV] Remove leading _ for all input parameters, this is common practice for private parameters.
[AA] Updates have been made. Instance attributes now carry the leading `_`.
[RV] `macop/callbacks`: they are technically callbacks, but they seem more like a custom logger.
[AA] Yes, that's true, but it also enables reloading the algorithm state after an interruption.
[RV] Is there any way to disable the progress output being printed to screen? I can imagine this needs disabling in normal applications
[AA] A `verbose` attribute on the algorithm now enables or disables the progress output.
[RV] The knapsack is a nice and simple example; however, I don't think it showcases the best use of this library, as it takes longer to arrive at the optimal solution with 30 elements with Macop than with traditional exact methods.
[AA] It is true that the knapsack is a very classic example whose solutions can be found quickly. However, I wanted to keep it in the documentation due to its simplicity. I will add examples of other problem instances later.
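As a rough illustration of what such a knapsack search involves (plain Python, not Macop's API; all names here are hypothetical), a first-improvement hill climber over bit strings can be sketched as:

```python
import random

def knapsack_hill_climb(worths, weights, capacity, iters=1000, seed=0):
    """First-improvement hill climbing over binary knapsack solutions."""
    rng = random.Random(seed)

    def value(sol):
        # Infeasible solutions (over capacity) score 0.
        if sum(w for b, w in zip(sol, weights) if b) > capacity:
            return 0
        return sum(v for b, v in zip(sol, worths) if b)

    sol = [0] * len(worths)                  # start with an empty knapsack
    best = value(sol)
    for _ in range(iters):
        cand = sol[:]
        cand[rng.randrange(len(sol))] ^= 1   # flip one random bit
        if value(cand) > best:               # accept strict improvements only
            sol, best = cand, value(cand)
    return sol, best
```

Note that such a climber can stall in a local optimum (a full but suboptimal knapsack), which is exactly the situation that iterated local search and operator-selection policies are designed to escape; and, as the reviewer notes, an exact solver is faster for small instances.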
[RV] In a standard Python package, the init file in the main directory is not needed. The `__init__.py` file in `macop/` should contain module imports.
[AA] Updates have been made in the main `__init__.py` file of `macop`.
[RV] Contributions file is really long while explanations on how to contribute are short.
[AA] I have tried to update it to make it a little clearer.
[RV] Examples docs page is really long (half of it is imports and multi-line comments)
[AA] I updated the whole documentation, adding sections that describe the package usage.
[RV] In README:
[AA] Some updates have been made in README.md, though I am not sure they are what was expected.
I have not filled in this part because all comments are answered above.
@whedon generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
Hi @jbuisine! Thanks for the detailed response and the efforts on the package. I am much happier now with the work you've done so I'm ticking off some of the requirements. Also, really good work on the documentation and diagrams, it looks really nice.
I'll take some time over the week end to come back with my feedback.
Hi again, Here's my feedback
Thanks for taking the comments on board and standardising the inheritance structure. The imports look nicer now. I like that you've included placeholders for future implementations for the continuous case.
Only one more thing:
Is it possible to have a more complex example where you have used Macop? I like the knapsack example in the docs, but it would be good to have an advanced usage example. If you are using it for something in your research, maybe a minimal working example would be good. I've noticed the noise-detection repo; maybe link it in the Macop README.
Minor comments
- `macop.algorihtms`. Also, I would include this information in the README.
- (`macop.solutions`, `macop.evaluators`, ...) for completeness.

Hi @torressa,
Thank you very much for your answer.
Regarding the code, I really appreciate your comments. I can indeed propose a more complex example (or even several).
I can use QAP (quadratic assignment problem) or UBQP (unconstrained binary quadratic programming) instances. Instances are available on mocobench, though I may keep the examples in a single-objective context.
Regarding my work on noise detection, it would be more complicated to explain: in reality, I use a parent algorithm with the real (but very expensive) evaluation function, and then local searches that use a surrogate model (a model that has learned to approximate the real cost function and is quick to evaluate). I'm afraid this is too far from the generic scope of the package, but I have mentioned in the paper the interest of the proposed package for this feature (which I specify later in this comment).
So if it suits you, I can add in the documentation some examples on a QAP and UBQP instance.
Thank you again for your feedback and useful comments.
@whedon generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
Thanks for the fast corrections.
So if it suits you, I can add in the documentation some examples on a QAP and UBQP instance.
That sounds good, would be a great addition!
@whedon commands
Here are some things you can ask me to do:
# List Whedon's capabilities
@whedon commands
# List of editor GitHub usernames
@whedon list editors
# List of reviewers together with programming language preferences and domain expertise
@whedon list reviewers
EDITORIAL TASKS
# Compile the paper
@whedon generate pdf
# Compile the paper from alternative branch
@whedon generate pdf from branch custom-branch-name
# Ask Whedon to check the references for missing DOIs
@whedon check references
# Ask Whedon to check repository statistics for the submitted software
@whedon check repository
@whedon check references
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):
OK DOIs
- 10.1007/0-306-48056-5_11 is OK
- 10.1109/TEVC.2007.892759 is OK
- 10.1016/j.cor.2006.02.008 is OK
- 10.1109/TEVC.2013.2239648 is OK
- 10.1007/s00158-004-0465-1 is OK
- 10.1109/ICMLA.2007.35 is OK
- 10.3390/rs10071117 is OK
- 10.1007/s00158-011-0666-3 is OK
- 10.1007/978-3-319-42432-3_37 is OK
- 10.1145/3321707.3321800 is OK
MISSING DOIs
- None
INVALID DOIs
- None
Hello @stsievert - would you be able to take a second look at the paper now that @jbuisine has responded to comments? Let me know if I can help in any way. Thanks!
Hello @stsievert, @torressa,
I made some updates:
Do not hesitate to let me know if I need to change anything about the paper.
Thanks for the edits @jbuisine. The documentation is certainly much more descriptive. My comments this time are much smaller; in general, they are mostly suggestions. However, I have one blocking comment:
I am still not finished with the review; I need to look closer at the implementation. I also have comments on items that technically meet the requirements:
I have some more specific notes/style nits:
- Why is `solution.{_data, _algo}` private? Shouldn't it be public? They're both in the tests and the documentation, which is an indication they should be public.
- `_build` can be removed from the URL; I think you can specify the build location with Sphinx. I'd recommend changing https://jbuisine.github.io/macop/_build/html/documentations/index.html to https://jbuisine.github.io/macop/html/documentations/index.html or https://jbuisine.github.io/macop/documentations/index.html.
- For `macop.policies.reinforcement.UCBPolicy`, does larger `C`/`exp_rate` mean more or less exploration? What are typical values?
- The documentation says the `evaluator` method takes an object of type `Evaluator`. I can't easily navigate to see that page.
- `macop.algorithms.mono`: why are `_score` and `_data` private variables? If they're mentioned in the docs and tested in the docstrings, I think they should be public (without the underscore).
- Typo: `macop.algorihtms` in header.
- I find comments like `# policy import` and `# initializer function with lambda function` to be meaningless; they don't provide any new information -- `initializer = lambda x=20: ...` tells me the same thing. (I'm looking at https://jbuisine.github.io/macop/_build/html/macop/macop.algorithms.mono.html). I can't read that example because of the comments; don't docstrings allow newlines to separate the code into "paragraphs"?

Hello @stsievert,
Thank you very much for your remarks. I have updated the package accordingly; here are my answers on each of the suggested points:
[RV]
[AA] I have added information on how to seek support and renamed the contribution file correctly.
[RV]
[AA]
I added the motivation of the package to the description in the documentation. I was indeed inspired by the approach proposed by the Dask software. I also added this part to the paper.
[RV]
[AA]
I have updated the documentation by completing the comments on the instantiation methods. In addition, I have typed the `data` parameter of the `Evaluator` class as a `dict`, as advised.
[RV]
[AA]
Indeed, the proposed tutorial gives a complete tour of the use of Macop's interaction loop, which is why I made it quite thorough. As you suggested, there are now many quicker examples of adding and processing specific problem instances; they are faster to implement and require less time.
API links to each module have also been added to the documentation.
[RV]
[AA]
I updated the documentation accordingly and kept the whole "tutorial" on a single page, with a complete menu in the left sidebar.
[RV] Why is `solution.{_data, _algo}` private? Shouldn't it be public? They're both in the tests and the documentation, which is an indication they should be public.
[AA]
I have modified the scope of the variables where needed, or added accessors/setters where needed. I think it will indeed be more logical to have the code partitioned in the current way.
[RV]
[AA]
Indeed, I updated the title to "A Tour of Macop", as we take a complete tour of how the package works.
[RV]
[AA]
I know this is a bit redundant, but the idea was to let users following this tutorial have access to the code and see how it evolves with the knapsack problem.
[RV] It'd be really useful to add type annotations to the docs/examples.
[AA]
I don't really understand what you mean. Type annotations depending on the solutions obtained?
[RV] `_build` can be removed from the URL; I think you can specify the build location with Sphinx. I'd recommend changing https://jbuisine.github.io/macop/_build/html/documentations/index.html to https://jbuisine.github.io/macop/html/documentations/index.html or https://jbuisine.github.io/macop/documentations/index.html.
[AA]
I confess that I haven't found a way to modify this, either on the GitHub side or the Sphinx side. Sphinx's Makefile allows specifying the output folder, but that seems to be all (hence the `_build` output).
[RV] For `macop.policies.reinforcement.UCBPolicy`, does larger `C`/`exp_rate` mean more or less exploration? What are typical values?
[AA]
Comments have been added to `macop.policies.reinforcement.UCBPolicy` to explain the role of these parameters, with example values and their impact.
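For intuition about the question above, here is a generic UCB1 selection rule (the textbook formula, not necessarily Macop's exact implementation); in this form, larger `C` means more exploration:

```python
import math

def ucb1_select(total_rewards, counts, C=math.sqrt(2)):
    """Return the index of the arm (operator) with the best UCB1 score.

    Larger C inflates the exploration bonus, so rarely tried arms are
    selected more often; C = 0 is pure exploitation (greedy on the mean).
    """
    n_total = sum(counts)
    best_arm, best_score = 0, float("-inf")
    for arm, n in enumerate(counts):
        if n == 0:
            return arm  # always try an untested arm first
        score = total_rewards[arm] / n + C * math.sqrt(2 * math.log(n_total) / n)
        if score > best_score:
            best_arm, best_score = arm, score
    return best_arm
```

For example, with cumulative rewards `[10, 0.9]` and counts `[10, 1]`, `C=0` picks arm 0 (best mean), while a large `C` picks the rarely tried arm 1.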
[RV] The documentation for `macop.algorithms.mono.HillClimberBestImprovement` says the `evaluator` method takes an object of type `Evaluator`. I can't easily navigate to see that page.
[AA]
API links between modules have been added to the documentation to allow better navigation.
[RV] https://jbuisine.github.io/macop/_build/html/macop/macop.operators.continuous.mutators.html is empty.
[AA]
This is intended to make it easier for users who already wish to implement continuous optimization problems.
[RV] `macop.algorithms.mono`: why are `_score` and `_data` private variables? If they're mentioned in the docs and tested in the docstrings, I think they should be public (without the underscore).
[AA]
I have modified the scope of the variables where needed, or added accessors/setters where needed. I think it will indeed be more logical to have the code partitioned in the current way.
[RV] Typo: `macop.algorihtms` in header.
[AA]
Typo fixed in the paper.
[RV] I find comments like `# policy import` and `# initializer function with lambda function` to be meaningless; they don't provide any new information -- `initializer = lambda x=20: ...` tells me the same thing. (I'm looking at https://jbuisine.github.io/macop/_build/html/macop/macop.algorithms.mono.html). I can't read that example because of the comments; don't docstrings allow newlines to separate the code into "paragraphs"?
[AA]
I have updated some of the comments to give better visibility to the doctest examples.
Do not hesitate to let me know if some of my answers are not correct. Thanks.
My replies are below. The documentation is looking a lot better.
I added the motivation of the package to the description in the documentation. I was indeed inspired by the approach proposed by the Dask software. I also added this part to the paper.
Thank you. It's a lot better now. It'd be nice to have a specific example of a use case (but that's not critical).
the documentation for macop.algorithms.mono.HillClimberBestImprovement says the evaluator method takes an object of type Evaluator. I can't easily navigate to see that page. ...
One example: I mean that I can't easily navigate from macop.algorithms.mono.HillClimberFirstImprovment to macop.evaluators.base, even though it's returned by `HillClimberFirstImprovment.evaluator`.
I think this is possible with :class:`~macop.evaluators.base.Evaluator` in Sphinx.
Why is solution.{_data, _algo} private? Shouldn't it be public? They're both in the tests and the documentation, which is an indication they should be public.
I've seen your solution. This code is a little cleaner:
```python
class Algorithm:
    @property
    def data(self):
        return self._data
```

That allows this usage:

```python
import pandas as pd

alg = Algorithm(...)
d = alg.data
df = pd.DataFrame(d)
```
It'd be really useful to add type annotations to the docs/examples. ... I don't really understand what you mean. Type annotations depending on the solutions obtained?
For example, `macop.evaluators.base.Evaluator.compute` takes an argument of type `Solution`. Presumably that's an instance of `macop.solutions.base.Solution`. It'd be helpful to have a type hint on that (and even more useful if it's an HTML link).
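A sketch of what such annotations could look like (class and method names mirror the discussion but are assumptions, not Macop's verified signatures):

```python
from abc import ABC, abstractmethod
from typing import List

class Solution:
    """Stand-in for macop.solutions.base.Solution."""
    def __init__(self, data: List[int]) -> None:
        self.data = data

class Evaluator(ABC):
    """Stand-in for macop.evaluators.base.Evaluator."""
    @abstractmethod
    def compute(self, solution: Solution) -> float:
        """Return the fitness of `solution`."""

class SumEvaluator(Evaluator):
    def compute(self, solution: Solution) -> float:
        # With sphinx-autodoc-typehints, the `Solution` hint is rendered
        # in the docs and can become a clickable cross-reference.
        return float(sum(solution.data))
```

With such hints, Sphinx can render `Solution` as a link instead of plain text, which addresses the navigation point above.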
New comment: it'd be great if you could link to `macop.policies.reinforcement.UCBPolicy` instead of italicizing that text.

https://jbuisine.github.io/macop/_build/html/macop/macop.operators.continuous.mutators.html is empty. ... This is intended to make it easier for users already wishing to implement continuous optimization problems.
How does that make it easier for users? To modify Macop, they have to write code/tests and submit a PR, or do a development install.
Hi @stsievert ,
Thanks for your quick answer and advice. I have made some updates:
Thank you. It's a lot better now. It'd be nice to have a specific example of a use case (but that's not critical).
In the README.md file, I included a quick explanation of why Macop is used inside another project. It has also been added to the paper.md file.
One example: I mean that I can't easily navigate from macop.algorithms.mono.HillClimberFirstImprovment to macop.evaluators.base, even though it's returned by HillClimberFirstImprovment.evaluator.
I now use :class:`~macop.XXXX.XXXX` inside the API documentation. It is true that this makes browsing more pleasant. Thanks for the advice.
I've seen your solution. This code is a little cleaner:
I use `@property` for specific attributes:
- `Solution`: `data` (set as `@property` and `@data.setter`).
- `Solution`: `fitness` (set as `@property` and `@fitness.setter`).
- `Algorithm`: `result` (set as `@property` and `@result.setter`).

The aim is to make it possible to return and process whatever data/result the user wants when creating a new solution or algorithm.
I have also updated the "A tour of Macop" documentation regarding these changes.
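As a minimal sketch of this accessor pattern (a hypothetical `Solution`, not Macop's exact code):

```python
class Solution:
    def __init__(self, data):
        self._data = data      # internal storage keeps the leading underscore
        self._fitness = None

    @property
    def fitness(self):
        """Public read access to the stored fitness."""
        return self._fitness

    @fitness.setter
    def fitness(self, value):
        # The setter is the natural place to add validation later.
        self._fitness = value
```

Usage: `s = Solution([1, 0]); s.fitness = 42` reads back via `s.fitness`, while `_fitness` stays an internal detail.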
New comment: it'd be great if you could link to macop.policies.reinforcement.UCBPolicy instead of italicizing that text.
This has been updated.
How does that make it easier for users? To modify Macop, they have to write code/tests and submit a PR, or do a development install.
Yes, the user should be able to propose modifications if he wants them integrated in a new version. The idea was to fix in advance how the package would be structured to accommodate such modifications.
@whedon generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
Nit: https://jbuisine.github.io/macop/_build/html/macop/macop.policies.classicals.html does not completely describe the API for `RandomPolicy`: it's missing the `setAlgo` and `apply` methods.
It looks like this package has two parts: the discrete algorithms/operators/etc., and the data flow that allows customization. Maybe it'd be better to have that separation in the docs, i.e., have all the base classes and other classes well separated into separate sections/pages.
Yes, the user should be able to propose his modifications if he wants them to be integrated in a new version. The idea was to already fix the way the package would be structured to accommodate such modifications.
I'm looking at this package from a user perspective, not a developer or contributor perspective. I think an empty module (e.g., `macop.solutions.continuous`) only confuses users (like me) when they click on the documentation link. I think contributors will contact you with a beta implementation, and you can guide them (either in CONTRIBUTING.md or in the pull request).
Thanks for your answers.
Here are my updates:
- Inherited classes and members are now displayed in the API for each class (using the 'inherited-members' and 'show-inheritance' keywords for autosummary).
- The API documentation page is now divided into 4 sections:
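For reference, a `conf.py` fragment achieving this kind of output might look like the following (a sketch assuming the standard autodoc/autosummary extensions; the repository's exact setup may differ):

```python
# conf.py (sketch): apply these options to every autodoc directive
extensions = ["sphinx.ext.autodoc", "sphinx.ext.autosummary"]

autodoc_default_options = {
    "members": True,
    "inherited-members": True,   # also document members inherited from base classes
    "show-inheritance": True,    # print the "Bases: ..." line for each class
}
autosummary_generate = True      # generate stub pages from autosummary directives
```

Setting these defaults once avoids repeating `:inherited-members:` and `:show-inheritance:` on every `automodule`/`autoclass` directive.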
@whedon generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
Hi @stsievert, @torressa,
I wanted to inform you that I have also updated the paper (only a few changes).
@whedon generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
Submitting author: @jbuisine (Jérôme BUISINE) Repository: https://github.com/jbuisine/macop Version: v1.2.0 Editor: @melissawm Reviewer: @stsievert, @torressa Archive: 10.5281/zenodo.4595986
Status
Status badge code:
Reviewers and authors:
Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)
Reviewer instructions & questions
@stsievert & @torressa, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:
The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @melissawm know.
✨ Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest ✨
Review checklist for @stsievert
Conflict of interest
Code of Conduct
General checks
Functionality
Specifically, this claim is made in paper.md: "Solutions modeling continuous problems can also be created by the anyone who wants to model his own problem", but I'm not seeing a continuous optimization problem in the examples or tests.
Documentation
Software paper
Review checklist for @torressa
Conflict of interest
Code of Conduct
General checks
Functionality
Documentation
Software paper