Closed xucian closed 1 year ago
The intention behind the design of DEAP is presented in our 2012 paper in JMLR: https://www.jmlr.org/papers/volume13/fortin12a/fortin12a.pdf . It indeed aims at exposing the main mechanisms instead of encapsulating them in an opaque function.
We felt that the framework was still high-level enough not to require single functions that encapsulate specific algorithms. The provided examples should in fact allow users to have starting code from which to apply a given algorithm to their own context.
Thanks for clarifying! The paper definitely helps. I can see how providing lots of examples is more beneficial than having some universal defaults and an easier interface. I've been using optimization methods for less than 3 years now, and I can see how, for example, SciPy's implementations are more like a crash course in optimization while this lib seems more production-grade. Definitely something I'll migrate to soon. Thanks again!
Hello! I recently discovered this library and I feel a bit overwhelmed by the amount of customizability it has. It looks well-done, and having so many stars means it helps a lot of people. 👍
So either I didn't dig enough, or there's no higher-level interface over it. Coming from NLOPT's GN_ISRES and SciPy's differential_evolution, DEAP is pretty intimidating. I'm pretty sure it'd take me a few days to build an interface for my problem (it's a pretty complex eval function, with 30 params). I'm mainly interested in the parallelization part, as SciPy's version runs into some errors.
So am I right that the intended default approach with this lib is to build the algorithm yourself, as opposed to relying on plenty of "good-enough" defaults that let you provide only the minimum possible information, as a quickstart?
Thanks!