ai-se / ResourcesDataDrivenSBSE


need a list of open problems that could fuel future work in this field #16

Closed timm closed 6 years ago

timm commented 6 years ago

that could fuel future work in this field

timm commented 6 years ago
vivekaxl commented 6 years ago
timm commented 6 years ago
minkull commented 6 years ago

learning the optimisers :-D

markuswagnergithub commented 6 years ago

"optimising optimisers" for this idea (?) a team won the best-paper award in the evolutionary combinatorial optimisation track at GECCO 2017 (one of the main evolutionary computation conferences): https://dl.acm.org/citation.cfm?id=3071238 proof: http://gecco-2017.sigevo.org/index.html/Best+Paper+Nominations free version: http://iridia.ulb.ac.be/IridiaTrSeries/link/IridiaTr2017-004.pdf ==> this falls into the greater field of "algorithm configuration" (hey, why not tune the tuners? this was done above, so this is approx two steps above what the practitioner does when default settings are used) Also, the technology used by "algorithm configuration" and even "per-instance parameter setting" goes beyond the application of DE (see Timm's topic modelling) as fun things like heterogeneous instance sets and noise are to be considered. I'd have 2-3 paragraphs on this somewhere. Interested? The question is: where shall this go in this article, since it has not yet been applied in any of the provided cases yet? (unless I missed it) I am working on something in that area, but that is going to take a few months to get accepted + published...

minkull commented 6 years ago

Actually, people in the EA community have been investigating things such as optimising the optimisers for quite a while. However, it seems this has attracted the attention of the SBSE community only more recently. On the related topic of learning the optimisers, I have some work on hyper-heuristics for the software project scheduling problem, and there must be other work out there too (?)