Open romanovzky opened 1 month ago
Hi, this looks like an easy improvement, but it will still require some changes to the C++ library. Currently, each call to fit initializes a new C++ algorithm, runs it, and keeps some stats and results (like the Pareto front) from it, but once fit is done the C++ object no longer exists. However, it should be easy enough to implement a kind of warm start mechanism.
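For illustration, a possible shape for such a mechanism on the Python side, following scikit-learn's warm_start convention. This is only a sketch: the warm_start flag mentioned below does not exist in the current pyoperon API, and the data is synthetic.

```python
import numpy as np
from pyoperon.sklearn import SymbolicRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 2))
y = X[:, 0] * X[:, 1]

# Hypothetical: warm_start is not an existing pyoperon parameter.
# The idea would be that a repeated fit() reuses the final population and
# Pareto front of the previous run instead of re-initializing the
# underlying C++ algorithm.
reg = SymbolicRegressor(generations=1000)  # hypothetically: warm_start=True
reg.fit(X, y)   # generations 1-1000
reg.fit(X, y)   # would then continue with generations 1001-2000
```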
Yes, a warm start mechanism would be super useful! I'm already thinking about the possibility of using things like hyperband, which ideally need warm start mechanisms.
Hi,
I was testing whether I could fit a SymbolicRegressor up to, say, 1000 generations, see the Pareto front, and then continue training for another 1000 generations.
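Roughly what I had in mind (a minimal sketch with synthetic data; the exact constructor arguments are just placeholders):

```python
import numpy as np
from pyoperon.sklearn import SymbolicRegressor

rng = np.random.default_rng(42)
X = rng.uniform(-1, 1, size=(200, 2))
y = X[:, 0] ** 2 + np.sin(X[:, 1])

reg = SymbolicRegressor(generations=1000)
reg.fit(X, y)            # first run: 1000 generations
# ... inspect the Pareto front / pick promising expressions here ...

reg.fit(X, y)            # hoped this would continue for another 1000
                         # generations, but it starts over from scratch
```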
However, it seems that if I call fit, play with reg, and then call fit again, the reg object is the same as it was before the second fit call. Now, given that reg has a Pareto front, wouldn't I be able to continue fitting in an online learning/batch/partial-fit fashion? I'm trying to "brute force" a work-around for the lack of callbacks (see https://github.com/heal-research/pyoperon/issues/18).

Cheers