Is your feature request related to a problem? Please describe.
Currently manager.stop_optimization will only cancel scheduled trials, but is unable to stop already running ones. This is a limitation of Dask (or rather of Python itself), as described in this SO thread.
Describe the solution you'd like
Make stop_optimization emit a signal to all running trials (e.g. using a shared variable) and make sure each trial periodically checks for this signal, as we can't assume users will do it manually in their objective functions. If a trial reads the stop signal, optimization should be halted immediately.
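A minimal sketch of this mechanism, using a threading.Event as a local stand-in for the distributed stop flag (in Dask this could be, e.g., a distributed.Variable); the names stop_optimization and run_trial are hypothetical, not the actual manager API:

```python
import threading

# Stand-in for a cluster-wide flag (e.g. dask.distributed.Variable).
stop_event = threading.Event()

def stop_optimization():
    # Emit the stop signal to all running trials.
    stop_event.set()

def run_trial(objective, steps):
    # Wrapper that checks the signal between objective steps, so users
    # don't have to check it manually inside their objective functions.
    for step in range(steps):
        if stop_event.is_set():
            return {"status": "stopped", "step": step}
        objective(step)
    return {"status": "completed", "step": steps}
```

Note that checks can only happen between steps: if a single objective call blocks for a long time, the signal is not observed until it returns, which is exactly the caveat raised for the alternatives below.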
Describe alternatives you've considered
Rely on the user to check a stopping condition in the objective function (this would leak internal implementation details, diverge from Optuna's behavior, and besides, operations within the objective function can block for an arbitrary amount of time, delaying signal checks anyway).
Check the stopping condition in the objective function wrapper (since the objective function is assumed to block for an arbitrary amount of time).
Spawn a thread that periodically checks the stopping condition and sends an interrupt signal to the thread running the objective function (the objective function is never executed in the worker's main thread, where signals are handled).
Use concurrency (the objective function is not expected to be awaitable, and we are mostly dealing with CPU-bound tasks anyway).
Do nothing and wait for Dask to implement thread interrupts, as discussed in https://github.com/dask/distributed/issues/4694.
Additional context
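The constraint behind the signal-based alternative (signal handlers are only serviced in a process's main thread) is a CPython rule and can be demonstrated directly; this sketch is purely illustrative and does not involve Dask:

```python
import signal
import threading

def try_install_handler():
    """Attempt to register a SIGINT handler from the calling thread."""
    try:
        # CPython only allows installing signal handlers from the main
        # thread of the main interpreter; elsewhere this raises ValueError.
        signal.signal(signal.SIGINT, signal.default_int_handler)
        return "installed"
    except ValueError:
        return "failed"

results = {}
worker = threading.Thread(
    target=lambda: results.__setitem__("worker", try_install_handler())
)
worker.start()
worker.join()
results["main"] = try_install_handler()
```

Since a Dask worker runs tasks off its main thread, this is why a watcher thread cannot simply arm a signal to interrupt the thread executing the objective function.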