Saethox closed this issue 2 years ago.
My idea would have been not to support nested components at all, as they complicate things quite a bit. Concerning the evaluation, I also feel there is probably not much benefit in being this detailed about it. Instead, I would treat the nested local search simply as a black box. Storing all the information sounds nice, but it feels like we will never be able to analyze it in that much detail anyway.
When we want to compare different ILS variants, we can just use the same base local search for all experiments; that should be sufficient.
@HeleNoir what are your thoughts on this?
I would also prefer not to use any nested components. I think we can solve these issues by using more complex black-box components together with our `scheduler`.
We want to support metaheuristics such as Iterated Local Search, where, in its simplest form, we iteratively start a local search from a random starting point and take the best result of all runs. In this regard, we would then have a `LocalSearch` operator that is used after some mutation. I don't think we need much information for evaluating this, but if we do need some, maybe we can use our `state` to some extent.
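To make the black-box view concrete, here is a minimal sketch of ILS in the simple form described above. The types (an `f64` solution, a plain function as the local search) are illustrative stand-ins and not mahf's actual API:

```rust
// Greedy descent on a 1-D function, treated as the black-box local search.
// Hypothetical example objective: (x - 3)^2, minimized at x = 3.
fn local_search(start: f64, objective: impl Fn(f64) -> f64) -> f64 {
    let mut x = start;
    let step = 0.1;
    for _ in 0..200 {
        // Move to whichever neighbor (or the current point) is best.
        x = [x - step, x, x + step]
            .iter()
            .copied()
            .min_by(|a, b| objective(*a).partial_cmp(&objective(*b)).unwrap())
            .unwrap();
    }
    x
}

// ILS in its simplest form: restart the local search from varying points
// and keep the best result over all runs.
fn iterated_local_search(restarts: usize) -> f64 {
    let objective = |x: f64| (x - 3.0) * (x - 3.0);
    let mut best_value = f64::INFINITY;
    let mut best_x = 0.0;
    for i in 0..restarts {
        // Deterministic pseudo-random starting points, for reproducibility.
        let start = -10.0 + (i as f64 * 7.3) % 20.0;
        let x = local_search(start, objective);
        if objective(x) < best_value {
            best_value = objective(x);
            best_x = x;
        }
    }
    best_x
}
```

The outer loop never inspects the local search's internals, which is exactly the black-box treatment suggested above.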
Or we want to have hybrid metaheuristics, for example running a GA for n iterations and then using the resulting population to initialize the pheromone matrix of an ACO.
This could be solved with an appropriate `scheduler` and by enabling the use of several `selection` and `replacement` operators as well (which might become necessary anyway). It will probably make our `postprocess` more complex, however.
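The GA-to-ACO handover could then be a plain function between the two scheduler phases. A sketch, where the population is a set of bit strings and the frequency-based seeding rule is just one illustrative choice (not an existing mahf component):

```rust
// Seed one pheromone value per bit position from the frequency of that bit
// being set in the GA's final population. Everything here is a simplified
// stand-in for the actual GA/ACO representations.
fn init_pheromones(population: &[Vec<u8>]) -> Vec<f64> {
    let len = population[0].len();
    let n = population.len() as f64;
    (0..len)
        .map(|i| {
            population.iter().filter(|ind| ind[i] == 1).count() as f64 / n
        })
        .collect()
}
```

For a population `[[1,1,0], [1,0,0], [1,1,1], [0,1,0]]` this yields `[0.75, 0.75, 0.25]`: positions that the GA converged on get high initial pheromone.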
@luleyleo @Saethox Does that sound feasible and sufficient?
The thing is, this hypothetical `LocalSearch` operator, which itself is most probably a `Generation` component, needs to evaluate its solution and neighbors somehow. Ergo, we need to pass the `Evaluator` of the base configuration into this `Generation` component. We could of course also create a new `SimpleEvaluator` inside `LocalSearch`, but I see no reason not to add a single parameter and support other evaluation methods, like GPU evaluation, out of the box.
If we then treat this `LocalSearch` as a black box and do not log anything at all, we are in fact supporting nested configurations. This is because a `Configuration` needs the following inputs to be executed:
```rust
pub fn run<P: Problem>(
    problem: &P,
    logger: &mut Log,
    components: &Configuration<P>,
    rng: Option<Random>,
    evaluator: Option<Box<dyn Evaluator<P>>>,
) -> P::Encoding {
    // ...
}
```
The `problem` is already shared with every component, as is the `rng`. The `configuration` could be passed when the `LocalSearch` component is created, and we can just add another input parameter for the `evaluator` to `Generation` or other components. The `logger` can be just some dummy logger that doesn't do anything.
My point is: if we support components such as `LocalSearch`, we are supporting nested configurations as well.
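In simplified form, the nesting described here looks like the following. The trait and types are stand-ins for mahf's actual `Evaluator` and `Log`, and the greedy inner search is a hypothetical example; the point is only the shape: the outer evaluator is passed through, and a no-op logger swallows the nested run's output:

```rust
// Simplified stand-in for the framework's Evaluator trait.
trait Evaluator {
    fn evaluate(&self, x: f64) -> f64;
}

// Hypothetical example objective: (x - 1)^2, minimized at x = 1.
struct SimpleEvaluator;
impl Evaluator for SimpleEvaluator {
    fn evaluate(&self, x: f64) -> f64 {
        (x - 1.0) * (x - 1.0)
    }
}

// The "dummy logger": it accepts entries and discards them.
struct DummyLog;
impl DummyLog {
    fn log(&mut self, _entry: &str) {} // goes nowhere on purpose
}

// The nested "configuration": a tiny greedy search driven by whichever
// evaluator the outer configuration hands in.
fn nested_run(evaluator: &dyn Evaluator, logger: &mut DummyLog, start: f64) -> f64 {
    let mut x = start;
    for _ in 0..100 {
        logger.log("nested step"); // silently dropped
        let candidate = [x - 0.1, x + 0.1]
            .iter()
            .copied()
            .min_by(|a, b| {
                evaluator.evaluate(*a).partial_cmp(&evaluator.evaluate(*b)).unwrap()
            })
            .unwrap();
        if evaluator.evaluate(candidate) < evaluator.evaluate(x) {
            x = candidate;
        }
    }
    x
}
```

An outer `LocalSearch` component would call something like `nested_run` internally, which is exactly the nested-configuration situation being discussed.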
I see your point. Another option: could we solve this by enabling (stupidly) complex `scheduler` constructions? Like:
Init repeat
We want to support metaheuristics such as Iterated Local Search, where, in its simplest form, we iteratively start a local search from a random starting point and take the best result of all runs. Or we want to have hybrid metaheuristics, for example running a GA for n iterations and then using the population to initialize the pheromone matrix of an ACO.

This means that we have some component inside which a complete (meta)heuristic is run, i.e., a meta-component. We currently supply the problem to every component, but we probably also have to do the same for the logging and the `Evaluation`, such that possibly nested metaheuristics can use them. Logging especially may be tricky, as we need to indicate that a run is nested. Maybe we could add the possibility to attach several loggers, each logging one nested run, to the base logger? But this also means that the export becomes more complicated.

This means modifying the parameters of every component, so I first want to discuss whether there is another approach that I overlooked, or whether we could solve this in a completely different way.
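One possible shape for the "several loggers" idea: the base log owns child logs, one per nested run, so the export can tell the levels apart. This struct is a hypothetical sketch, not mahf's existing `Log` type:

```rust
// Hypothetical nested log: the base log keeps its own entries plus one
// child log per nested (meta)heuristic run.
#[derive(Default)]
struct Log {
    entries: Vec<String>,
    nested: Vec<Log>, // one child per nested run
}

impl Log {
    fn log(&mut self, entry: &str) {
        self.entries.push(entry.to_string());
    }

    // Start a child log for a nested run and return its index,
    // so the nesting component can write into it.
    fn begin_nested(&mut self) -> usize {
        self.nested.push(Log::default());
        self.nested.len() - 1
    }
}
```

The export would then have to walk `nested` recursively, which is where the added complexity mentioned above shows up.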