I'm thinking of a scenario in which, after sampling some parameter combinations with an initial set of bounds, we want to manually discard some areas of the parameter search space for a second pass of the process.
The obvious solution of deleting from init_grid_dt every row in which any parameter falls outside the new bounds seems inefficient, since those observations are still very useful for estimating the correlation matrix.
I've tried running the process with a more restricted set of bounds than the ones used for init_grid_dt, but my guess is that something could go wrong when scaling the parameter space to the [0, 1] cube with the new bounds (what happens to the old points in the history that are now out of bounds?).
@yanyachen, do you think this kind of constraint is possible, or would it be necessary to modify the code?
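To make the scaling concern concrete, here is a minimal sketch (in Python purely for illustration; the package itself is R) of what standard min-max scaling to the unit cube does to history points sampled under wider bounds. The function and values here are hypothetical, not taken from the package's code:

```python
def scale_to_unit(x, lower, upper):
    # Standard min-max scaling of one parameter value to [0, 1]
    # relative to the given bounds.
    return (x - lower) / (upper - lower)

# History sampled under the original, wider bounds [0, 10].
history = [0.5, 2.0, 7.5, 9.5]

# Second pass with narrowed bounds [2, 8]: the old out-of-bounds
# points are mapped outside the [0, 1] cube.
new_lower, new_upper = 2.0, 8.0
scaled = [scale_to_unit(x, new_lower, new_upper) for x in history]
print(scaled)  # first value is negative, last value exceeds 1
```

If the internal Gaussian-process code assumes all scaled points lie inside the unit cube, these out-of-range values are exactly the kind of thing that could silently misbehave.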