Currently the arguments of the SkopeRules object are propagated to all decision trees in its bagging classifier.
This means that all the trees share the same parameters (except for `max_depth`, where a list of depths can be passed). The only difference between them is the samples they are fitted on.
It would probably make sense if we could also pass in a grid of parameters.
Concretely, this means adding an optional dict argument, `grid_parameters`, to the SkopeRules object, which stores candidate values for each tree parameter. When this argument is specified, each tree is built using a random combination of these values.
This might help produce more diversified rules for a reasonable number of estimators.
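A minimal sketch of what the per-tree sampling could look like (the names `grid_parameters` and `sample_tree_params` are hypothetical, not part of the current SkopeRules API; a real implementation could also reuse `sklearn.model_selection.ParameterSampler`):

```python
import random

def sample_tree_params(grid_parameters, n_estimators, seed=0):
    """Draw one random combination of tree hyperparameters per estimator.

    `grid_parameters` maps a decision-tree parameter name to a list of
    candidate values (the proposed shape of the new argument).
    """
    rng = random.Random(seed)
    return [
        {name: rng.choice(values) for name, values in grid_parameters.items()}
        for _ in range(n_estimators)
    ]

# Example: each of the 4 trees gets its own random combination,
# so the ensemble is diversified beyond just the bagged samples.
grid = {
    "max_depth": [2, 3, 4],
    "min_samples_split": [2, 10],
    "max_features": ["sqrt", None],
}
params_per_tree = sample_tree_params(grid, n_estimators=4)
```

Each dict in `params_per_tree` would then be unpacked into one tree's constructor, instead of propagating a single shared parameter set.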