Closed jayahm closed 3 years ago
Hello,
The library does not support this functionality yet. All models are required to have exactly the same input features.
I see. I saw some papers use the random subspace method, so I was thinking of trying it. Any suggestion on how to do it even though the library does not support it yet?
@jayahm Hello,
For that to work, you need to implement the Random Subspace method outside of sklearn's BaggingClassifier. In this case, each base model should be an sklearn pipeline in which the first step is a transformation that selects a subset of the features and the second step is the classifier. That way, each base model in the pool_classifiers list receives the same X as input, and the models handle the subspace selection themselves.
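A minimal sketch of this idea (not the official DESlib example, just an illustration): each base model is a Pipeline whose first step is a `FunctionTransformer` that keeps only that model's feature subset, so every model can be given the full X. The dataset, subspace size, and classifier choice here are arbitrary assumptions.

```python
# Sketch: Random Subspace via per-model sklearn Pipelines.
# Each pipeline's first step selects a fixed subset of feature columns,
# so every base model accepts the *full* X and slices it internally.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import FunctionTransformer
from sklearn.tree import DecisionTreeClassifier

rng = np.random.RandomState(42)
X, y = make_classification(n_samples=500, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

n_estimators, n_subspace = 10, 10  # subspace size is an arbitrary choice
pool_classifiers = []
for _ in range(n_estimators):
    cols = rng.choice(X.shape[1], n_subspace, replace=False)
    model = Pipeline([
        # first step: keep only this model's feature subset
        ("subspace", FunctionTransformer(lambda X, cols=cols: X[:, cols])),
        ("clf", DecisionTreeClassifier(random_state=42)),
    ])
    model.fit(X_train, y_train)  # each model is given the full X
    pool_classifiers.append(model)

# every model accepts the full feature set, handling selection internally
print(pool_classifiers[0].predict(X_test[:5]))
```

A pool built this way can be passed to a DS technique, since every model exposes the same full-feature input interface.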
The problem with using BaggingClassifier directly is that each base model in the generated ensemble receives a subset of X as input, instead of receiving the full feature set and handling the subspace selection internally. Since there is no way to communicate which features were used to train each base model, the DS technique cannot correctly dispatch the input features to the base models.
I will work on an example showing how to properly use the Random Subspace model with DESlib, considering the current versions of sklearn and deslib, as well as other feature transformations that could be applied to specific base models. I will let you know when it is ready.
Yes, an example would be very helpful, as I could not quite follow the explanation above.
You closed the issue; does that mean you have created the example?
Not yet. It was closed because this example is already being tracked by issue #218 , so there is no need to have both open.
Hi
I was trying to generate a pool of classifiers based on the random subspace method using BaggingClassifier, but got this error:
I used exactly your example code but only modified the bagging part.
Here is the code: https://www.dropbox.com/s/z4iseijtawb53ey/plot_comparing_dynamic_static.ipynb?dl=0