vasselai opened this issue 3 years ago
Actually, an overriding `feature_importances_` property seems to be entirely missing from the `MonoRandomForestClassifier` class, isn't it? Perhaps it could be implemented analogously to the one that already exists for `BaseMonoGradientBoosting`, like:
```python
@property
def feature_importances_(self):
    """Return the feature importances (the higher, the more important the
    feature).

    Returns
    -------
    feature_importances_ : array, shape = [n_features]
    """
    total_sum = np.zeros((self.n_features_,), dtype=np.float64)
    for stage in self.estimators_:
        # Average the per-tree importances of the rule ensembles in this stage.
        stage_sum = sum(rule_ensemble.tree.feature_importances_
                        for rule_ensemble in stage[0]) / len(stage[0])
        total_sum += stage_sum
    # Average the per-stage sums across all boosting stages.
    importances = total_sum / len(self.estimators_)
    return importances
```
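For what it is worth, below is a minimal sketch of what the analogous property might look like on `MonoRandomForestClassifier` itself, assuming its `estimators_` is a flat list of fitted rule ensembles that each expose their fitted tree as `.tree` (these attribute names simply mirror the snippet above and are not verified against the monoensemble internals):

```python
@property
def feature_importances_(self):
    """Sketch only: average the per-tree importances over all forest members.

    Assumes `numpy` is imported as `np` and that each element of
    `self.estimators_` wraps a fitted sklearn tree as `.tree`
    (an unverified assumption about the monoensemble internals).
    """
    importances = np.zeros((self.n_features_,), dtype=np.float64)
    for rule_ensemble in self.estimators_:
        importances += rule_ensemble.tree.feature_importances_
    return importances / len(self.estimators_)
```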
So, after a few days of double-testing the updated version: after fitting any model, for example the RF one from the official docs, I get the following error raised when I try to access the fitted model's `feature_importances_`:

At first I was under the impression that the monoensemble code was just not storing the internal fitted `tree` object inside an internal `tree_`, but that did not solve it. There's something else at play. Given how critical it is to be able to explore feature importances, I thought it best to bring this up officially.
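For reference, this is roughly the access pattern that triggers the error for me, reconstructed from memory of the docs example (the import path and constructor arguments below are illustrative assumptions, not a verbatim copy of the official example):

```python
import numpy as np
from monoensemble import MonoRandomForestClassifier  # assumed import path

# Toy monotone data: class 1 becomes more likely as feature 0 increases.
rng = np.random.RandomState(0)
X = rng.rand(200, 3)
y = (X[:, 0] > 0.5).astype(int)

# Constructor arguments are illustrative placeholders; see the official docs
# for the real signature and the monotonicity options.
clf = MonoRandomForestClassifier(n_estimators=10)
clf.fit(X, y)

# This attribute access is what raises after fitting:
print(clf.feature_importances_)
```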