aerdem4 / lofo-importance

Leave One Feature Out Importance

Compatibility with neural network: replacing with constant value instead of dropping the feature #52

Open stephanecollot opened 1 year ago

stephanecollot commented 1 year ago

Hi,

For a neural network, if you change the number of features, you need to change the input dimension and therefore the number of neurons in the first layer. So we could have an option to replace the feature instead of dropping it, along the lines of the sketch below:
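(Hypothetical sketch only, not the original proposal: overwrite the feature's values with a constant so the network keeps its input dimension; the function name and parameters are illustrative.)

```python
import pandas as pd

def replace_feature(df: pd.DataFrame, feature: str, value: float = 0.0) -> pd.DataFrame:
    """Return a copy of df where `feature` is overwritten with a constant,
    instead of being dropped, so the NN input dimension stays unchanged."""
    out = df.copy()
    out[feature] = value
    return out

# e.g. score the model on a validation frame with one feature "switched off":
# X_valid_const = replace_feature(X_valid, "age", value=0.0)
# score_without_age = scorer(model, X_valid_const, y_valid)
```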

What do you think?

aerdem4 commented 1 year ago

Makes sense. Would this be a flag to set?

I have only one concern. Most BatchNorm implementations etc. normalize safely, but someone may have a custom model that assumes all features have non-zero standard deviation. So replacing with a small random noise could also be an option.
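(Illustration of the concern, not from the original thread: a constant-filled column has zero standard deviation, so a naive standardization step would divide by zero, while a little Gaussian noise keeps the std positive.)

```python
import numpy as np

const_col = np.full(1000, 3.0)        # feature replaced by a constant
print(const_col.std())                # 0.0 -> (x - mean) / std breaks here

rng = np.random.default_rng(0)
noisy_col = const_col + rng.normal(scale=1e-3, size=const_col.size)
print(noisy_col.std())                # small but non-zero, so scaling stays finite
```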

stephanecollot commented 1 year ago

Yes, there could be different "leaving strategies"; I think a string parameter would give more flexibility (see the sketch below). Good point about batch normalization and the risk of dividing by zero.
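(Again a hypothetical sketch, not the library's API: a string-valued `strategy` argument choosing between dropping the column, filling it with a constant, or filling it with small noise; all names are illustrative.)

```python
import numpy as np
import pandas as pd

def leave_feature_out(df: pd.DataFrame, feature: str, strategy: str = "drop",
                      constant: float = 0.0, noise_scale: float = 1e-3) -> pd.DataFrame:
    """Apply one 'leaving strategy' to a single feature."""
    out = df.copy()
    if strategy == "drop":        # classic LOFO: remove the column entirely
        return out.drop(columns=[feature])
    if strategy == "constant":    # keep the input dimension, remove the signal
        out[feature] = constant
        return out
    if strategy == "noise":       # keep a non-zero std for models that require it
        rng = np.random.default_rng(0)
        out[feature] = constant + rng.normal(scale=noise_scale, size=len(out))
        return out
    raise ValueError(f"Unknown strategy: {strategy!r}")
```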

Side remark: with a NN you will most probably use FLOFO instead of LOFO anyway; at least that fits my current use case, because I have too many features and training takes too long.
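(For context, a rough FLOFO usage sketch based on my reading of the lofo-importance README; the argument names and order are from memory and should be checked against the current docs. `trained_model`, `valid_df` and `features` are assumed to exist already.)

```python
# FLOFO permutes each feature on a validation set using an already-trained model,
# so the (slow) neural network only has to be trained once.
from lofo import FLOFOImportance, plot_importance

flofo = FLOFOImportance(trained_model, valid_df, features, "target", scoring="roc_auc")
importance_df = flofo.get_importance()
plot_importance(importance_df, figsize=(12, 20))
```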