thomasp85 / lime

Local Interpretable Model-Agnostic Explanations (R port of original Python package)
https://lime.data-imaginist.com/

Very poor r^2 #138

Closed: timcdlucas closed this issue 5 years ago

timcdlucas commented 5 years ago

I can make a reproducible example for this if you'd like, but I'm currently on my phone. I don't know if it needs specifics.

With some models I'm currently running, I regularly get very poor R² values for the explanations. The response variable is continuous. I'm using 'auto' feature selection and asking for 10 out of 40 visits. R² = 0.02 is common, although other data points get R² closer to 0.3, which is OK-ish.

This is happening with both a random forest model and a Gaussian process, so it isn't due to tree discontinuities.

Obviously, if there's no fix, I just won't trust those explanations. But I'm surprised it's so low.
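A minimal sketch of the kind of check being described, using hypothetical toy data and a caret random forest as a stand-in for the reporter's actual models (the `model_r2` column is the fit quality of lime's local surrogate, and the parameter names below follow `lime::explain()`):

```r
library(lime)
library(caret)

# Toy continuous-response data (a stand-in for the real 40-feature setup)
set.seed(1)
df <- as.data.frame(matrix(rnorm(2000), ncol = 10))
df$y <- rowSums(df[, 1:3]) + rnorm(200, sd = 0.1)

# caret model, since lime supports caret fits out of the box
rf <- train(y ~ ., data = df, method = "rf")

explainer <- lime(df[, -11], rf)
expl <- explain(df[1:5, -11], explainer,
                n_features = 5, feature_select = "auto")

# model_r2 reports how well the local linear surrogate fits each case;
# very low values (e.g. ~0.02) suggest the explanation is untrustworthy
unique(expl[, c("case", "model_r2")])
```

Comparing `model_r2` across cases in this way would show the pattern described above: some cases with a reasonable local fit, others near zero.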

Make42 commented 5 years ago

Maybe this is due to this bug: https://github.com/thomasp85/lime/issues/148