marcotcr / lime

Lime: Explaining the predictions of any machine learning classifier
BSD 2-Clause "Simplified" License

How to obtain regression model prediction using lime explanations? #639

Open Subh1m opened 3 years ago

Subh1m commented 3 years ago

@marcotcr

I ran an experiment with LIME in regression mode and got an explanation. Now I want to recreate the local prediction from the explanation's feature importances. Is there a way to do that?

From the code, I saw that the coefficients of the Ridge model are used as the feature importances, but I am unable to reconstruct the model's prediction from them.

Scenario:

  1. Training Input = X_train, y_train (Regression type dataset)
  2. Testing Input = X_test (single row input)
  3. Trained a DecisionTree model on training input
  4. Created explanations for Testing input using Lime for Regression method (as shown in your sample notebook with discretize_continuous=False)
  5. Also obtained the local_pred (LIME's local prediction) and the intercept from explainer.explain_instance()

Goal:

  1. Use the explanation weights, intercept, etc. to recreate the local_pred value.

Please help, as this will let me validate how the feature importances are derived and how they relate to the trained local model (the Ridge model in this case, since that is what LIME uses internally).
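In case it helps the discussion: since the local surrogate is a (weighted) Ridge regression, local_pred should be exactly `intercept + sum(weight_i * x_i)` evaluated on the scaled instance. Below is a minimal self-contained sketch of that idea using only numpy — the kernel width, sampling scheme, and `black_box` function are illustrative stand-ins, not LIME's actual code, but the reconstruction step at the end is the algebraic relationship being asked about:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical black-box regressor (stand-in for the trained DecisionTree).
def black_box(X):
    return np.sin(X[:, 0]) + X[:, 1] ** 2

# Instance to explain, assumed already scaled the way LIME scales it
# when discretize_continuous=False: (z - mean) / std over training data.
x = np.array([0.5, -1.0])

# 1. Sample perturbations around the instance and query the black box.
Z = x + rng.normal(scale=1.0, size=(5000, 2))
y = black_box(Z)

# 2. LIME-style exponential kernel: nearby samples get higher weight.
d = np.linalg.norm(Z - x, axis=1)
weights = np.exp(-(d ** 2) / 0.75 ** 2)  # 0.75 is an illustrative width

# 3. Weighted ridge regression in closed form; a column of ones carries
#    the intercept, which we leave unregularized.
A = np.hstack([np.ones((len(Z), 1)), Z])
alpha = 1.0
reg = alpha * np.eye(A.shape[1])
reg[0, 0] = 0.0
coef = np.linalg.solve(A.T @ (weights[:, None] * A) + reg,
                       A.T @ (weights * y))
intercept, feature_weights = coef[0], coef[1:]

# 4. local_pred is just the surrogate evaluated at the scaled instance...
local_pred = intercept + feature_weights @ x

# ...so the explanation's (feature, weight) pairs plus the intercept
# recreate it exactly.
explanation = list(enumerate(feature_weights))
recreated = intercept + sum(w_i * x[i] for i, w_i in explanation)
assert np.isclose(local_pred, recreated)
print(local_pred)
```

With the real library, the same arithmetic would use the `(feature, weight)` pairs and intercept returned by `explain_instance`, applied to the instance after LIME's internal scaling.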

Awesome Library btw. Thanks.