MAIF / shapash

🔅 Shapash: User-friendly Explainability and Interpretability to Develop Reliable and Transparent Machine Learning Models
https://maif.github.io/shapash/
Apache License 2.0

New Feature: Subpopulation-based Feature Importance Plots #579

Closed · guillaume-vignal closed this 1 month ago

guillaume-vignal commented 1 month ago

Description:
This pull request adds two new plots that help users better understand how feature importance varies across different subpopulations of the data. Fixes #578

Context:

The issue identified a need for more granular insights into feature importance, specifically highlighting features that may have significant local importance within certain subpopulations while remaining less influential globally. This helps improve model interpretability when working with datasets that contain heterogeneous populations.
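To make that distinction concrete, here is a minimal, self-contained sketch (toy data, feature names invented for illustration only) of how a feature can rank low when contributions are averaged over the whole dataset but high within a single subpopulation:

```python
import numpy as np
import pandas as pd

# Toy contribution values: "promo_response" only drives predictions for
# customers under 30, while "income" matters roughly everywhere.
rng = np.random.default_rng(0)
n = 1_000
age = rng.integers(18, 80, n)
contrib = pd.DataFrame({
    "income": rng.normal(0.5, 0.1, n),
    "promo_response": np.where(age < 30,
                               rng.normal(0.8, 0.1, n),
                               rng.normal(0.05, 0.02, n)),
})

global_importance = contrib.abs().mean()            # averaged over everyone
young_importance = contrib[age < 30].abs().mean()   # averaged within one subpopulation

# "promo_response" looks minor globally but dominates for the young segment.
print(global_importance.sort_values(ascending=False))
print(young_importance.sort_values(ascending=False))
```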

New Plots Added:

  1. Local Importance Divergence Metric:
    This plot highlights features whose importance diverges across subpopulations, letting users quickly spot features that are influential only in specific regions of the dataset (a computation sketch for both plots follows this list).


  2. Feature Importance Curve Plot:
    This plot provides a visual representation of how feature importance fluctuates across samples. It plots the importance curves for each feature, giving users a way to visually inspect how consistent or varied a feature's importance is throughout the data.

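The implementation merged in this PR is not shown here, but the quantities behind both plots can be sketched from the contribution matrix alone. The snippet below is an illustrative approximation, not the PR's code; function and argument names are invented:

```python
import numpy as np
import pandas as pd

def local_divergence(contrib: pd.DataFrame, groups: pd.Series) -> pd.DataFrame:
    """Ratio of each group's mean |contribution| to the global mean |contribution|.
    Values well above 1 flag features that matter mainly inside that subpopulation."""
    global_importance = contrib.abs().mean()
    per_group = contrib.abs().groupby(groups).mean()
    return per_group.div(global_importance, axis=1)

def importance_curves(contrib: pd.DataFrame) -> pd.DataFrame:
    """Per-feature |contribution| values sorted in descending order, so each column
    can be drawn as a curve showing how a feature's importance varies across samples."""
    sorted_vals = np.sort(contrib.abs().to_numpy(), axis=0)[::-1]
    return pd.DataFrame(sorted_vals, columns=contrib.columns)
```

A flat curve indicates a feature that contributes similarly everywhere; a steep curve indicates importance concentrated in a few samples or one region of the data.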

Benefits:

Example Use Case:

In a customer segmentation dataset, these plots could show that features like "age" or "income" have varying importance in different customer segments, while features like "purchase history" remain important across all segments.
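As a rough usage sketch of that scenario with shapash (the data, model, and age threshold below are placeholders, and the interface of the two new plots may differ; the `selection` argument of the existing `features_importance` plot is used here to contrast a segment against the global importance):

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from shapash import SmartExplainer

# Toy customer data, invented for illustration.
X = pd.DataFrame({
    "age": [22, 35, 58, 41, 27, 63],
    "income": [28_000, 52_000, 61_000, 47_000, 31_000, 70_000],
    "purchase_history": [3, 12, 25, 9, 5, 30],
})
y = pd.Series([0.2, 0.6, 0.9, 0.5, 0.3, 1.0])

model = RandomForestRegressor(random_state=0).fit(X, y)

xpl = SmartExplainer(model=model)
xpl.compile(x=X)

# Compare the importance computed on one customer segment (customers under 30)
# with the importance computed on the full dataset.
segment_idx = X.index[X["age"] < 30].tolist()
xpl.plot.features_importance(selection=segment_idx)
```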