SelfExplainML / PiML-Toolbox

PiML (Python Interpretable Machine Learning) toolbox for model development & diagnostics
https://selfexplainml.github.io/PiML-Toolbox
Apache License 2.0

Unable to plot feature importance plot for the binary classification target variable separately #54

Open munchcrunch opened 1 month ago

munchcrunch commented 1 month ago

My target variable is a binary label with two classes, such as backed clay and unbacked clay. My question is that I could not find an option to plot feature importance figures separately for each class. I want to plot the backed and unbacked classes separately, but I cannot do it. I have also tried to plot the partial dependence plot separately for both classes but could not find a way to do it. Please guide and assist me in this regard. Thank you.

ZebinYang commented 3 weeks ago

Hi @munchcrunch,

Binary classification is a special case of regression, with only 0 and 1 as outputs.

Feature importance and partial dependence are calculated for the model as a whole, so there is no separate feature importance or partial dependence per class. Because the predicted probabilities of the two classes sum to one, per-class plots would simply mirror each other and carry no additional information.
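As an illustration of why per-class plots are redundant for a binary target, here is a minimal sketch using scikit-learn (not PiML's own plotting API) with synthetic data standing in for the backed/unbacked clay labels. The partial dependence of P(class = 0) is just the complement of the partial dependence of P(class = 1), so one curve fully determines the other:

```python
# Sketch only: assumes a generic scikit-learn classifier and synthetic data,
# not the user's actual dataset or PiML's internal plotting routines.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import partial_dependence

# Hypothetical stand-in for the backed / unbacked clay target.
X, y = make_classification(n_samples=500, n_features=5, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X, y)

# Partial dependence of P(class = 1) on feature 0.
pd_result = partial_dependence(clf, X, features=[0], kind="average")
pdp_class1 = pd_result["average"][0]   # P(y = 1 | feature value)
pdp_class0 = 1.0 - pdp_class1          # P(y = 0) is the complement

# The two curves always sum to one, i.e. they mirror each other.
print(np.allclose(pdp_class0 + pdp_class1, 1.0))  # True
```

The same reasoning applies to feature importance: permuting a feature perturbs the predicted probabilities of both classes by the same amount, so a per-class importance ranking would coincide with the whole-model one.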