christophM / interpretable-ml-book

Book about interpretable machine learning
https://christophm.github.io/interpretable-ml-book/

An Equation in Section 9.6.5 SHAP Feature Importance #318

Closed · laugoon closed this issue 2 years ago

laugoon commented 2 years ago

Dear sir,

The book Interpretable Machine Learning is really interesting! But I am confused about an equation in Section 9.6.5 SHAP Feature Importance: [screenshot of the equation in the book]. The book says to 'average the absolute Shapley values per feature across the data'. I think ϕ_j^(i) is the feature attribution, i.e. the Shapley value, for feature j and instance i, and I_j is the feature importance of feature j. So shouldn't the equation look like this: [screenshot of the proposed equation]?

Please correct me if I am wrong, and forgive my poor English. Thanks for reading, I am looking forward to hearing from you.

Lagoon
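(Since the screenshots do not render here: based on the quoted sentence, 'average the absolute Shapley values per feature across the data', and the symbols defined above, the proposed equation is presumably the mean absolute Shapley value per feature,

$$ I_j = \frac{1}{n}\sum_{i=1}^{n} \left|\phi_j^{(i)}\right|, $$

where n is the number of instances in the data.)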

christophM commented 2 years ago

Yes, that's a mistake. Thanks for notifying me! It's fixed now.
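For anyone who wants to check the corrected formula numerically, here is a minimal sketch. It assumes `shap_values` is an n_instances × n_features array of per-instance Shapley values (for example, as produced by a SHAP explainer); the values below are made up purely for illustration.

```python
import numpy as np

# Hypothetical per-instance Shapley values: 4 instances (rows), 3 features (columns).
# In practice this array would come from an explainer such as the shap package.
shap_values = np.array([
    [ 0.5, -1.2,  0.1],
    [-0.3,  0.8, -0.4],
    [ 0.9, -0.1,  0.2],
    [-0.7,  0.6, -0.3],
])

# SHAP feature importance: average the absolute Shapley values per feature
# across the data, i.e. I_j = (1/n) * sum_i |phi_j^(i)|.
feature_importance = np.abs(shap_values).mean(axis=0)

print(feature_importance)  # one importance value per feature (column)
```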