lucahuesler opened 1 year ago
To assess the importance of individual features in the model:
1. SHAP (SHapley Additive exPlanations)
https://towardsdatascience.com/a-novel-approach-to-feature-importance-shapley-additive-explanations-d18af30fc21b
In R: https://blog.datascienceheroes.com/how-to-interpret-shap-values-in-r/
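The core idea behind SHAP is the Shapley value: a feature's contribution is its average marginal effect on the prediction over all orderings in which features can be added. A minimal sketch of that computation (toy model and baseline are hypothetical, not from the linked posts; real SHAP libraries use fast approximations instead of enumerating permutations):

```python
from itertools import permutations

def model(x):
    # hypothetical toy model: two linear terms plus an interaction
    return 2.0 * x[0] + 1.0 * x[1] + x[0] * x[2]

def shapley_values(f, x, baseline):
    """Exact Shapley values by averaging marginal contributions
    over every ordering of the features (exponential cost, demo only)."""
    n = len(x)
    phi = [0.0] * n
    perms = list(permutations(range(n)))
    for perm in perms:
        current = list(baseline)       # start from the baseline input
        prev = f(current)
        for i in perm:
            current[i] = x[i]          # switch feature i to its real value
            now = f(current)
            phi[i] += (now - prev) / len(perms)
            prev = now
    return phi

x = [1.0, 1.0, 1.0]
baseline = [0.0, 0.0, 0.0]
phi = shapley_values(model, x, baseline)
# efficiency property: contributions sum to f(x) - f(baseline)
```

Here the interaction term `x[0] * x[2]` is split equally between features 0 and 2, which is exactly the symmetry property that makes Shapley values attractive.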
2. LIME (Local Interpretable Model-agnostic Explanations)
https://homes.cs.washington.edu/~marcotcr/blog/lime/
In R: https://lime.data-imaginist.com/ and https://www.tmwr.org/explain.html
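LIME's recipe is: perturb the input around the point of interest, query the black-box model, weight the samples by proximity, and fit a simple surrogate (typically linear) whose coefficients serve as the local explanation. A one-feature sketch of that idea (the black-box function, grid, and kernel width are illustrative choices, not the lime package's internals):

```python
import math

def black_box(x):
    # hypothetical nonlinear model we want to explain locally
    return x * x

def lime_1d(f, x0, radius=0.5, steps=10, kernel_width=0.3):
    """Fit a proximity-weighted linear surrogate to f around x0
    and return its (slope, intercept) as the local explanation."""
    # symmetric grid of perturbations around x0
    xs = [x0 + radius * (2 * i / steps - 1) for i in range(steps + 1)]
    ys = [f(x) for x in xs]
    # exponential proximity kernel: nearby perturbations count more
    ws = [math.exp(-((x - x0) ** 2) / kernel_width ** 2) for x in xs]
    sw = sum(ws)
    mx = sum(w * x for w, x in zip(ws, xs)) / sw
    my = sum(w * y for w, y in zip(ws, ys)) / sw
    # weighted least squares in closed form (single feature)
    cov = sum(w * (x - mx) * (y - my) for w, x, y in zip(ws, xs, ys)) / sw
    var = sum(w * (x - mx) ** 2 for w, x in zip(ws, xs)) / sw
    slope = cov / var
    return slope, my - slope * mx

slope, intercept = lime_1d(black_box, 2.0)
# for f(x) = x^2 at x0 = 2, the local slope is the derivative, 4
```

The surrogate's slope matches the derivative here only because the perturbation grid and kernel are symmetric around x0; in general LIME returns an approximate local linear picture, not a gradient.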