Description
Thank you for expanding the feature importance function with a flexible number of permutations in the Machine Learning module in the latest update! Since permutation-based feature importance can be unstable and does not account for feature dependence, would it be possible to add SHAP analysis (Shapley value explanations) to the Machine Learning module?
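To illustrate the instability point, here is a minimal sketch using scikit-learn's `permutation_importance` as a stand-in for JASP's implementation (the model and data set are placeholders, not anything from JASP): the spread across permutation repeats can be large relative to the mean importance, and different seeds can reorder features.

```python
# Sketch of permutation-importance instability, using scikit-learn's
# permutation_importance as a stand-in for JASP's implementation.
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Two runs with different random seeds can rank features differently,
# and the standard deviation across repeats can be sizable.
for seed in (0, 1):
    r = permutation_importance(model, X, y, n_repeats=10, random_state=seed)
    ranking = np.argsort(-r.importances_mean)
    top = ranking[0]
    print(f"seed {seed}: top features {list(X.columns[ranking[:3]])}, "
          f"top importance = {r.importances_mean[top]:.3f} "
          f"+/- {r.importances_std[top]:.3f}")
```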
Purpose
Improve the explainability of machine learning models.
Use-case
No response
Is your feature request related to a problem?
No response
Is your feature request related to a JASP module?
Machine Learning
Describe the solution you would like
Add SHAP analysis to the Machine Learning module, either under each machine learning model or as a separate function, and report the mean absolute Shapley value of each feature (as a global importance measure) in addition to case-wise explanations.
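For illustration, here is a minimal sketch of the two requested outputs using the Python shap package (linked under Additional context below). JASP's Machine Learning module is R-based, so this is only a demonstration of the desired output, not a proposed implementation; the model and data set are placeholders.

```python
# Minimal sketch of the requested output using the shap package
# (https://shap.readthedocs.io). The model and data are placeholders;
# any fitted model supported by shap.Explainer would work.
import numpy as np
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

explainer = shap.Explainer(model, X)   # dispatches to a model-appropriate explainer
shap_values = explainer(X)             # Shapley values, one row per case

# Global importance: mean |Shapley value| per feature
global_importance = np.abs(shap_values.values).mean(axis=0)
for name, imp in sorted(zip(X.columns, global_importance), key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")

# Case-wise explanation for a single observation
print(shap_values[0])
```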
Describe alternatives that you have considered
No response
Additional context
Some possibly related packages:
- https://cran.r-project.org/web/packages/shapr/vignettes/understanding_shapr.html
- https://cran.r-project.org/web/packages/explainer/index.html
- https://shap.readthedocs.io/en/latest/