karllandheer opened 10 months ago
I have found that this is because `difference - np.sum(shap_values)` is typically < 0.01, which is the threshold used in `deep_tf.py`, hence it is not caught automatically by `check_additivity=True`. I am not super familiar with SHAP: is the equality not really an equality, but subject to some approximation? (i.e., maybe there's no bug and I'm just being accidentally pedantic.)
The acceptable tolerance is 0.01, so your example lies within it. See here
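The check discussed above compares the residual against a fixed threshold. This is a hedged sketch of that comparison, not the actual code in `deep_tf.py` (the function name `check_additivity` and the example numbers below are illustrative):

```python
import numpy as np

TOLERANCE = 1e-2  # threshold discussed above


def check_additivity(model_output, expected_value, shap_values, tol=TOLERANCE):
    """Raise if |f(x) - (expected_value + sum(shap_values))| exceeds tol."""
    diff = model_output - (expected_value + np.sum(shap_values))
    if np.abs(diff) > tol:
        raise AssertionError(f"additivity violated: residual {diff:.4g}")
    return diff


# A residual of 0.005 is below the 0.01 tolerance, so no error is raised,
# which matches the behavior described in this thread.
residual = check_additivity(1.005, 0.4, [0.3, 0.3])
```

This explains why a small but nonzero gap between the SHAP-value sum and the model output passes silently: anything under the tolerance is treated as numerical noise.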
@karllandheer any objections to closing this issue?
Issue Description
Hello, I am having an issue where the SHAP values do not sum to the model prediction minus the `expected_value`. The model is in Keras. I have included a minimal reproducible example. I tried both regression (i.e., a linear last layer) and classification (sigmoid); I believe the equality should hold in both cases, but it did not for me.
Minimal Reproducible Example
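The reporter's original snippet is not preserved in this thread. As a stand-in, here is a hedged sketch of the identity being tested, using a linear model where the exact SHAP values can be computed by hand as `w_i * (x_i - E[x_i])` and additivity holds exactly (the weights and data below are hypothetical, not the reporter's model):

```python
import numpy as np

rng = np.random.default_rng(0)
w = np.array([0.5, -1.2, 2.0])    # hypothetical linear weights
b = 0.3
X_bg = rng.normal(size=(100, 3))  # hypothetical background data


def f(x):
    """A toy linear model standing in for the Keras regression model."""
    return x @ w + b


expected_value = f(X_bg).mean()            # E[f(X)] over the background
x = np.array([1.0, 0.5, -0.25])            # sample to explain
shap_values = w * (x - X_bg.mean(axis=0))  # exact SHAP values for a linear model

# Additivity: expected_value + sum(shap_values) should equal f(x)
diff = f(x) - (expected_value + shap_values.sum())
```

For this linear case `diff` is zero up to float rounding; the discussion above is about deep models, where the explainer's approximations leave a residual that is merely required to stay below the tolerance.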
Traceback
Expected Behavior
No response
Bug report checklist
Installed Versions
shap==0.42.1 tensorflow==2.13.0 numpy==1.24.3