@slundberg I have a question. To test the attention mechanism, I fix the tenth column of input X to equal y. This is my code:
import numpy as np

def get_lstm_data(n, time_steps, input_dim, attention_col=10):
    # random inputs and binary labels
    x = np.random.uniform(size=(n, time_steps, input_dim))
    y = np.random.randint(low=0, high=2, size=(n, 1))
    # overwrite the attention_col time step so it always equals the label
    x[:, attention_col, :] = np.tile(y[:], (1, input_dim))
    return x, y
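The explainer used below is not defined in this snippet; for context, a minimal sketch of how it could be set up, assuming a plain Keras LSTM classifier and shap.DeepExplainer (the real attention model and explainer construction are not shown here):

import shap
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

time_steps, input_dim = 20, 2
X_train, Y_train = get_lstm_data(10000, time_steps, input_dim, attention_col=10)

# hypothetical stand-in model; the actual attention architecture is not shown
model = Sequential([
    LSTM(32, input_shape=(time_steps, input_dim)),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.fit(X_train, Y_train, epochs=1, batch_size=64)

# DeepExplainer over a background sample of the training data
explainer = shap.DeepExplainer(model, X_train[:100])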
When training the model, X has shape [10000, 20, 2]. To use shap.force_plot the input should be a 2D array, so I flatten the test input (X_test.shape is [1, 20, 2]):
X_test, Y_test = get_lstm_data(1, time_steps, input_dim, attention_col=10)
shap_values = explainer.shap_values(X_test)[0] # shape is [1, 20, 2]
# flatten to 2D (row-major reshape): shap_values_flatten shape is [1, 40]
shap_values_flatten = shap_values.reshape(shap_values.shape[0], np.prod(shap_values.shape[1:]))
feature_name = ["feature_" + str(i) for i in range(40)]
shap.force_plot(explainer.expected_value[0], shap_values_flatten[0,:], feature_names=feature_name)
However, because I fixed the tenth column to equal y and input_dim=2, I expect the top-contributing features to be feature_10 and feature_30. Why does the plot show feature_12 and feature_38 instead?
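For reference, a small standalone check of how the row-major reshape above maps each (time_step, dim) pair to a flattened feature index:

import numpy as np

time_steps, input_dim = 20, 2
# label each position by its (time_step, dim) pair, then flatten the same way
# as shap_values above; index[t, d] is the flattened feature index
index = np.arange(time_steps * input_dim).reshape(time_steps, input_dim)
for d in range(input_dim):
    print("time step 10, dim %d -> feature_%d" % (d, index[10, d]))
# prints feature_20 and feature_21 for the fixed time step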