Closed akskuchi closed 1 year ago
Hi, thanks for the question!
Q and A are not concatenated. Shapley Values explain a particular answer, i.e. they represent how the input tokens contributed towards that answer. Therefore, as in the ISA case, we let the model predict an answer, and to compute the Shapley Values we look at how the probability of that answer changes when we mask the inputs in many combinations (eq. 1 in the paper).
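To make the masking procedure concrete, here is a minimal sketch of permutation-sampling Shapley estimation. The `prob_fn` callable is a stand-in (not the paper's actual code) that is assumed to return the model's probability for the fixed predicted answer given a token mask; the paper's eq. 1 may use a different sampling scheme.

```python
import random

def shapley_values(tokens, prob_fn, n_samples=200, seed=0):
    """Monte Carlo estimate of per-token Shapley values.

    prob_fn takes a mask (tuple of bools, True = token visible) and
    returns the model's probability for the fixed predicted answer.
    This is a generic permutation-sampling sketch, not the paper's
    exact implementation.
    """
    rng = random.Random(seed)
    n = len(tokens)
    phi = [0.0] * n
    for _ in range(n_samples):
        order = list(range(n))
        rng.shuffle(order)          # random order of revealing tokens
        mask = [False] * n
        prev = prob_fn(tuple(mask))  # probability with all tokens masked
        for i in order:
            mask[i] = True           # reveal token i
            cur = prob_fn(tuple(mask))
            phi[i] += cur - prev     # marginal contribution of token i
            prev = cur
    return [p / n_samples for p in phi]
```

For a linear "game" the estimate is exact: if the answer probability is `0.1 + 0.5*mask[0] + 0.2*mask[1]`, the returned values are `0.5`, `0.2`, and `0.0` for the third token.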
I hope this clears it up, if not, I am happy to answer further questions.
Yes, that explains it. Thank you for the quick response 👍🏽
Hello,
Thank you for your work!
I am trying to understand how the reported Shapley values were estimated for the VQA/GQA tasks. Here are some specific questions:
- Are the question and answer concatenated as input to the model?
- Is the explained answer the model's argmax prediction, and are the Shapley values computed with respect to its probability?