Closed — lucky9-cyou closed this 4 months ago
Ah, yes. But for GPT the attention is a triangular matrix, so the saliency of the remaining part is also 0, which is why it is written that way. (Writing it as `saliency[class_poss, :class_poss]` is a bit awkward when `class_poss` is a list.)
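A minimal NumPy sketch of the point above (the matrix and positions here are made up for illustration): for a causal, GPT-style model the saliency matrix is lower-triangular, so entries to the right of each position are zero, and summing a full row gives the same result as summing only the prefix up to that position.

```python
import numpy as np

# Hypothetical 6x6 saliency map from a causal (GPT-style) model.
# Causal attention is lower-triangular, so saliency[i, j] == 0 for j > i.
rng = np.random.default_rng(0)
saliency = np.tril(rng.random((6, 6)))

class_poss = [2, 4]  # hypothetical label-word positions

# Full rows, as the original code indexes them: saliency[class_poss, :]
full = saliency[np.array(class_poss), :].sum()

# Only the prefix up to and including each position, i.e. what
# saliency[class_poss, :class_poss] would have to mean per position:
prefix = sum(saliency[p, : p + 1].sum() for p in class_poss)

# Identical, because everything past each position is zero.
assert np.isclose(full, prefix)
```

This is why the full-row slice is harmless here: the two sums only differ if the upper-triangular part is nonzero, which it never is for a causal attention pattern.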
Got it. Thanks for your reply.
Original Code: https://github.com/lancopku/label-words-are-anchors/blob/74b58040999c23285e4a1d221309398f14b0fe57/attention_attr.py#L117
I think it should be
according to the `sum` operation in the following code: https://github.com/lancopku/label-words-are-anchors/blob/74b58040999c23285e4a1d221309398f14b0fe57/attention_attr.py#L123