Open vankhoa21991 opened 3 years ago
Hello,
I tried to visualize the decoder and encoder attention on images, as in the DETR example at https://colab.research.google.com/github/facebookresearch/detr/blob/colab/notebooks/detr_attention.ipynb, but didn't succeed.
Could you provide similar examples?
Thanks
Hi, have you had any success with this model? If so, could you provide your code?
@vankhoa21991 Hi, have you had any success visualizing the decoder and encoder attention on images? If so, could you share your code?
@vankhoa21991 @EllieSeven Hi, have you had any success visualizing the decoder and encoder attention on images? If so, could you share your code?
Actually, I did not finish the visualization, but I made some progress. You can refer to online tutorials on visualizing attention maps in transformers and replace the self-attention formula with the one used in DETR. Note that there are two attention formulas in this paper: self-attention and cross-attention. You can choose which one you need to visualize (I guess it is probably cross-attention).
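For anyone stuck at this step: cross-attention is just softmax(QKᵀ/√d) between the decoder's object queries and the encoder's flattened image features, so each query's attention row can be reshaped back to the feature-map grid and overlaid on the image. Here is a minimal NumPy sketch of that computation; the function name, shapes, and toy inputs are illustrative, not DETR's actual API:

```python
import numpy as np

def cross_attention_map(queries, memory, h, w):
    """Compute softmax(Q K^T / sqrt(d)) and reshape each row to the
    (h, w) encoder feature-map grid.

    queries: (num_queries, d)  decoder object queries
    memory:  (h * w, d)        flattened encoder output (image features)
    """
    d = queries.shape[-1]
    scores = queries @ memory.T / np.sqrt(d)          # (num_queries, h*w)
    scores -= scores.max(axis=-1, keepdims=True)      # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)          # softmax over memory
    return attn.reshape(-1, h, w)                     # one h*w map per query

# toy example: 3 object queries attending to a 4x5 feature map, d = 8
rng = np.random.default_rng(0)
q = rng.standard_normal((3, 8))
mem = rng.standard_normal((4 * 5, 8))
maps = cross_attention_map(q, mem, h=4, w=5)
print(maps.shape)  # (3, 4, 5): one spatial attention map per query
```

In practice you would not recompute this by hand; in the DETR colab linked above, the same weights are captured with a forward hook on the decoder's multi-head attention module and then reshaped to the feature-map size.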
Did you manage to do it? I'm also stuck on the attention visualisation!