rayleizhu / BiFormer

[CVPR 2023] Official code release of our paper "BiFormer: Vision Transformer with Bi-Level Routing Attention"
https://arxiv.org/abs/2303.08810
MIT License

Attention mechanism #13

Closed · PQCgit closed this issue 1 year ago

PQCgit commented 1 year ago

Hello! The author of https://github.com/z1069614715/objectdetection_script/blob/master/cv-attention/Biformer.py copied your code from your repository into that file. Is the Attention module in it a multi-head attention mechanism?
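
For context, a standard ViT-style multi-head self-attention module (similar in spirit to timm's `Attention`) typically looks like the sketch below. This is a generic illustration of what "multi-head attention" means in code; the class name `MultiHeadAttention` and the shapes used are assumptions, not necessarily the exact module in the linked Biformer.py file.

```python
# Minimal sketch of a ViT-style multi-head self-attention block (illustrative only).
import torch
import torch.nn as nn

class MultiHeadAttention(nn.Module):
    def __init__(self, dim, num_heads=8, qkv_bias=False):
        super().__init__()
        assert dim % num_heads == 0, "dim must be divisible by num_heads"
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.scale = self.head_dim ** -0.5
        self.qkv = nn.Linear(dim, dim * 3, bias=qkv_bias)  # fused q, k, v projection
        self.proj = nn.Linear(dim, dim)                     # output projection

    def forward(self, x):
        # x: (batch, num_tokens, dim)
        B, N, C = x.shape
        # Project to q, k, v and reshape to (B, num_heads, N, head_dim)
        qkv = self.qkv(x).reshape(B, N, 3, self.num_heads, self.head_dim).permute(2, 0, 3, 1, 4)
        q, k, v = qkv.unbind(0)
        # Scaled dot-product attention, computed independently per head
        attn = (q @ k.transpose(-2, -1)) * self.scale
        attn = attn.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, N, C)   # merge heads back
        return self.proj(out)

if __name__ == "__main__":
    x = torch.randn(2, 49, 64)                 # 2 samples, 49 tokens, 64-dim embedding
    attn = MultiHeadAttention(dim=64, num_heads=8)
    print(attn(x).shape)                       # torch.Size([2, 49, 64])
```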

rayleizhu commented 1 year ago

> Hello! The author of https://github.com/z1069614715/objectdetection_script/blob/master/cv-attention/Biformer.py copied your code from your repository into that file. Is the Attention module in it a multi-head attention mechanism?