SHI-Labs / Neighborhood-Attention-Transformer
Neighborhood Attention Transformer, arXiv 2022 / CVPR 2023. Dilated Neighborhood Attention Transformer, arXiv 2022.
MIT License · 1.04k stars · 85 forks
Issues
#103 Rule of thumb for dilations value? (rafiibnsultan, opened 2 months ago, 1 comment)
#102 Fix compatibility issue with newer NATTEN releases (alihassanijr, closed 4 months ago, 0 comments)
#101 Issue with Neighborhood Attention Model (NAT) pretrained weights (Miiitiii, closed 4 months ago, 1 comment)
#100 Can you release your training log of NAT? I.e., the summary.csv in the output folder. (Yonghao-Yu, opened 5 months ago, 3 comments)
#99 Some comparisons against Deformable Attention (haiphamcse, closed 4 months ago, 4 comments)
#98 Corrected num_heads in DiNAT-s Large (AdityaKane2001, closed 7 months ago, 1 comment)
#97 Instance segmentation with Mask2Former + DiNAT (AM97Y, closed 4 months ago, 1 comment)
#96 For 3D segmentation (LambdaLi, closed 4 months ago, 2 comments)
#95 How to calculate the number of params? (HanzhouLiu, closed 9 months ago, 1 comment)
#94 mmdetection on COCO2017 does not converge (jamesben6688, closed 10 months ago, 1 comment)
#93 Cannot reproduce the results of Mask2Former + DiNAT-Large on ADE20K (Linwei-Chen, closed 9 months ago, 12 comments)
#92 Training from scratch with different sizes for height and width (mr17m, closed 9 months ago, 3 comments)
#91 NAT-Tiny performance on ImageNet-1k (jamesben6688, closed 9 months ago, 7 comments)
#90 About the receptive field of an image pixel (money6651626, closed 1 year ago, 4 comments)
#89 Should freeze_at be set to 2 to freeze the pretrained weights downloaded from the official website? (aihcyllop, closed 1 year ago, 2 comments)
#88 Is the detection model available? (chenzx2, closed 1 year ago, 2 comments)
#87 Fix DiNAT_s issues in det/seg (alihassanijr, closed 1 year ago, 0 comments)
#86 Is the DiNAT code runnable? (chhkang, closed 1 year ago, 2 comments)
#85 Some problems during training (guoguo1314, closed 1 year ago, 9 comments)
#83 Is the COCO instance segmentation Mask2Former code DiNAT or NAT? (aihcyllop, closed 1 year ago, 1 comment)
#82 Where is natten.py? (SantaFlang, closed 1 year ago, 0 comments)
#81 Is it possible to do upsampling using NAT? (jimmysue, closed 1 year ago, 2 comments)
#80 Update NAT BibTeX -> CVPR (alihassanijr, closed 1 year ago, 0 comments)
#79 Fix very minor spelling error (alexmehta, closed 1 year ago, 1 comment)
#78 Welcome update to OpenMMLab 2.0 (vansin, closed 1 year ago, 1 comment)
#77 How to visualize the attention map? (Amo5, closed 4 months ago, 3 comments)
#76 Update DiNAT chart (alihassanijr, closed 1 year ago, 0 comments)
#75 Updated DiNAT-L (alihassanijr, closed 1 year ago, 0 comments)
#74 ONNX (idanpd, closed 1 year ago, 2 comments)
#73 Update README.md (OZOOOOOH, closed 1 year ago, 0 comments)
#72 HF announcement + minor fixes (alihassanijr, closed 1 year ago, 0 comments)
#71 News update (stevenwalton, closed 1 year ago, 0 comments)
#70 New models and other fixes (alihassanijr, closed 1 year ago, 0 comments)
#69 While running the code, I got this type of problem. Could you please tell me the solution? (Mehulk43, closed 1 year ago, 11 comments)
#68 Fix requirements (alihassanijr, closed 1 year ago, 0 comments)
#67 Motivation for choosing NAT depth (oksanadanilova, closed 1 year ago, 2 comments)
#66 Update checkpoint links (alihassanijr, closed 1 year ago, 0 comments)
#65 Move NATTEN into a separate repository (alihassanijr, closed 1 year ago, 0 comments)
#64 Gradcheck fixes (alihassanijr, closed 1 year ago, 0 comments)
#63 Compile problem (yuhaoliu7456, closed 1 year ago, 2 comments)
#62 Add `dilations` to DET and SEG (alihassanijr, closed 1 year ago, 0 comments)
#61 A question about the rpb in LegacyNeighborhoodAttention2D (lartpang, closed 1 year ago, 5 comments)
#60 Relation to Visual Attention Network (VAN) (MenghaoGuo, closed 1 year ago, 4 comments)
#59 Legacy Torch implementation of Dilated Neighborhood Attention (DiNAT) (iliaschalkidis, closed 1 year ago, 5 comments)
#58 Abbreviation for rpb (qsh-zh, closed 1 year ago, 2 comments)
#57 Doc adjustment (alihassanijr, closed 1 year ago, 0 comments)
#56 DiNAT checkpoints (alihassanijr, closed 1 year ago, 0 comments)
#55 DiNAT release (alihassanijr, closed 1 year ago, 0 comments)
#54 No position encoding? Could you explain your thoughts? (laisimiao, closed 1 year ago, 2 comments)
#53 Is it necessary to write a dedicated fp16 kernel? (rayleizhu, closed 1 year ago, 8 comments)