tengshaofeng / ResidualAttentionNetwork-pytorch
a pytorch code about Residual Attention Network. This code is based on two projects from
681 stars, 166 forks
Issues
#40 Errors when I try to run train.py (puppy2000, opened 2 years ago, 2 comments)
#39 stage 0 (chen-yuu, opened 3 years ago, 0 comments)
#38 transfer learning (chatgptcoderhere, opened 3 years ago, 0 comments)
#37 Errors when I run train.py (KrystalCWT, opened 3 years ago, 2 comments)
#36 Errors when I run train.py (KrystalCWT, closed 3 years ago, 0 comments)
#35 Questions about the performance on ImageNet (jianghaojun, opened 4 years ago, 0 comments)
#34 what's the version of torch, torchvision and python? (3learning12340, opened 4 years ago, 1 comment)
#33 I think the num of params for the cifar10 residual network is incorrect (Prines, opened 4 years ago, 0 comments)
#32 have you ever tested the num of the params (Prines, closed 4 years ago, 0 comments)
#31 During the test, cifar10, the output data structure is incorrect. (xk88372527, opened 4 years ago, 1 comment)
#30 What is the meaning of `softmax` in attention_module.py? (theodoruszq, opened 4 years ago, 0 comments)
#29 An input size question (onlyonewater, closed 4 years ago, 1 comment)
#28 Hi, is there any implementation for visualizing the mask? I'm interested in the mask they showed in the paper, it seems very good (Danie1Hayes, opened 4 years ago, 1 comment)
#27 model = ResidualAttentionModel() error with python3 (carol007, opened 5 years ago, 1 comment)
#26 The error about if __name__ == '__main__': freeze_support() (November666, opened 5 years ago, 1 comment)
#25 Error: Data must be sequence, got float (Jayant1234, closed 5 years ago, 0 comments)
#24 Focus of the attention mask (aezco, opened 5 years ago, 0 comments)
#23 about the code "out_interp = self.interpolation1(out_middle_2r_blocks) + out_down_residual_blocks1" (ZachZou-logs, opened 5 years ago, 4 comments)
#22 Expression of mix attention (fengshenfeilian, opened 5 years ago, 2 comments)
#21 Traceback (most recent call last): File "train.py", line 20, in <module> from model.residual_attention_network import ResidualAttentionModel_92_32input_update as ResidualAttentionModel ImportError: No module named model.residual_attention_network (ZZH2950228, opened 5 years ago, 0 comments)
#20 How to solve the low classification accuracy problem, when overall accuracy is 95%? (13331112522, closed 5 years ago, 3 comments)
#19 model_92_sgd.pkl is pre_trained for cifar10? (jguo16, closed 5 years ago, 1 comment)
#18 model_92_sgd.pkl is pre_trained for cifar10? (jguo16, opened 5 years ago, 1 comment)
#17 Below are my results from running a certain epoch. I would like to ask: why is the classification test accuracy so low? (gden138, closed 5 years ago, 5 comments)
#16 extending it to 3d data (Nd-sole, opened 5 years ago, 0 comments)
#15 Mixed attention, Channel attention and Spatial attention (YANYANYEAH, opened 5 years ago, 4 comments)
#14 Shouldn't you record grad when testing? (PistonY, closed 5 years ago, 6 comments)
#13 What are the differences between the classes in residual_attention_network.py? (marsggbo, closed 5 years ago, 1 comment)
#12 Topic closed (sankin1770, closed 5 years ago, 2 comments)
#11 In the AttentionModule_stage1_cifar function, the original paper's structure does not add out_trunk after upsampling here, right? As follows (sankin1770, closed 6 years ago, 3 comments)
#10 can you provide the pretrained model (cltdevelop, closed 6 years ago, 2 comments)
#9 new() received an invalid combination of arguments (HowardZhang1994, closed 6 years ago, 4 comments)
#8 pretrained network (Xingxu1996, closed 6 years ago, 1 comment)
#7 How to generate the masks given in the paper? (jain-avi, closed 6 years ago, 2 comments)
#6 About multi-label (josianerodrigues, closed 6 years ago, 3 comments)
#5 attention map (Xingxu1996, closed 6 years ago, 1 comment)
#4 Test Accuracy Stagnates (jain-avi, closed 6 years ago, 7 comments)
#3 TypeError (zhaobingbingbing, closed 6 years ago, 1 comment)
#2 ERROR: Unexpected bus error encountered in worker. This might be caused by insufficient shared memory (shm). (aovoc, closed 6 years ago, 1 comment)
#1 It seems that the results reproduced by this code cannot achieve the results in the original paper? (YihangLou, opened 6 years ago, 106 comments)