JaveyWang / Pyramid-Attention-Networks-pytorch
Implementation of Pyramid Attention Networks for Semantic Segmentation.
GNU General Public License v3.0 · 235 stars · 55 forks
Issues
#13 · Positioning of ReLU in FPA · opened 4 years ago by JohnMBrandt · 0 comments
#12 · Shouldn't the kernel size for the upsampling here be 3? · opened 4 years ago by ReckonerInheritor · 0 comments
#11 · Predicted mask is only 1/4 the size of the original image · opened 4 years ago by sparkfax · 0 comments
#10 · FPA tensor size mismatch · opened 4 years ago by sparkfax · 0 comments
#9 · Is there an upsample at the end of the Global Pooling branch in the FPA module? · opened 5 years ago by wjy66 · 1 comment
#8 · Is this implementation correct? · opened 5 years ago by shuuchen · 0 comments
#7 · Missing Color_Classifier and checkpoint loading when running the eval · opened 5 years ago by Kevinduan23 · 0 comments
#6 · Did the classification module help? · opened 5 years ago by chenyzh28 · 18 comments
#5 · The file cls_labels.npy saves the class labels of each image, in a format like {'filename': '1, 0, 1, ..., 0'}. For simplicity, you can use the labels from x_mask; I think the program may look like · opened 5 years ago by chezhizhong · 3 comments
#4 · The file cls_labels.npy saves the class labels of each image, in a format like {'filename': '1, 0, 1, ..., 0'}. For simplicity, you can use the labels from x_mask; I think the program may look like · opened 5 years ago by tong1101 · 20 comments
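The previews above describe cls_labels.npy as a dict mapping each image filename to a multi-hot vector of the classes present in it, derivable from the segmentation mask (x_mask). A minimal sketch of building such a file, assuming VOC-style masks of per-pixel class indices and 21 classes; the function name and toy mask are illustrative, not from the repo:

```python
import numpy as np

NUM_CLASSES = 21  # assumption: PASCAL VOC-style labels (20 classes + background)

def labels_from_mask(mask, num_classes=NUM_CLASSES):
    """Derive a multi-hot class-presence vector from a segmentation mask.

    `mask` is a 2-D integer array of per-pixel class indices, as the
    issue comments suggest deriving the labels from x_mask.
    """
    present = np.zeros(num_classes, dtype=np.float32)
    for cls in np.unique(mask):
        if 0 <= cls < num_classes:  # skip ignore indices such as 255
            present[int(cls)] = 1.0
    return present

# Build the {filename: vector} dict and save it like cls_labels.npy.
masks = {"2007_000032": np.array([[0, 1], [1, 15]])}  # toy stand-in for real masks
cls_labels = {name: labels_from_mask(m) for name, m in masks.items()}
np.save("cls_labels.npy", cls_labels)
```

Note that loading a dict saved this way requires `np.load("cls_labels.npy", allow_pickle=True)`.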
#3 · FPA block · closed 5 years ago by summerrr · 1 comment
#2 · Softmax or Sigmoid for normalizing attention? · closed 5 years ago by John1231983 · 4 comments
#1 · Where can I find the file 'cls_labels.npy'? · closed 5 years ago by Chenfeng1271 · 7 comments