wb-finalking opened this issue 5 years ago
python train.py
Using TensorFlow backend.
train_file model/vgg16_no_top.h5
model_weights train_pair.txt
0 input_1
1 block1_conv1
2 block1_conv2
3 block1_pool
4 dropout_1
5 block2_conv1
6 block2_conv2
7 block2_pool
8 dropout_2
9 block3_conv1
10 block3_conv2
11 block3_conv3
12 block3_pool
13 dropout_3
14 block4_conv1
15 block4_conv2
16 block4_conv3
17 block4_pool
18 dropout_4
19 block5_conv1
20 block5_conv2
21 block5_conv3
22 dropout_5
23 C4_cfe_cfe0
24 C4_cfe_cfe1_dilation
25 C4_cfe_cfe2_dilation
26 C4_cfe_cfe3_dilation
27 C5_cfe_cfe0
28 C5_cfe_cfe1_dilation
29 C5_cfe_cfe2_dilation
30 C5_cfe_cfe3_dilation
31 C3_cfe_cfe0
32 C3_cfe_cfe1_dilation
33 C3_cfe_cfe2_dilation
34 C3_cfe_cfe3_dilation
35 C4_cfeconcatcfe
36 C5_cfeconcatcfe
37 C3_cfeconcatcfe
38 C4_cfe_BN
39 C5_cfe_BN
40 C3_cfe_BN
41 C4_cfe_relu
42 C5_cfe_relu
43 C3_cfe_relu
44 C4_cfe_up2
45 C5_cfe_up4
46 C345_aspp_concat
47 C345_ChannelWiseAttention_withcpfe_GlobalAveragePooling2D
48 dense_1
49 dense_2
50 C345_ChannelWiseAttention_withcpfe_reshape
51 C345_ChannelWiseAttention_withcpfe_repeat
52 C345_ChannelWiseAttention_withcpfe_multiply
53 C345_conv
54 C345_BN
55 C345_relu
56 C345_up4
57 spatial_attention_1_conv1
58 spatial_attention_2_conv1
59 attention1_1_BN
60 attention2_1_BN
61 C2_conv
62 attention1_1_relu
63 attention2_1_relu
64 C1_conv
65 C2_BN_BN
66 spatial_attention_1_conv2
67 spatial_attention_2_conv2
68 C1_BN_BN
69 C2_BN_relu
70 attention1_2_BN
71 attention2_2_BN
72 C1_BN_relu
73 C2_up2
74 attention1_2_relu
75 attention2_2_relu
76 C12_concat
77 spatial_attention_add
78 C12_conv
79 activation_1
80 C12_BN
81 repeat_1
82 C12_relu
83 C12_atten_mutiply
84 fuse_concat
85 sa
Traceback (most recent call last):
File "train.py", line 72, in
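For reference, an index/name listing like the one above is typically produced by iterating over `model.layers` in Keras. This is only a minimal sketch with a hypothetical stand-in model (the real `train.py` builds VGG-16 plus the CFE/attention heads), just to show where such output comes from:

```python
from tensorflow.keras import layers, models

# Tiny stand-in model; the actual script builds a much larger network.
model = models.Sequential([
    layers.Conv2D(4, 3, input_shape=(8, 8, 3), name="block1_conv1"),
    layers.MaxPooling2D(name="block1_pool"),
])

# Print each layer's index and name, matching the log format above.
for i, layer in enumerate(model.layers):
    print(i, layer.name)
```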
I have the same problem as you. Could you give me some help? Thanks! @zwm19950414
I used your 'train.py' to train the model, but I can't reproduce the results reported in your paper. So I suspect this training configuration differs from your final version. Could you point out the difference? Thanks a lot.
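The `train_file` / `model_weights` lines at the top of the log suggest the script restores pretrained VGG-16 weights (`model/vgg16_no_top.h5`) into the larger model. In Keras, a partial restore like that is usually done by layer name. A minimal sketch of the mechanism, using hypothetical small models rather than the repo's actual network:

```python
from tensorflow.keras import layers, models

# Source model with one named layer, standing in for pretrained VGG-16.
src = models.Sequential([layers.Dense(2, input_shape=(3,), name="shared")])
src.save_weights("tmp_weights.h5")

# Larger target model; only the layer whose name matches gets restored.
dst = models.Sequential([
    layers.Dense(2, input_shape=(3,), name="shared"),
    layers.Dense(1, name="head"),
])
dst.load_weights("tmp_weights.h5", by_name=True)
```

With `by_name=True`, layers whose names do not appear in the weight file (here, `head`) keep their fresh initialization, which is the usual way a backbone is loaded under a model with extra attention/decoder layers.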