torrvision / crfasrnn

This repository contains the source code for the semantic image segmentation method described in the ICCV 2015 paper: Conditional Random Fields as Recurrent Neural Networks. http://crfasrnn.torr.vision/

The train net output loss and iteration loss are always same #87

Open dachengxiaocheng opened 7 years ago

dachengxiaocheng commented 7 years ago

Hi all, I am new to CRF-RNN. I trained an FCN8s model on the public material dataset MINC, and then planned to train a CRF-RNN model on top of that FCN8s model. The network architecture is similar to TVG_CRFRNN_COCO_VOC_TRAIN_3_CLASSES.prototxt. Below is my solver.prototxt:

net: "CRFRNN_train_test.prototxt" test_iter: 1798 test_interval: 999999999 display: 100

average_loss: 1798

lr_policy: "fixed" base_lr: 1e-13

momentum: 0.99 iter_size: 1 max_iter: 50000 weight_decay: 0.0005

snapshot: 1000 snapshot_prefix: "./models/" test_initialization: false solver_mode: GPU

During training, the train net output loss and the iteration loss are always the same. This confuses me because I set average_loss: 1798 in solver.prototxt. The network also does not seem to be converging. Below is the training output:

I1204 21:45:52.125540 23363 solver.cpp:242] Iteration 0, loss = 4215.64
I1204 21:45:52.125576 23363 solver.cpp:258]     Train net output #0: loss = 4215.64 (* 1 = 4215.64 loss)
I1204 21:45:52.125587 23363 solver.cpp:571] Iteration 0, lr = 1e-14
I1204 21:50:57.908535 23363 solver.cpp:242] Iteration 100, loss = 3425.36
I1204 21:50:57.908574 23363 solver.cpp:258]     Train net output #0: loss = 3425.36 (* 1 = 3425.36 loss)
I1204 21:50:57.908587 23363 solver.cpp:571] Iteration 100, lr = 1e-14
I1204 21:55:49.283299 23363 solver.cpp:242] Iteration 200, loss = 3252.26
I1204 21:55:49.283339 23363 solver.cpp:258]     Train net output #0: loss = 3252.26 (* 1 = 3252.26 loss)
I1204 21:55:49.283347 23363 solver.cpp:571] Iteration 200, lr = 1e-14
I1204 22:00:58.413457 23363 solver.cpp:242] Iteration 300, loss = 842.061
I1204 22:00:58.413498 23363 solver.cpp:258]     Train net output #0: loss = 842.061 (* 1 = 842.061 loss)
I1204 22:00:58.413512 23363 solver.cpp:571] Iteration 300, lr = 1e-14
I1204 22:05:52.236604 23363 solver.cpp:242] Iteration 400, loss = 5119.25
I1204 22:05:52.236644 23363 solver.cpp:258]     Train net output #0: loss = 5119.25 (* 1 = 5119.25 loss)
I1204 22:05:52.236654 23363 solver.cpp:571] Iteration 400, lr = 1e-14
I1204 22:10:45.105113 23363 solver.cpp:242] Iteration 500, loss = 6157.21
I1204 22:10:45.105150 23363 solver.cpp:258]     Train net output #0: loss = 6157.21 (* 1 = 6157.21 loss)
I1204 22:10:45.105160 23363 solver.cpp:571] Iteration 500, lr = 1e-14
I1204 22:15:35.751629 23363 solver.cpp:242] Iteration 600, loss = 3508
I1204 22:15:35.751672 23363 solver.cpp:258]     Train net output #0: loss = 3508 (* 1 = 3508 loss)
I1204 22:15:35.751685 23363 solver.cpp:571] Iteration 600, lr = 1e-14
I1204 22:20:26.621906 23363 solver.cpp:242] Iteration 700, loss = 10248.3
I1204 22:20:26.621947 23363 solver.cpp:258]     Train net output #0: loss = 10248.3 (* 1 = 10248.3 loss)
I1204 22:20:26.621960 23363 solver.cpp:571] Iteration 700, lr = 1e-14
I1204 22:25:15.441035 23363 solver.cpp:242] Iteration 800, loss = 88.0998
I1204 22:25:15.441072 23363 solver.cpp:258]     Train net output #0: loss = 88.0998 (* 1 = 88.0998 loss)
I1204 22:25:15.441082 23363 solver.cpp:571] Iteration 800, lr = 1e-14
I1204 22:29:59.302548 23363 solver.cpp:242] Iteration 900, loss = 2721.73
I1204 22:29:59.302583 23363 solver.cpp:258]     Train net output #0: loss = 2721.73 (* 1 = 2721.73 loss)
I1204 22:29:59.302593 23363 solver.cpp:571] Iteration 900, lr = 1e-14
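For context, here is a minimal sketch (plain Python, not Caffe's actual implementation) of what average_loss: 1798 is meant to do: the "Iteration N, loss" line should be a running average of recent per-batch losses, while "Train net output #0: loss" is the raw loss of the current batch, so once more than one iteration has been recorded the two printed values would normally differ.

```python
from collections import deque

average_loss = 1798                        # value from the solver.prototxt above
recent_losses = deque(maxlen=average_loss) # keep only the last `average_loss` batch losses

def display(iteration, batch_loss):
    """Print the two loss lines in the same form as the Caffe log above."""
    recent_losses.append(batch_loss)
    smoothed = sum(recent_losses) / len(recent_losses)
    # "Iteration N, loss" is the running average over recent batches ...
    print("Iteration %d, loss = %g" % (iteration, smoothed))
    # ... while "Train net output #0" is the raw loss of the current batch.
    print("Train net output #0: loss = %g (* 1 = %g loss)" % (batch_loss, batch_loss))

# Example: feed in the first three logged batch losses.
for i, loss in enumerate([4215.64, 3425.36, 3252.26]):
    display(i * 100, loss)
```

If the two values are identical at every display step, as in the log above, the smoothing is effectively not taking effect.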

Any ideas about my problem? Many thanks for your help!

Best

dachengxiaocheng commented 7 years ago

Hi all

I have fixed this problem. Thanks!

nk-dev0 commented 7 years ago

What was the fix?

zeng-hello-world commented 6 years ago

Hi @dachengxiaocheng, I have the same problem as you: no matter how large I set lr_mult, the loss is always the same. How did you fix this? Thank you!
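For anyone debugging this symptom, one quick check is whether the weights move at all after a single solver step. This is only a sketch using the pycaffe interface; it assumes the Caffe build has Python bindings, and 'score-fr' is a placeholder layer name, not one taken from the actual prototxt.

```python
import numpy as np
import caffe

caffe.set_mode_gpu()
solver = caffe.SGDSolver('solver.prototxt')       # the solver shown above

layer = 'score-fr'                                 # placeholder: pick any learnable layer
before = solver.net.params[layer][0].data.copy()   # weight blob before the update

solver.step(1)                                     # one forward/backward/update pass

after = solver.net.params[layer][0].data
print('max weight change:', np.abs(after - before).max())
# Exactly 0 means no update reached this layer (lr_mult of 0, frozen params,
# or an effective learning rate small enough to underflow).
```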