douxiao opened this issue 6 years ago
What about the performance? You can try a gradually decreasing learning rate.
How should the code be changed to reduce the learning rate? I'm also trying to reduce it, but I don't know what to modify. Is it in the train.py file? Which part should be changed?
See the latest code; keras.callbacks.ReduceLROnPlateau is used.
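For reference, a minimal, self-contained sketch of wiring that callback into a Keras model. The toy model, data, and the factor/patience values here are illustrative assumptions; check the repo's train.py for the exact settings it uses:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.callbacks import ReduceLROnPlateau

# Toy model standing in for the YOLO model; only the callback wiring matters.
model = Sequential([Dense(1, input_shape=(4,))])
model.compile(optimizer='adam', loss='mse')

# Multiply the learning rate by `factor` whenever val_loss has not
# improved for `patience` consecutive epochs.
reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.1,
                              patience=3, verbose=1)

x = np.random.rand(64, 4)
y = np.random.rand(64, 1)
model.fit(x, y, validation_split=0.25, epochs=20,
          callbacks=[reduce_lr])
```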
Hi, I was training on my own data, and after 50 epochs, during the unfreeze stage, the GPU ran out of memory. How can I avoid this problem? And is it OK if I just skip this step? @qqwweee
@franklu323 Set a smaller batch_size.
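For context, a hedged sketch of where the batch size matters in the repo's unfreeze stage. The function and variable names (data_generator_wrapper, lines, num_train, num_val, etc.) follow the layout of train.py, but the batch size value here is an illustrative assumption, not the repo's setting:

```python
from keras.optimizers import Adam

# Second training stage in train.py: unfreeze the whole body and fine-tune.
for layer in model.layers:
    layer.trainable = True
model.compile(optimizer=Adam(lr=1e-4),
              loss={'yolo_loss': lambda y_true, y_pred: y_pred})

# Unfreezing all layers greatly increases GPU memory use, so drop the
# batch size (e.g. from 32) until the out-of-memory error goes away.
batch_size = 4
model.fit_generator(
    data_generator_wrapper(lines[:num_train], batch_size,
                           input_shape, anchors, num_classes),
    steps_per_epoch=max(1, num_train // batch_size),
    validation_data=data_generator_wrapper(lines[num_train:], batch_size,
                                           input_shape, anchors, num_classes),
    validation_steps=max(1, num_val // batch_size),
    epochs=100, initial_epoch=50)
```

The reason the first stage fits but the second does not: with most layers frozen, gradients and optimizer state are only kept for the small trainable head, while unfreezing the whole body requires them for every layer.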
Hi, I want to know which settings you changed when training on your own data. Just the paths and yolo3.cfg?
Hello, I used five categories from the VOC2012 dataset, about 8700 images, and trained for 100 epochs from scratch without the pre-trained weights; the final loss was about 14. But the test results are not ideal: some test images don't even get a box. I want to ask what the situation is. Should I continue training to reduce the loss?
Did you get good results after the changes?
Could we discuss this over WeChat? My ID is lyb864770486.
Hello, the code you wrote is very good. When I trained my own dataset, the following problem occurred. My dataset has only one class with 1000 images, and I trained from the pre-trained weights yolo_weights.h5. The final loss always stalls at about 14 and doesn't fall. Do you have any good suggestions for me? Many thanks!
1/50 [..............................] - ETA: 26s - loss: 14.6514
2/50 [>.............................] - ETA: 25s - loss: 14.7824
3/50 [>.............................] - ETA: 25s - loss: 14.8408
4/50 [=>............................] - ETA: 25s - loss: 14.7723
5/50 [==>...........................] - ETA: 24s - loss: 14.8877
6/50 [==>...........................] - ETA: 24s - loss: 14.7377
...
49/50 [============================>.] - ETA: 1s - loss: 14.8550
50/50 [==============================] - 61s 1s/step - loss: 14.8404 - val_loss: 14.7532
Epoch 00034: early stopping