Open CODE-SUBMIT opened 4 years ago
Something seems strange with the learning rate scheduling, as the LR drops to 1e-5 pretty early while I get better results from continuing to train at 1e-4.
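For anyone hitting the same thing: one way to test this is to override the schedule so the 1e-4 → 1e-5 drop happens later. The snippet below is only a sketch of a generic step-decay rule (the repo's actual scheduler and milestone values are unknown to me; `drop_epoch=80` is a made-up placeholder). In PyTorch the equivalent would be adjusting the `milestones` of `torch.optim.lr_scheduler.MultiStepLR`.

```python
def step_decay_lr(epoch, base_lr=1e-4, drop_epoch=80, gamma=0.1):
    """Hold base_lr until drop_epoch, then multiply once by gamma.

    drop_epoch is a hypothetical value -- tune it so the decay
    happens after the model has stopped improving at 1e-4.
    """
    return base_lr * (gamma if epoch >= drop_epoch else 1.0)

# The LR stays at 1e-4 through epoch 79, then drops to 1e-5.
for epoch in (0, 79, 80):
    print(epoch, step_decay_lr(epoch))
```

If the later drop fixes your results, the default milestone in the config is probably just too aggressive for your dataset.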
Hello, I tried the default settings and also tried not dropping the LR too early, but my results are still bad. What results did you get? Can you share your evaluation and training logs?
I ran the code, but I cannot reproduce the results from the paper. Is there anything else I can do to reproduce them?