Closed gogyzzz closed 5 years ago
It just depends on your dataset. You can add a placeholder in model.py to feed the learning rate into, and once you have run 36 epochs, change the value you pass in sess.run.
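A minimal sketch of what that suggestion looks like. The decay factor and initial rate below are placeholders (check the paper for the actual values), and the `tf.placeholder` / `feed_dict` usage shown in the comments is the standard TF1-style pattern, not code from this repo:

```python
# Epoch-based step decay: drop the learning rate every 36 epochs.
# initial_lr and decay_factor are illustrative values, not the paper's.
def decayed_lr(epoch, initial_lr=0.001, decay_factor=0.5, decay_every=36):
    """Return the learning rate for the given epoch."""
    return initial_lr * (decay_factor ** (epoch // decay_every))

# In a TF1-style model you would declare
#     lr = tf.placeholder(tf.float32, shape=[])
# build the optimizer with it (e.g. tf.train.AdamOptimizer(lr)),
# and feed the scheduled value each step:
#     sess.run(train_op, feed_dict={lr: decayed_lr(epoch)})
for epoch in (0, 35, 36, 72):
    print(epoch, decayed_lr(epoch))
```

Because the schedule is keyed on epochs rather than iterations, this reproduces the paper's "decay every 36 epochs" behavior regardless of how many iterations one epoch contains.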
The project is not designed to reproduce the original paper; it is for reference only. I hope you understand.
@taylorlu Thank you!
In the original paper, the learning rate is decayed every 36 epochs.
But in your code, the decay step is every 5000 iterations.
How can I make the configuration the same as in the original paper?
Thank you!