pacorikos opened this issue 3 years ago
This is a two-fold question. First, why are two optimizers used? One is used on every iteration and the other only after 5,000 iterations. Second, does the AdamOptimizer need to be defined in the init method? I would like to implement a learning schedule by initializing a new RMSProp optimizer on every iteration, but I am not sure how I would do that with their setup.
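For context, here is a minimal sketch of one pattern that matches this description: a hypothetical PyTorch setup in which training steps with Adam for the first 5,000 iterations and a second optimizer afterwards. The repo's actual code is not shown in this thread, so the framework, the model, and the switch-over logic here are all assumptions, not the repo's method.

```python
import torch
from torch import nn

# Hypothetical stand-in for whatever model the repo trains.
model = nn.Linear(10, 1)

# Two optimizers over the same parameters; which one steps
# depends on the current iteration.
adam = torch.optim.Adam(model.parameters(), lr=1e-3)
rmsprop = torch.optim.RMSprop(model.parameters(), lr=1e-4)

for step in range(10_000):
    # One illustrative reading of "one every iteration, the other
    # after the 5,000 iterations": switch optimizers at step 5,000.
    optimizer = adam if step < 5_000 else rmsprop
    optimizer.zero_grad()
    loss = model(torch.randn(32, 10)).pow(2).mean()  # dummy loss
    loss.backward()
    optimizer.step()
```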
For your first question, you can refer to my response to another, similar query.

For the second, you can implement a learning schedule with the optimizer by following the example here. I don't think you need to re-initialize the optimizer on every iteration when you use the scheduler that is provided with the module.
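To make that concrete, here is a minimal sketch of the scheduler approach, again assuming a PyTorch-style setup (the model and hyperparameters are placeholders): the optimizer is created once, and `torch.optim.lr_scheduler` adjusts its learning rate in place, so no per-iteration re-initialization is needed.

```python
import torch
from torch import nn

model = nn.Linear(10, 1)  # hypothetical model

# Create the optimizer once, outside the training loop.
optimizer = torch.optim.RMSprop(model.parameters(), lr=1e-3)

# The built-in scheduler decays the learning rate in place,
# so there is no need to build a new optimizer each iteration.
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.999)

for step in range(10_000):
    optimizer.zero_grad()
    loss = model(torch.randn(32, 10)).pow(2).mean()  # dummy loss
    loss.backward()
    optimizer.step()
    scheduler.step()  # applies the schedule after each optimizer step
```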