ismaeIfm closed this issue 6 years ago
Same issue with lr_decay and clip_gradients. I would suggest accepting only an optimizer object as a parameter, the same way Keras handles it.
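For reference, here is a minimal sketch of the Keras convention being suggested: the learning rate, decay, and gradient clipping are all configured on the optimizer instance, so the training API only needs a single `optimizer` argument (the model below is just a placeholder).

```python
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import RMSprop

# Learning-rate decay and gradient clipping live on the optimizer itself,
# not as separate training parameters.
optimizer = RMSprop(lr=0.001, decay=1e-4, clipnorm=5.0)

# Placeholder model, only to show the compile call.
model = Sequential([Dense(10, input_shape=(20,), activation='softmax')])
model.compile(optimizer=optimizer, loss='categorical_crossentropy')
```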
In anaGo 1.0.0, this problem is solved. Thanks!
Example:

```python
import anago

model = anago.Sequence(optimizer='rmsprop')
model.fit(x_train, y_train)
```
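If `Sequence` forwards the `optimizer` argument straight to Keras's `model.compile` (an assumption, not confirmed here), then a fully configured optimizer instance should also work, which would cover the lr_decay and clip_gradients case mentioned above:

```python
import anago
from keras.optimizers import RMSprop

# Assumption: Sequence passes `optimizer` through to model.compile unchanged,
# so an optimizer instance carrying decay/clipping settings works like a string name.
optimizer = RMSprop(lr=0.001, decay=1e-4, clipnorm=5.0)
model = anago.Sequence(optimizer=optimizer)
model.fit(x_train, y_train)
```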
If you define a custom optimizer in a training_config and use it with the Trainer class, the optimizer is never used, because Adam is hard-coded as the optimizer: https://github.com/Hironsan/anago/blob/master/anago/trainer.py#L38
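A minimal sketch of the problem (hypothetical names, not the actual anaGo code): the config carries an optimizer, but `compile` ignores it and instantiates Adam, so the fix is to forward the configured optimizer instead.

```python
from keras.optimizers import Adam

class Trainer:
    def __init__(self, model, training_config):
        self.model = model
        self.training_config = training_config  # hypothetical config object

    def train(self, x_train, y_train):
        # Buggy pattern: the optimizer from training_config is ignored
        # because Adam is instantiated unconditionally.
        self.model.compile(loss='categorical_crossentropy',
                           optimizer=Adam(lr=self.training_config.learning_rate))

        # Suggested pattern: forward whatever optimizer the config provides.
        # self.model.compile(loss='categorical_crossentropy',
        #                    optimizer=self.training_config.optimizer)

        self.model.fit(x_train, y_train)
```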