Closed: iperov closed this 5 years ago
@iperov Hello, thanks for your PR! However, I don't think this should be a keras-contrib feature. A fix for the Adam optimizer should be directly integrated into Keras to avoid duplication between Keras and keras-contrib. Please consider opening a PR on keras-team/keras!
> A fix for the Adam optimizer should be directly integrated into Keras
I don't think so, because Keras is a standard that other frameworks should implement. Theano, plaidML, and the other backends cannot implement placing tensors on the CPU and operating on them there.
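For context, the TensorFlow-specific capability at issue is explicit device placement, which the backend-agnostic Keras API cannot assume. A minimal illustration:

```python
import tensorflow as tf

# TensorFlow lets graph construction pin tensors to a device explicitly.
# A variable created in this scope lives in host RAM, not VRAM; ops built
# outside the scope still run on the GPU.
with tf.device('/cpu:0'):
    m = tf.Variable(tf.zeros([1024, 1024]), name='adam_m')
```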
Close then.
Adam is already in Keras here. Keras-contrib is just an extension of Keras, meant to test new features before eventually integrating them into Keras.
- What I did
Added a 100% original Adam optimizer with a new `tf_cpu_mode` option.
Batch size is a very important parameter for GAN networks. So by moving the optimizer's weights out of VRAM, we can train with a larger batch size, sacrificing 10–20% of the time per iteration.
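A minimal sketch of the idea, assuming a TF1-era Keras Adam in graph mode. The `AdamCPU` subclass is illustrative rather than the PR's actual code; only the `tf_cpu_mode` option comes from this PR, and decay/amsgrad handling is omitted for brevity:

```python
import tensorflow as tf
from keras import backend as K
from keras.optimizers import Adam


class AdamCPU(Adam):
    """Illustrative subclass; the actual PR patches Adam itself."""

    def __init__(self, *args, tf_cpu_mode=0, **kwargs):
        super().__init__(*args, **kwargs)
        self.tf_cpu_mode = tf_cpu_mode

    def get_updates(self, loss, params):
        if not self.tf_cpu_mode:
            return super().get_updates(loss, params)
        grads = self.get_gradients(loss, params)  # backward pass stays on the GPU
        self.updates = [K.update_add(self.iterations, 1)]
        t = K.cast(self.iterations, K.floatx()) + 1
        lr_t = self.lr * (K.sqrt(1. - K.pow(self.beta_2, t)) /
                          (1. - K.pow(self.beta_1, t)))
        # The moment accumulators -- roughly 2x the model's parameter
        # memory -- are created in host RAM instead of VRAM.
        with tf.device('/cpu:0'):
            ms = [K.zeros(K.int_shape(p), dtype=K.dtype(p)) for p in params]
            vs = [K.zeros(K.int_shape(p), dtype=K.dtype(p)) for p in params]
        self.weights = [self.iterations] + ms + vs
        for p, g, m, v in zip(params, grads, ms, vs):
            m_t = (self.beta_1 * m) + (1. - self.beta_1) * g
            v_t = (self.beta_2 * v) + (1. - self.beta_2) * K.square(g)
            p_t = p - lr_t * m_t / (K.sqrt(v_t) + self.epsilon)
            # Assign ops are colocated with their variables, so updating
            # m and v incurs host<->device copies each step -- the
            # 10-20% per-iteration cost mentioned above.
            self.updates.append(K.update(m, m_t))
            self.updates.append(K.update(v, v_t))
            self.updates.append(K.update(p, p_t))
        return self.updates
```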
- How I did it
Discovered accidentally.
- How you can verify it
Add `tf_cpu_mode=1` to `Adam` and try a 2× bigger network.
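For example, using the hypothetical `AdamCPU` sketch above (the actual PR adds `tf_cpu_mode` to Adam itself, so the class name here is an assumption):

```python
from keras.models import Sequential
from keras.layers import Dense

# Placeholder model; in practice, use a network ~2x bigger than what
# fits in VRAM with stock Adam.
model = Sequential([Dense(4096, activation='relu', input_shape=(4096,)),
                    Dense(4096)])

# Same call as stock Adam plus the new flag: optimizer state goes to host RAM.
model.compile(optimizer=AdamCPU(lr=1e-4, tf_cpu_mode=1), loss='mse')
```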