dmlc / keras

Deep Learning library for Python. Convnets, recurrent neural networks, and more. Runs on MXNet, Theano or TensorFlow.
http://keras.io/

K.gradients: NotImplementedError #86

Open namp opened 7 years ago

namp commented 7 years ago

OK, I'm looking at mxnet_backend.py, and especially at how gradients are calculated, since I'm developing a custom optimizer. However, K.gradients does not implement any kind of call into mx:

def gradients(loss, variables):
    """Returns the gradients of `variables` (list of tensor variables)
    with regard to `loss`.
    """
    raise NotImplementedError
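While the backend stub raises, one generic way to sanity-check a custom optimizer is a central finite-difference estimate of the gradient. This is a backend-agnostic numerical sketch, not Keras or MXNet API; `loss_fn` and the step size `eps` are illustrative assumptions:

```python
# Central finite differences: estimate d(loss)/d(param) numerically
# when a symbolic gradients() is unavailable. Purely illustrative --
# loss_fn here is a plain Python callable, not a Keras tensor op.

def numeric_gradient(loss_fn, params, eps=1e-5):
    """Estimate d loss_fn / d params[i] for a list of scalar params."""
    grads = []
    for i in range(len(params)):
        bumped_up = list(params)
        bumped_dn = list(params)
        bumped_up[i] += eps
        bumped_dn[i] -= eps
        # (f(x + eps) - f(x - eps)) / (2 * eps)
        grads.append((loss_fn(bumped_up) - loss_fn(bumped_dn)) / (2 * eps))
    return grads

# Example: loss = (w*x - y)^2 with x = 3, y = 5; d loss/d w = 2*(w*x - y)*x
loss = lambda p: (p[0] * 3.0 - 5.0) ** 2
print(numeric_gradient(loss, [2.0]))  # close to [6.0]
```

This only checks values, of course; it doesn't build the symbolic gradient graph a real backend implementation would return.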

So my question is, how are gradients actually calculated, say in a call like:

grads = K.gradients(loss, params)

in my optimizer?
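For context, what `gradients(loss, variables)` is expected to compute is reverse-mode automatic differentiation: run the graph forward, then propagate a gradient backward through the chain rule, accumulating into each variable. Here is a minimal pure-Python toy of that pattern; it is a conceptual sketch only, not the mxnet backend's actual mechanism:

```python
# Toy reverse-mode autodiff: a minimal model of what a backend's
# gradients(loss, variables) computes. Illustration only -- not the
# mxnet backend's code.

class Var:
    def __init__(self, value, parents=()):
        self.value = value        # forward value
        self.grad = 0.0           # accumulated d(loss)/d(self)
        self.parents = parents    # (parent Var, local derivative) pairs

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def __sub__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value - other.value, [(self, 1.0), (other, -1.0)])

    def backward(self, upstream=1.0):
        # Chain rule: push the upstream gradient to each parent,
        # scaled by the local partial derivative.
        self.grad += upstream
        for parent, local in self.parents:
            parent.backward(upstream * local)

# loss = (w*x - y)^2 with x = 3, y = 5; d loss/d w = 2*(w*x - y)*x
w = Var(2.0)
loss = (w * 3.0 - Var(5.0)) * (w * 3.0 - Var(5.0))
loss.backward()
print(w.grad)  # → 6.0
```

A symbolic framework does the same bookkeeping on a computation graph and hands the resulting gradient expressions back as the list that `grads` holds.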

I mean, how the heck does SGD even work with the mx backend?

Thanks