System information
Have I written custom code (as opposed to using example directory): yes
OS Platform and Distribution: Linux Ubuntu 18.04.2
TensorFlow backend (yes / no): yes
TensorFlow version: 1.14.0
Keras version: 2.2.4
Python version: 3.7.4
CUDA/cuDNN version: -
GPU model and memory: -
Current behavior
Memory leaks when repeatedly evaluating the gradient of the output of an LSTM model with respect to its input.
Expected behavior
No memory leak: memory usage should stay flat across repeated evaluations of the same gradient function.
Code to reproduce the issue
import numpy as np
import keras
import keras.backend as K

def min_leaking_function(model, sample):
    # The gradient op and the backend function are built once, outside the loop,
    # yet memory still grows with every call to func below.
    grads = K.gradients(model.output, model.input)[0]
    func = K.function([model.input], [grads])
    for i in range(10000):
        print(i)
        func([sample])
nn = keras.models.Sequential()
nn.add(keras.layers.LSTM(20, input_shape=(400, 4)))
nn.add(keras.layers.Dense(5))
nn.compile(optimizer='adam', loss='mse')  # Doesn't matter; training is never run
some_sample = np.random.rand(1, 400, 4)
min_leaking_function(nn, some_sample)
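As an aside, the growth on the Python heap can be quantified with the standard library's tracemalloc, independent of Keras. The sketch below is only an illustration of the measurement pattern (the `leaky`/`clean` functions are stand-ins, not the Keras code above): it warms a callable up once, then reports net allocation growth across repeated calls, which is how one can distinguish a per-call leak from one-time setup cost.

```python
import tracemalloc

def measure_growth(fn, calls=100):
    """Net Python-heap growth (bytes) across repeated calls to fn."""
    tracemalloc.start()
    fn()  # warm-up call, so one-time setup allocations are excluded
    before, _ = tracemalloc.get_traced_memory()
    for _ in range(calls):
        fn()
    after, _ = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return after - before

# Stand-in workloads: a leaky function retains data across calls,
# a clean one only makes temporary allocations that are freed.
leak_store = []

def leaky():
    leak_store.append([0] * 1000)

def clean():
    _ = [0] * 1000

print(measure_growth(leaky) > measure_growth(clean))  # → True
```

Note that tracemalloc only sees Python-level allocations; memory held inside the TensorFlow C++ runtime (as is likely here) would instead show up in the process RSS rather than in these counters.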