Closed arvoelke closed 4 years ago
If you use `h = nengo_dl.Layer(tf.keras.layers.LSTM(units=128))(inp, shape_in=(n_steps, d))`, that should work (where `n_steps` is the number of timesteps in your data and `d` is the dimensionality of the data on each timestep).
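To make the `shape_in` argument concrete, here is a numpy-only sketch (it does not use nengo_dl itself, and the sizes are hypothetical): a `nengo.Node` presents its value as a flat vector of length `n_steps * d`, and `shape_in=(n_steps, d)` tells the `nengo_dl.Layer` wrapper how to fold that vector back into a sequence before handing it to the Keras LSTM.

```python
import numpy as np

# Hypothetical sizes for illustration (not taken from the issue)
n_steps, d = 28, 28

# A nengo.Node carries a flat vector of length n_steps * d
flat = np.arange(n_steps * d, dtype=np.float64)

# shape_in=(n_steps, d) corresponds to this reshape: the flat vector
# becomes a (timesteps, features) sequence for the LSTM
seq = flat.reshape(n_steps, d)

assert seq.shape == (n_steps, d)
```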
Thanks. Got this to work with the following code:

```python
with nengo.Network(seed=seed) as net:
    nengo_dl.configure_settings(
        trainable=None, stateful=False, keep_history=False,
    )
    inp = nengo.Node(np.zeros(np.prod(train_images.shape[1:])))
    h = nengo_dl.Layer(tf.keras.layers.LSTM(units=128))(
        inp, shape_in=(train_images.shape[1], 1))
    out = nengo_dl.Layer(tf.keras.layers.Dense(units=10))(h)
    p = nengo.Probe(out)
```

and using `train_images.reshape((train_images.shape[0], 1, -1))` in place of `train_images` where it is passed to `sim`. Note this passes the entire sequence in one step, rather than the usual Nengo approach of iterating across each time-step in the simulation.
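The effect of that reshape can be sketched with numpy alone (the array here is a hypothetical stand-in for the data in the LMU example, not the actual dataset): the whole `(n_steps, d)` sequence is flattened into a single simulation timestep, which the wrapped LSTM then unfolds internally via `shape_in`.

```python
import numpy as np

# Hypothetical stand-in for the (n_examples, n_steps, d) training data
train_images = np.zeros((100, 784, 1))

# Collapse each example's whole sequence into one simulation timestep:
# (n_examples, n_steps, d) -> (n_examples, 1, n_steps * d)
reshaped = train_images.reshape((train_images.shape[0], 1, -1))

assert reshaped.shape == (100, 1, 784)
```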
Marking this as resolved since the fix above works!
Steps to reproduce: open `docs/examples/lmu.ipynb` and replace the network definition with the following:
I have also tried adding `unroll=True` to the LSTM, and/or configuring `stateful=True` and/or `keep_history=True` under `nengo_dl.configure_settings`.