Closed · sachinruk closed this issue 6 years ago
This is a cross post from: https://stackoverflow.com/questions/45582185/getting-state-of-predictions-in-lstms
I have since tried to redo the model as:
```python
batch_size = 64
model = Sequential()
model.add(Embedding(len_vocab, 64, batch_size=batch_size))
model.add(LSTM(256, return_sequences=True, stateful=True))
model.add(TimeDistributed(Dense(len_vocab, activation='softmax')))
model.compile(loss='sparse_categorical_crossentropy', optimizer='adam')
model.summary()
```
However, now I cannot predict one letter at a time, because the stateful model expects every input to arrive in batches of exactly `batch_size`.
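A common workaround (not from this thread, so treat it as a sketch) is to build a second model with the same architecture but `batch_size=1`, copy the trained weights across with `get_weights()`/`set_weights()`, and use that copy for letter-by-letter prediction. The `len_vocab = 30` below is a placeholder; in practice you would train `train_model` first:

```python
import numpy as np
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, TimeDistributed, Dense

len_vocab = 30  # hypothetical vocabulary size, for illustration only

def build_model(batch_size):
    # Same architecture as in the issue, but the batch size is a parameter.
    model = Sequential()
    model.add(Input(batch_shape=(batch_size, None)))  # stateful LSTMs need a fixed batch size
    model.add(Embedding(len_vocab, 64))
    model.add(LSTM(256, return_sequences=True, stateful=True))
    model.add(TimeDistributed(Dense(len_vocab, activation='softmax')))
    model.compile(loss='sparse_categorical_crossentropy', optimizer='adam')
    return model

train_model = build_model(64)  # train with this one
pred_model = build_model(1)    # predict one sample at a time with this one
pred_model.set_weights(train_model.get_weights())  # weights don't depend on batch size

# Feed one letter at a time; the stateful LSTM carries its state across calls.
x = np.array([[5]])                          # a single token, shape (1, 1)
probs = pred_model.predict(x, batch_size=1)  # shape (1, 1, len_vocab)

# Between independent sequences, clear the recurrent state:
for layer in pred_model.layers:
    if hasattr(layer, "reset_states"):
        layer.reset_states()
    elif hasattr(layer, "reset_state"):
        layer.reset_state()
```

This works because the layer weights are independent of the batch size; only the recurrent state buffers are batch-shaped, and those are rebuilt when the second model is constructed.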
This issue has been automatically marked as stale because it has not had recent activity. It will be closed after 30 days if no further activity occurs, but feel free to re-open a closed issue if needed.