-
I've run into a bug: if I have a base layer `state` and want to use it to initialize an LSTM that is used within the recurrent beam search, the base layer `state` is not wrapped by an `ExtendWithBeamL…
-
### Describe the issue
I am attaching an LSTM I have trained. Its `hidden_output` and `cell_output` do not match between the web (WASM) and node runtimes, and to the extent that I can reproduce them…
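To quantify how far apart the two runtimes' states actually are, a minimal NumPy sketch may help (the arrays and tolerance here are made up for illustration; substitute the real `hidden_output` / `cell_output` dumps from the web and Node runs):

```python
import numpy as np

def max_abs_diff(a, b):
    # Largest element-wise absolute difference between two state dumps.
    return float(np.max(np.abs(np.asarray(a) - np.asarray(b))))

# Hypothetical dumps: the second simulates small float32-level drift.
web_hidden = np.array([0.1234567, -0.7654321])
node_hidden = web_hidden + 1e-6

print(max_abs_diff(web_hidden, node_hidden))
print(np.allclose(web_hidden, node_hidden, atol=1e-5))
```

If `np.allclose` passes at a float32-appropriate tolerance, the mismatch is likely just accumulation-order differences between backends rather than a real bug.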
-
Hey, I'm doing the basic step of converting an ONNX model to a Keras LSTM model.
```
# Returns a compiled model identical to the previous one
onnx_model = onnx.load("stock_price.onnx")
k_model_onnx = onnx_to_keras(onnx_model, 'lstm_inp…
-
LSTM example
-
I'm getting this error:
```
'''
A Recurrent Neural Network (LSTM) implementation example using TensorFlow library.
This example is using the MNIST database of handwritten digits (http://yann.lecun.com…
-
-
When I used `net = tflearn.lstm(net, 128)`, it worked fine. When I changed it to `net = tflearn.gru(net, 128)`, I got the error below:
File "C:\Anaconda3\lib\site-packages\tflearn\layers\recurrent.py", line 294, in …
-
Here is a file that reproduces the problem. The code is copied from the TextAnalysis package and slightly altered for Flux 10; the version on GitHub works with Flux 9.
`#=
This code is copied f…
-
Hello,
If I want to train this model without the char features, what should I do?
I want to compare the model trained with char features against the model trained without them.
-
Hi,
Is there a GPyTorch implementation of this [paper](https://github.com/alshedivat/keras-gp)? If there is a similar script in the examples, could you please direct me to it?
Thanks