-
Is there a way to train a bidirectional RNN (such as an LSTM or GRU) in trax nowadays?
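If it is not built in, one option is to compose it from the unidirectional layers. Below is a minimal sketch assuming trax's `tl.Serial`/`tl.Branch`/`tl.Concatenate` combinators and `tl.Fn`; the `Bidirectional` helper name is my own, not a trax API:
```python
import jax.numpy as jnp
import trax.layers as tl

def Bidirectional(n_units):
    """Hypothetical helper (not a trax built-in): run one LSTM left-to-right
    and another right-to-left over the time axis, then concatenate outputs."""
    def flip(name):
        # reverse the time axis of a (batch, time, features) tensor
        return tl.Fn(name, lambda x: jnp.flip(x, axis=1))
    return tl.Serial(
        tl.Branch(
            tl.LSTM(n_units),                                           # forward pass
            tl.Serial(flip('Flip'), tl.LSTM(n_units), flip('UnFlip')),  # backward pass
        ),
        tl.Concatenate(axis=-1),  # output shape: (batch, time, 2 * n_units)
    )
```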
-
There should be an option to add a bidirectional recurrent neural network using the three core RNN cells.
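For comparison, PyTorch already exposes this as a constructor flag on its three core recurrent layers, `nn.RNN`, `nn.LSTM`, and `nn.GRU` (shapes below are made up for illustration):
```python
import torch
import torch.nn as nn

# nn.RNN, nn.LSTM, and nn.GRU all accept the same bidirectional flag
gru = nn.GRU(input_size=10, hidden_size=20, num_layers=1, bidirectional=True)

x = torch.randn(5, 3, 10)   # (seq_len, batch, input_size)
output, h_n = gru(x)
print(output.shape)         # torch.Size([5, 3, 40]): directions concatenated
print(h_n.shape)            # torch.Size([2, 3, 20]): one final state per direction
```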
-
Hi @JonathanRaiman
Your rnn package looks clean and easy to understand, and I am going to test it out, but I have a question: do you intend to implement a bidirectional RNN?
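In case it helps, here is a minimal framework-agnostic sketch of how such a wrapper could compose two unidirectional RNNs; the `bidirectional` helper and the callable signatures are my own assumptions, not part of the package:
```python
import numpy as np

def bidirectional(rnn_fwd, rnn_bwd, xs):
    """Compose two unidirectional RNNs into a bidirectional one.

    rnn_fwd / rnn_bwd: callables mapping a (time, batch, features) array to a
    (time, batch, hidden) array of per-step outputs.
    """
    h_fwd = rnn_fwd(xs)              # left-to-right pass
    h_bwd = rnn_bwd(xs[::-1])[::-1]  # right-to-left pass, re-aligned to forward time
    return np.concatenate([h_fwd, h_bwd], axis=-1)
```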
-
### 📚 The doc issue
Problematic link: https://pytorch.org/docs/stable/generated/torch.nn.RNN.html
Issue: Right below the class header information, there is a code snippet, starting with the following…
-
Follow-up to #4915.
We should document the exact memory layout of initial states, final states, and outputs in the case of bidirectional RNNs. It is not obvious how tensors from the two directions are…
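Assuming this refers to PyTorch's bidirectional RNNs, the layout can be checked empirically: both `output` and `h_n` can be viewed with an explicit direction axis (forward = 0, backward = 1). A small sketch with made-up shapes:
```python
import torch
import torch.nn as nn

seq_len, batch, input_size, hidden_size, num_layers = 7, 3, 5, 4, 2
lstm = nn.LSTM(input_size, hidden_size, num_layers, bidirectional=True)

x = torch.randn(seq_len, batch, input_size)
output, (h_n, c_n) = lstm(x)

# output: (seq_len, batch, 2 * hidden_size), directions concatenated on the
# last dim; split them out with an explicit direction axis
fwd, bwd = output.view(seq_len, batch, 2, hidden_size).unbind(dim=2)

# h_n: (num_layers * 2, batch, hidden_size); layers are the slower dimension
h_n = h_n.view(num_layers, 2, batch, hidden_size)

# the forward direction's final state is its last time step, the backward
# direction's final state is its first time step (top layer shown)
assert torch.allclose(h_n[-1, 0], fwd[-1])
assert torch.allclose(h_n[-1, 1], bwd[0])
```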
-
I am following the example set in the README, and I am getting the following error. Is this familiar to anyone?
```
In [1]: from pase.models.frontend import wf_builder …
```
-
It doesn't seem to be supported yet. I think contributions are welcome.
-
Hopefully this is not a stupid question. In the TensorFlow tutorial, the states of the RNN are updated after each time step and used as the states for the next time step:
output, state = lstm(current_batch_of_wo…
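For anyone reading along, the state-carrying pattern looks roughly like this in TF 2.x Keras; this is a sketch with made-up shapes, initializing the `[h, c]` state pair manually rather than through any tutorial helper:
```python
import tensorflow as tf

batch, seq_len, features, units = 32, 10, 8, 64
inputs = tf.random.normal((batch, seq_len, features))

cell = tf.keras.layers.LSTMCell(units)
state = [tf.zeros((batch, units)), tf.zeros((batch, units))]  # [h, c] pair

# carry the state across time steps, as the tutorial snippet does
for t in range(seq_len):
    output, state = cell(inputs[:, t, :], state)
```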
-
Hi, I am wondering how to implement bidirectional RNNs by internally changing the feeding order,
instead of reversing the sequence beforehand.
Taking LSTM as an example, can one just modify `x[{{}, t}]` …
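One way to do it, sketched in Python rather than Lua for clarity (the `cell` callable here is a stand-in for the LSTM step, not a library API):
```python
def run_backward(cell, xs, h0):
    """Feed time steps in reverse order instead of reversing the input tensor.

    cell: a step function (x_t, h) -> h; xs: a sequence of time-step inputs.
    """
    h = h0
    outputs = [None] * len(xs)
    for t in reversed(range(len(xs))):  # iterate t = T-1 .. 0
        h = cell(xs[t], h)
        outputs[t] = h                  # store back in original time order
    return outputs, h
```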
-
It seems that bidirectional RNNs and multi-layer RNNs aren't supported? Do you have plans to support these?