-
Hello, thank you for sharing this.
I am getting this error when I try to run this in Colab:
"ValueError: The first argument to `Layer.call` must always be passed."
This is my model code:
…
-
# My System Configurations
**CUDA: 9.1**
**cuDNN: 7.1**
**TensorFlow: 1.8.0-rc0**
The system works for the default Vietnamese-to-English dataset, but while training with the Bodo-English dataset th…
-
We should have utility functions for constructing dense/convolutional layers (eventually more complex layers like LSTM or multihead attention), which take a context, input node identifiers, and initia…
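A minimal sketch of what such a utility might look like, assuming a hypothetical `GraphContext` that owns named nodes; the names `GraphContext`, `add_node`, and `dense` are illustrative, not from any existing codebase:

```python
import numpy as np

class GraphContext:
    """Hypothetical context that owns named nodes (parameters and activations)."""
    def __init__(self, seed=0):
        self.nodes = {}
        self.rng = np.random.default_rng(seed)

    def add_node(self, name, value):
        self.nodes[name] = value
        return name  # node identifier

def dense(ctx, input_id, out_dim, initializer=None, name="dense"):
    """Create weight/bias nodes in ctx for a dense layer and return the output node id."""
    x = ctx.nodes[input_id]
    in_dim = x.shape[-1]
    init = initializer or (lambda shape: ctx.rng.normal(0.0, 0.05, shape))
    w_id = ctx.add_node(f"{name}/w", init((in_dim, out_dim)))
    b_id = ctx.add_node(f"{name}/b", np.zeros(out_dim))
    y = x @ ctx.nodes[w_id] + ctx.nodes[b_id]
    return ctx.add_node(f"{name}/out", y)

# Usage: build a layer from a context, an input node id, and an initializer.
ctx = GraphContext()
x_id = ctx.add_node("input", np.ones((2, 4)))
out_id = dense(ctx, x_id, out_dim=3)
print(ctx.nodes[out_id].shape)  # (2, 3)
```

Convolutional, LSTM, or multi-head attention helpers would follow the same pattern: take the context and input node ids, register their parameters under a name scope, and return the output node id.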
-
- PyTorch-Forecasting version: 1.0.0
- PyTorch version: 2.0.1+cpu
- Python version: 3.10
- Operating System: Ubuntu
### Expected behavior
I followed this guide [here](https://towardsdatascien…
-
https://github.com/graykode/nlp-tutorial/blob/cb4881ebf6683dc6970c53a2cf50d5fd01edf118/4-3.Bi-LSTM(Attention)/Bi-LSTM(Attention)-Torch.py#L50
Hi, this repo is awesome, but there might be something …
-
I intend to try out an LSTM for speech recognition. Looking at the t2t code, I noticed that there's an `lstm_asr_v1` hparams set, which should probably work with `lstm_seq2seq_attention`?
However…
-
Hi,
I am trying to run the row-less version with the "attention-both-lstm" configuration. I used the demo training data by changing the training variables in the script from
`export LOG_ROOT="${TH_R…
-
Could you kindly provide the code for training the models, please?
-
In the bidirectional LSTM with attention, do we have 2 attention layers (one per direction)?
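A minimal NumPy sketch of the two designs the question contrasts, using random arrays as stand-ins for the forward and backward LSTM output sequences (the `attention` helper and all shapes here are illustrative assumptions, not code from the repo):

```python
import numpy as np

def attention(states, query):
    """Dot-product attention: softmax over time steps, weighted sum of states."""
    scores = states @ query                  # (T,)
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ states                  # (H,)

rng = np.random.default_rng(0)
T, H = 5, 8
fwd = rng.normal(size=(T, H))  # stand-in for forward LSTM outputs
bwd = rng.normal(size=(T, H))  # stand-in for backward LSTM outputs

# Design A: one attention layer per direction, then concatenate the contexts.
ctx_fwd = attention(fwd, fwd[-1])
ctx_bwd = attention(bwd, bwd[-1])
per_direction = np.concatenate([ctx_fwd, ctx_bwd])  # (2H,)

# Design B (common in many repos): concatenate directions first,
# then apply a single attention over the combined states.
states = np.concatenate([fwd, bwd], axis=1)  # (T, 2H)
single = attention(states, states[-1])       # (2H,)

print(per_direction.shape, single.shape)  # (16,) (16,)
```

Both yield a context vector of size 2H, but Design A learns separate weightings per direction while Design B shares one set of attention weights across the concatenated states.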
-
In `main.py`, when `method_name == 'lstm_textcnn_attention'`, the model is `Transformer_Attention`.