-
Hello hiro. First of all, thank you for your awesome project! It's helping me a lot!
Anyway, I was trying to train an MMA decoder for LibriSpeech with the asr_conf
- conf/asr/mma/streaming/lc_trans…
-
http://preview.d2l.ai.s3-website-us-west-2.amazonaws.com/d2l-en/master/chapter_attention-mechanisms/transformer.html
PyTorch:
```
go . => !, bleu 0.000
i lost . => je l'ai vu ., bleu 0.000
so lon…
```
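For what it's worth, a BLEU of 0.000 is expected whenever any n-gram precision up to the max order is zero, which happens easily with short, wrong translations. A minimal sentence-level BLEU sketch in the spirit of that chapter (my own illustrative version, not the book's exact `bleu` function):

```python
import math
from collections import Counter


def bleu(pred_tokens, label_tokens, k=2):
    """Sentence-level BLEU sketch: brevity penalty times a weighted
    geometric mean of n-gram precisions up to order k.
    Illustrative only -- not the exact d2l.ai implementation."""
    len_pred, len_label = len(pred_tokens), len(label_tokens)
    if len_pred == 0:
        return 0.0
    # Brevity penalty: punish predictions shorter than the reference.
    score = math.exp(min(0.0, 1 - len_label / len_pred))
    for n in range(1, k + 1):
        total = len_pred - n + 1
        if total <= 0:
            return 0.0  # prediction too short to form any n-gram
        # Clipped n-gram matching against the reference counts.
        label_ngrams = Counter(
            tuple(label_tokens[i:i + n]) for i in range(len_label - n + 1)
        )
        num_matches = 0
        for i in range(total):
            ngram = tuple(pred_tokens[i:i + n])
            if label_ngrams[ngram] > 0:
                num_matches += 1
                label_ngrams[ngram] -= 1
        p_n = num_matches / total
        if p_n == 0:
            return 0.0  # any zero n-gram precision zeroes the score
        score *= p_n ** (0.5 ** n)
    return score
```

With `k=2`, a prediction that shares no bigram with the reference scores exactly 0, which is consistent with the outputs above.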
-
I tried this project on Windows without CUDA.
1. `prepare_translation_batch` should be changed to `prepare_seq2seq_batch`
2. model.generate() throws AttributeError: 'list' object has no attribut…
-
### Question
I'm trying to fine-tune a seq2seq model using the fork command, and I got this error message:
target contains elements out of valid range [0, num_categories) in categorical cross entropy
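That error means some integer label falls outside `[0, num_categories)` — for example the labels are 1-based, or `num_categories` is set too low. A quick stdlib-only sanity check you can run on your targets before training (`check_targets` is a hypothetical helper, not part of any library):

```python
def check_targets(targets, num_categories):
    """Hypothetical helper: verify that integer class targets are
    valid for categorical cross entropy, i.e. each target t satisfies
    0 <= t < num_categories."""
    bad = sorted({t for t in targets if not 0 <= t < num_categories})
    if bad:
        raise ValueError(
            f"target contains elements out of valid range "
            f"[0, {num_categories}): {bad}"
        )
```

If this raises on your data, either remap the labels to start at 0 or increase the number of output categories in the model config.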
####…
-
Good evening here,
This looks awesome!
I'm trying to get a transcription from the pre-trained French model for a `.wav` file of **53 secs**.
Here's my code:
```python
from speechbrain.pretrained im…
-
I have trained a seq2seq AM on Hindi Devanagari data, and a KenLM on a Devanagari corpus. The results are satisfactory when decoding.
I want to run inference using the inference Docker with simple_…
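In case it helps, combining a seq2seq AM with an external KenLM at inference time is usually done by shallow fusion during beam search: each hypothesis is ranked by a weighted sum of the two log-probabilities plus a length bonus. A generic sketch (the `alpha`/`beta` names and default values are illustrative, not from a specific toolkit):

```python
def fused_score(am_logprob, lm_logprob, length, alpha=0.5, beta=1.0):
    """Shallow-fusion hypothesis score for beam search with an
    external LM (e.g. KenLM): AM log-prob plus a weighted LM
    log-prob plus a length bonus. alpha/beta are tuned on a dev set;
    the defaults here are arbitrary."""
    return am_logprob + alpha * lm_logprob + beta * length
```

The decoder keeps the beam sorted by this fused score instead of the AM score alone, so the LM weight must match whatever was tuned during offline decoding.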
-
## 🐛 Bug
Background
- After working on a Seq2Seq model with attention using _only_ LightningModules, I was still getting on_device errors.
- This also occurred with `PackedSequence` objects but I …
-
Hello!
I want to fine-tune the unlikelihood repetition model on a custom task similar to the Wizard of Wikipedia task.
Here is my code
```python
from parlai.scripts.train_model import TrainModel
fr…
-
# ❓ Questions & Help
## Details
One peculiar finding is that when we ran the rag-sequence-nq model along with the provided wiki_dpr index, all models and index files were used as is, on the …
-
I am trying to train a seq2seq model using BartModel. As per the BartTokenizer documentation, if I pass tgt_texts then it should return decoder_attention_mask and decoder_input_ids. Please check the attachm…
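On the decoder side, the usual seq2seq convention is that `decoder_input_ids` are the labels shifted one position to the right, beginning with the decoder start token. A pure-Python sketch of that idea (my own illustrative `shift_tokens_right`, not the library's tensor version; the token ids in the test are invented):

```python
def shift_tokens_right(labels, decoder_start_token_id, pad_token_id):
    """Illustrative version of the right-shift used by seq2seq models
    to build decoder inputs from labels: prepend the decoder start
    token, drop the last label token, and replace ignored (-100)
    positions with the pad token."""
    shifted = []
    for row in labels:
        new_row = [decoder_start_token_id] + row[:-1]
        shifted.append([pad_token_id if t == -100 else t for t in new_row])
    return shifted
```

This is why many training loops only pass `labels` and let the model derive `decoder_input_ids` internally; constructing them by hand is only needed when the shift convention differs.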