Open GrigorKirakosyan opened 1 month ago

Hi NeMo team, do you plan to release an English Large FastConformer-Long-CTC-BPE model trained with local attention and global tokens?

We released the 1B model, but not the large model: https://huggingface.co/spaces/nvidia/parakeet-tdt_ctc-1.1b

Thanks, I have seen that. I was asking specifically about the Large FastConformer (~120M) model trained with this config: https://github.com/NVIDIA/NeMo/blob/main/examples/asr/conf/fastconformer/long_fastconformer/fast-conformer-long_ctc_bpe.yaml
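For context, the linked config enables limited-context (local) self-attention with optional global tokens in the FastConformer encoder. Below is a rough sketch of the relevant encoder fields; the names follow NeMo's ConformerEncoder options, but the concrete values here are illustrative and may differ from the actual YAML:

```yaml
# Hypothetical excerpt mirroring fast-conformer-long_ctc_bpe.yaml (values illustrative)
model:
  encoder:
    _target_: nemo.collections.asr.modules.ConformerEncoder
    d_model: 512                              # "Large" FastConformer is roughly ~114-120M params
    n_layers: 17
    self_attention_model: rel_pos_local_attn  # limited-context (local) attention instead of full attention
    att_context_size: [128, 128]              # left/right local attention window, in frames
    global_tokens: 1                          # number of global attention tokens (0 disables them)
    global_tokens_spacing: 1
    global_attn_separate: false               # whether global tokens use separate attention weights
```

These settings let the encoder process long-form audio with memory that scales with the window size rather than the full sequence length, which is the point of the Long-CTC-BPE variant being asked about.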