Closed RuntimeRacer closed 1 year ago
Ok, I figured out that on basically every occasion where I could, I turned the wrong way: cloning icefall within the WSL filesystem and pointing PYTHONPATH to it made the first error go away. However, I hit a new error: ModuleNotFoundError: No module named 'matplotlib'
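For anyone hitting the same first error, the PYTHONPATH step can be sketched like this (the clone location is an assumption; adjust it to wherever your WSL-side icefall checkout actually lives):

```shell
# A minimal sketch, assuming icefall was cloned to ~/speechexperiments/icefall
# inside the WSL filesystem (not under /mnt/...):
export PYTHONPATH="$HOME/speechexperiments/icefall:$PYTHONPATH"

# Then verify that Python can resolve the package:
# python3 -c "import icefall; print(icefall.__file__)"
```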
Installing matplotlib via pip was not possible, because of this friend here: AssertionError: Egg-link /mnt/d/speechexperiments/vall-e does not match installed location of valle (at /home/runtimeracer/speechexperiments/vall-e)
The fix was to remove the stale egg-link: rm /home/runtimeracer/anaconda3/envs/vall-e/lib/python3.10/site-packages/valle.egg-link and re-initialize the package in the proper folder: pip install -e .
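The whole egg-link cleanup can be sketched as follows, using the paths from the AssertionError above (your conda env and clone locations may differ):

```shell
# Remove the stale egg-link left over from the old /mnt/d checkout
# (path taken from the AssertionError message):
rm /home/runtimeracer/anaconda3/envs/vall-e/lib/python3.10/site-packages/valle.egg-link

# Re-create the editable install from the WSL-side clone:
cd /home/runtimeracer/speechexperiments/vall-e
pip install -e .
```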
Installing matplotlib then worked, but the next error appeared: FileNotFoundError: [Errno 2] No such file or directory: 'data/tokenized/unique_text_tokens.k2symbols'
- Ok, I forgot to set up the dataset. Alright, let's go to examples/libritts
- Weird, that doesn't work; examples is just a file, so why would he use it as a path in the setup instructions? The correct path is cd egs/libritts/, then run bash prepare.sh --stage -1 --stop-stage 3
... but nope: prepare.sh: line 2: $'\r': command not found and a bunch of weird errors from the bash script.
Hello, may I ask whether you solved this? I also uploaded the files from Windows to Ubuntu, and then this problem appeared.
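The $'\r': command not found error is the classic symptom of a script that still carries Windows (CRLF) line endings after being copied from Windows into Linux. A minimal sketch of the fix, assuming you run it in the directory containing prepare.sh:

```shell
# Strip the Windows carriage returns in place:
sed -i 's/\r$//' prepare.sh

# Alternatively, if the dos2unix tool is installed:
# dos2unix prepare.sh

# To stop git from re-introducing CRLF endings on future checkouts:
git config core.autocrlf input
```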
Copy the code from https://github.com/k2-fsa/icefall/blob/master/egs/librispeech/ASR/pruned_transducer_stateless2/scaling.py into the file icefall/transformer_lm/scaling.py, and delete ../../egs/librispeech/ASR/pruned_transducer_stateless2/scaling.py
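If transformer_lm/scaling.py is a broken symlink (symlinks often turn into plain text files when a repo is copied over from Windows), the suggestion above amounts to replacing it with a real copy. A sketch, run from the icefall repo root, with paths taken from the comment above:

```shell
# Remove the (possibly broken) link or stub file:
rm -f icefall/transformer_lm/scaling.py

# Replace it with a real copy of the pruned_transducer_stateless2 version:
cp egs/librispeech/ASR/pruned_transducer_stateless2/scaling.py \
   icefall/transformer_lm/scaling.py
```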
Hi, I am just trying to run inference for testing on the model which was shared here: https://github.com/lifeiteng/vall-e/issues/58#issuecomment-1483700593
For some reason I get this error in BOTH plain Win10 and WSL:
The command line I executed:
python bin/infer.py --output-dir ./ --model-name valle --norm-first true --add-prenet false --decoder-dim 1024 --nhead 16 --num-decoder-layers 12 --text-prompts "KNOT one point one five miles per hour." --audio-prompts ./prompts/8463_294825_000043_000000.wav --text "To get up and running quickly just follow the steps below." --checkpoint exp/epoch-100.pt
Any help is highly appreciated. :-)