RVC-Boss / GPT-SoVITS

1 min voice data can also be used to train a good TTS model! (few shot voice cloning)
MIT License
36.47k stars 4.16k forks

Can we have a mps? #61

Closed Lorre-Ramon closed 10 months ago

Lorre-Ramon commented 10 months ago

I've run into a problem that may have something to do with CUDA. I'm a MacBook M1 user, so naturally I don't have a CUDA-capable GPU. Normally I would expect to fall back to CPU as the device, which, for the record, I did see in the code, but it did not work smoothly on my machine. PyTorch has introduced MPS for Apple Silicon users as an alternative to CUDA; I was wondering when the developers could add support for it.
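For what it's worth, recent PyTorch builds (1.12+) expose an MPS availability check, so the kind of device-selection fallback being requested might be sketched like this (the fallback order and variable name are my assumptions, not the project's actual code):

```python
import torch

# Prefer CUDA, then Apple-silicon MPS, then plain CPU (assumed priority order).
if torch.cuda.is_available():
    device = "cuda:0"
elif torch.backends.mps.is_available():  # requires torch >= 1.12
    device = "mps"
else:
    device = "cpu"

print(f"using device: {device}")
```

On an M1 Mac with a recent PyTorch wheel this would select `mps`; on the CUDA-less build from the traceback below it would at least fall back to `cpu` instead of crashing.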

The following is the error I received while formatting the training set (1-训练集格式化工具, the training-set formatting tool). Maybe I have it all wrong about why this error happens; please kindly help me solve it.

"/Users/improvise/miniconda/envs/GPTSoVits/bin/python" GPT_SoVITS/prepare_datasets/1-get-text.py
"/Users/improvise/miniconda/envs/GPTSoVits/bin/python" GPT_SoVITS/prepare_datasets/1-get-text.py
Traceback (most recent call last):
  File "/Users/improvise/Desktop/GPT-SoVITS-main/GPT_SoVITS/prepare_datasets/1-get-text.py", line 53, in <module>
    bert_model = bert_model.half().to(device)
  File "/Users/improvise/miniconda/envs/GPTSoVits/lib/python3.9/site-packages/transformers/modeling_utils.py", line 2460, in to
    return super().to(*args, **kwargs)
  File "/Users/improvise/miniconda/envs/GPTSoVits/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1160, in to
    return self._apply(convert)
  File "/Users/improvise/miniconda/envs/GPTSoVits/lib/python3.9/site-packages/torch/nn/modules/module.py", line 810, in _apply
    module._apply(fn)
  File "/Users/improvise/miniconda/envs/GPTSoVits/lib/python3.9/site-packages/torch/nn/modules/module.py", line 810, in _apply
    module._apply(fn)
  File "/Users/improvise/miniconda/envs/GPTSoVits/lib/python3.9/site-packages/torch/nn/modules/module.py", line 833, in _apply
    param_applied = fn(param)
  File "/Users/improvise/miniconda/envs/GPTSoVits/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1158, in convert
    return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
  File "/Users/improvise/miniconda/envs/GPTSoVits/lib/python3.9/site-packages/torch/cuda/__init__.py", line 289, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled

(The second 1-get-text.py process prints the same traceback.)
Traceback (most recent call last):
  File "/Users/improvise/Desktop/GPT-SoVITS-main/webui.py", line 529, in open1abc
    with open(txt_path, "r",encoding="utf8") as f:
FileNotFoundError: [Errno 2] No such file or directory: 'logs/test01/2-name2text-0.txt'

P.S. The output in the logs folder looks like this: /Users/improvise/Desktop/GPT-SoVITS-main/logs/test01/3-bert, and that folder is empty.

hydra-li commented 10 months ago

Same on an M1 Pro.

RoversX commented 10 months ago

Have you guys tried CPU inference? I've already tested So-vits-svc and bert-vits2 with CPU inference.

RoversX commented 10 months ago

> Have you guys tried CPU inference? I've already tested So-vits-svc and bert-vits2 with CPU inference.

CPU inference works!

[Screenshot 2024-01-18 at 2:09:15 PM]

RoversX commented 10 months ago

> Have you guys tried CPU inference? I've already tested So-vits-svc and bert-vits2 with CPU inference.

[Screenshot 2024-01-18 at 2:23:51 PM]
Lorre-Ramon commented 10 months ago

I tried manually changing every `"cuda:0"` to `"cuda:0" if torch.cuda.is_available() else "cpu"`, and it worked.
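The workaround above can be sketched as follows. A minimal stand-in model replaces `bert_model`, and skipping `.half()` on CPU is my own addition (fp16 inference on CPU is typically slow or unsupported), not part of the edit the comment describes:

```python
import torch
import torch.nn as nn

# Fall back to CPU when no CUDA device is available.
device = "cuda:0" if torch.cuda.is_available() else "cpu"

model = nn.Linear(4, 2)       # stand-in for the real bert_model
if device.startswith("cuda"):
    model = model.half()      # keep fp16 only on GPU
model = model.to(device)

# Run a dummy input through, matching the model's dtype and device.
x = torch.randn(1, 4, device=device, dtype=next(model.parameters()).dtype)
y = model(x)
print(y.shape)  # torch.Size([1, 2])
```

Unlike the line in 1-get-text.py that raised the AssertionError, this never calls into `torch.cuda` on a CPU-only build, so it runs on machines without CUDA.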

zhouhao27 commented 10 months ago

> Have you guys tried CPU inference? I've already tested So-vits-svc and bert-vits2 with CPU inference.
>
> CPU inference works!
>
> [Screenshot 2024-01-18 at 2:09:15 PM]

How? I can't start webui.py at all.

RoversX commented 10 months ago

> How? I can't start webui.py at all.

Use `python webui.py` to start the WebUI.