C:\Users\aiwinsor\Documents\dev\ai-voice-cloning>runtime\python.exe .\src\main.py
C:\Users\aiwinsor\AppData\Roaming\Python\Python39\site-packages\torch\autocast_mode.py:162: UserWarning: User provided device_type of 'cuda', but CUDA is not available. Disabling
warnings.warn('User provided device_type of \'cuda\', but CUDA is not available. Disabling')
Traceback (most recent call last):
  File "C:\Users\aiwinsor\Documents\dev\ai-voice-cloning\src\main.py", line 18, in <module>
    from utils import *
  File "C:\Users\aiwinsor\Documents\dev\ai-voice-cloning\src\utils.py", line 41, in <module>
    from tortoise.api_fast import TextToSpeech as Toroise_TTS_Hifi
  File "C:\Users\aiwinsor\Documents\dev\ai-voice-cloning\src\tortoise\api_fast.py", line 114, in <module>
    def format_conditioning(clip, cond_length=132300, device="cuda" if not torch.backends.mps.is_available() else 'mps'):
AttributeError: module 'torch.backends' has no attribute 'mps'
C:\Users\aiwinsor\Documents\dev\ai-voice-cloning>pause
Press any key to continue . . .
C:\Users\aiwinsor\Documents\dev\ai-voice-cloning>set PYTHONUTF8=1
C:\Users\aiwinsor\Documents\dev\ai-voice-cloning>runtime\python.exe .\src\main.py
C:\Users\aiwinsor\AppData\Roaming\Python\Python39\site-packages\torch\autocast_mode.py:162: UserWarning: User provided device_type of 'cuda', but CUDA is not available. Disabling
warnings.warn('User provided device_type of \'cuda\', but CUDA is not available. Disabling')
Traceback (most recent call last):
  File "C:\Users\aiwinsor\Documents\dev\ai-voice-cloning\src\main.py", line 18, in <module>
    from utils import *
  File "C:\Users\aiwinsor\Documents\dev\ai-voice-cloning\src\utils.py", line 41, in <module>
    from tortoise.api_fast import TextToSpeech as Toroise_TTS_Hifi
  File "C:\Users\aiwinsor\Documents\dev\ai-voice-cloning\src\tortoise\api_fast.py", line 114, in <module>
    def format_conditioning(clip, cond_length=132300, device="cuda" if not torch.backends.mps.is_available() else 'mps'):
AttributeError: module 'torch.backends' has no attribute 'mps'

C:\Users\aiwinsor\Documents\dev\ai-voice-cloning>pause
Press any key to continue . . .
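
For context: torch.backends.mps was only added in PyTorch 1.12, and the torch build installed in this Python 3.9 site-packages apparently predates it, so the attribute lookup in the default argument of format_conditioning raises at import time, before is_available() is ever reached. Below is a minimal, version-safe sketch of how the device could be picked instead; the helper name pick_device and the CPU fallback are my assumptions, not code from the repo.

import torch

def pick_device() -> str:
    # Prefer CUDA, then MPS, then CPU, without assuming torch.backends.mps exists.
    if torch.cuda.is_available():
        return "cuda"
    # torch.backends.mps only exists in PyTorch 1.12+; getattr keeps older builds from raising.
    mps_backend = getattr(torch.backends, "mps", None)
    if mps_backend is not None and mps_backend.is_available():
        return "mps"
    return "cpu"

With a helper like that, the failing default in src\tortoise\api_fast.py could read device=pick_device() instead of the unguarded torch.backends.mps.is_available() check.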