xtts2-ui.venv\lib\site-packages\transformers\models\gpt2\modeling_gpt2.py:650: UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at ..\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:455.)
attn_output = torch.nn.functional.scaled_dot_product_attention(
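For context, this warning is informational: the PyTorch build simply lacks the flash attention kernel, so `scaled_dot_product_attention` falls back to the memory-efficient/math backends and still produces correct output. If you just want a quieter log, a minimal stdlib-only sketch is to filter this specific `UserWarning` by its message:

```python
import warnings

# The SDPA fallback warning is harmless; filter it by message. The leading
# ".*" also matches the odd "1Torch" prefix seen in the actual log line.
warnings.filterwarnings(
    "ignore",
    message=r".*Torch was not compiled with flash attention.*",
    category=UserWarning,
)

# Quick demonstration that the filter matches the logged message:
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    warnings.filterwarnings(
        "ignore",
        message=r".*Torch was not compiled with flash attention.*",
        category=UserWarning,
    )
    warnings.warn("1Torch was not compiled with flash attention.", UserWarning)
print(f"warnings emitted after filtering: {len(caught)}")
```

Note this only hides the message; it does not change which attention kernel runs.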
I also ran the script below to check my environment:
import torch
import transformers
from TTS.api import TTS

def check_environment():
    print("Checking environment for Flash Attention 2 compatibility...")

    # Check CUDA availability
    print(f"CUDA available: {torch.cuda.is_available()}")
    if torch.cuda.is_available():
        print(f"CUDA version: {torch.version.cuda}")

    # Check PyTorch version
    print(f"PyTorch version: {torch.__version__}")

    # Check Transformers version
    print(f"Transformers version: {transformers.__version__}")

    # Check for Flash Attention 2 (note: this only verifies that the SDPA
    # API exists, i.e. PyTorch >= 2.0 -- not that the build actually ships
    # flash attention kernels)
    if hasattr(torch.nn.functional, 'scaled_dot_product_attention'):
        print("Flash Attention 2 is available!")
    else:
        print("Flash Attention 2 is not available.")

    # Check TTS installation
    try:
        tts = TTS(model_name="tts_models/multilingual/multi-dataset/xtts_v2")
        print("TTS model loaded successfully.")
    except Exception as e:
        print(f"Error loading TTS model: {e}")

    print("Environment check complete.")

if __name__ == "__main__":
    check_environment()
Results from the script above:
Checking environment for Flash Attention 2 compatibility...
CUDA available: True
CUDA version: 12.1
PyTorch version: 2.3.1+cu121
Transformers version: 4.42.4
Flash Attention 2 is available!
tts_models/multilingual/multi-dataset/xtts_v2 is already downloaded.
Using model: xtts
TTS model loaded successfully.
Environment check complete.
Is there a solution to this?