chatchat-space / Langchain-Chatchat

Langchain-Chatchat (formerly Langchain-ChatGLM): a local-knowledge-based RAG and Agent application built with Langchain and LLMs such as ChatGLM, Qwen, and Llama
Apache License 2.0

Project hangs when running on an M1 Pro chip #3689

Closed YiLin198 closed 6 months ago

YiLin198 commented 6 months ago

Problem Description: After starting the project, it reports `AssertionError: Torch not compiled with CUDA enabled` and then hangs. Changing `auto` to `mps` in the config did not help either.

[Screenshot 2024-04-10, 2:29 AM]
dosubot[bot] commented 6 months ago

Hey @YiLin198, great to see you diving into some tech puzzles again! 🚀

To address the "Torch not compiled with CUDA enabled" error on your M1 Pro, it's important to note that CUDA is designed for Nvidia GPUs, which the M1 chip does not support. Instead, you can force PyTorch to use the CPU or leverage Apple's Metal Performance Shaders (MPS) for GPU acceleration. Here's how you can adjust your code:

  1. Force PyTorch to Use CPU: Modify your code to ensure it does not attempt to use CUDA features by setting the device to CPU explicitly:

    import torch

    device = torch.device("cpu")

    This change ensures your code runs on the CPU, avoiding CUDA-related errors.

  2. Use MPS for GPU Acceleration: If you want to utilize the GPU on your M1 chip, you can change the device from "cuda" to "mps" in your configuration. This adjustment was effective in a similar issue and is documented in issue #2990. Here's an example change in configs/server_config.py (a combined device-selection sketch follows after this list):

    "chatglm3-6b": {
       "device": "mps",
    },
    "Qwen1.5-0.5B-Chat": {
       "device": "mps",
    },
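
Taken together, the two options above amount to picking a device at runtime. Below is a minimal sketch of such a fallback in plain PyTorch; it is not code from the Langchain-Chatchat codebase, and the helper name `pick_device` is only an illustration:

    import torch

    def pick_device() -> torch.device:
        # Prefer CUDA on Nvidia GPUs, then Apple's MPS backend on M-series chips,
        # and fall back to the CPU when neither is available.
        if torch.cuda.is_available():
            return torch.device("cuda")
        if torch.backends.mps.is_available():
            return torch.device("mps")
        return torch.device("cpu")

    device = pick_device()
    x = torch.ones(2, 2, device=device)  # quick check that the chosen backend actually works
    print(device, x.device)

On an M1 Pro this should print `mps`; if it prints `cpu`, the installed PyTorch build does not include MPS support (MPS requires PyTorch 1.12 or later).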

Regarding the file path not found errors, ensure all file paths in your code are correctly specified and accessible. Check that paths to models, data files, or other resources are correct and follow Unix-style formatting.
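
As a quick sanity check for those path errors, a short script like the one below can confirm that each configured path exists before the server starts; the paths listed here are placeholders, not values taken from your configuration:

    from pathlib import Path

    # Replace these with the model and knowledge-base paths from your own configs.
    paths_to_check = [
        "~/models/chatglm3-6b",
        "~/models/Qwen1.5-0.5B-Chat",
    ]

    for p in paths_to_check:
        status = "ok" if Path(p).expanduser().exists() else "MISSING"
        print(f"{status}: {p}")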

If these steps do not resolve your issues or if you encounter other errors, please provide more specific details about the file paths causing errors and the relevant sections of your code for further assistance.


#### About Dosu

This response is meant to be useful and save you time. It is **not meant to be a precise solution**, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.