Can the project be run on a computer without a graphics card?
Hello, @javachens! Nice to see you again. I hope you're doing well!
Yes, the Langchain-Chatchat project can run on a computer without a graphics card. This conclusion is based on similar issues in the repository and the provided code context.
In issue #3439, it was suggested to run everything on the CPU by setting the `LLM_DEVICE` and `EMBEDDING_DEVICE` variables to `"cpu"` in the `configs` module, and to adjust model loading, particularly in the `get_model_worker_config` and `load_local_embeddings` functions, so that it is compatible with CPU usage (see the sketches below).
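As a minimal sketch (assuming the variable names from the issue discussion; the exact file layout may differ between project versions), the CPU settings in the `configs` module could look like this:

```python
# configs/model_config.py -- illustrative sketch, not the project's exact file.

# Run the chat model and the embedding model on the CPU instead of a GPU.
LLM_DEVICE = "cpu"
EMBEDDING_DEVICE = "cpu"
```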
In the issue #7, it was confirmed that the application can run on a CPU, as tests have been done on macOS and Windows environments using only the CPU.
The provided code does not explicitly require a GPU for its operations. It primarily involves server and API setup, including the creation of FastAPI applications and handling configurations for model workers. The code snippet shows the use of multiprocessing, asynchronous operations, and environment configuration, none of which inherently require a GPU.
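To illustrate the CPU-compatibility point, a hypothetical helper like the one below (the name `resolve_device` is an assumption, not part of the project) could be used when loading models so that the application falls back to the CPU whenever no CUDA device is present:

```python
import torch  # assumes the PyTorch backend used for local model loading


def resolve_device(configured: str = "auto") -> str:
    """Return the device to load models on.

    Hypothetical helper: honors an explicit setting such as "cpu",
    and otherwise picks "cuda" only when a GPU is actually available.
    """
    if configured != "auto":
        return configured
    return "cuda" if torch.cuda.is_available() else "cpu"


# Usage examples:
# resolve_device()       -> "cpu" on a machine without a graphics card
# resolve_device("cpu")  -> "cpu", mirroring LLM_DEVICE = "cpu"
```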
However, the specific models loaded and run by the project may rely on a GPU for accelerated computation. The models used by Langchain-Chatchat are ernie-bot-4, ernie-bot, ernie-bot-turbo, bloomz-7b, qianfan-bloomz-7b-c, llama2-7b-chat, llama2-13b-chat, llama2-70b-chat, qianfan-llama2-ch-7b, chatglm2-6b-32k, and aquilachat-7b. These models are listed in the `MODEL_VERSIONS` dictionary within the provided code context.
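The sketch below shows only the shape of that dictionary; the real entries live in `qianfan.py`, and the values here are placeholders rather than the actual version strings:

```python
# Illustrative shape of MODEL_VERSIONS; "<placeholder>" stands in for the
# real values defined in qianfan.py.
MODEL_VERSIONS = {
    "ernie-bot-4": "<placeholder>",
    "ernie-bot": "<placeholder>",
    "ernie-bot-turbo": "<placeholder>",
    "bloomz-7b": "<placeholder>",
    "qianfan-bloomz-7b-c": "<placeholder>",
    "llama2-7b-chat": "<placeholder>",
    "llama2-13b-chat": "<placeholder>",
    "llama2-70b-chat": "<placeholder>",
    "qianfan-llama2-ch-7b": "<placeholder>",
    "chatglm2-6b-32k": "<placeholder>",
    "aquilachat-7b": "<placeholder>",
}
```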
Please note that while the application can run on a CPU, performance is likely to be significantly slower than on a GPU. If performance is critical for your use case, consider using a system with a GPU.
For more details, you can refer to the mentioned issues and the `startup.py` and `qianfan.py` files in the repository.