Open wujiaxin-study opened 5 months ago
Deploying open-source LLMs locally is straightforward: simply replace the API interface and the corresponding LLM-calling function in my code with your local equivalents. However, please note that due to OpenAI's recent changes to their API, this project is temporarily unavailable. I will update it as soon as possible to accommodate these changes.
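The swap described above can be sketched as follows. This is a minimal example, assuming a local OpenAI-compatible server (Ollama and vLLM both expose one, typically at `http://localhost:11434/v1` and `http://localhost:8000/v1`); the base URL, model name, and function names below are placeholders for illustration, not identifiers from this project.

```python
# Sketch: route OpenAI-style chat calls to a local open-source model instead.
# LOCAL_BASE_URL and LOCAL_MODEL are assumptions; point them at your server.
import json
import urllib.request

LOCAL_BASE_URL = "http://localhost:11434/v1"  # assumed local server URL
LOCAL_MODEL = "llama3"                        # assumed local model name

def build_chat_request(messages, model=LOCAL_MODEL, temperature=0.7):
    """Build the same JSON body the OpenAI chat-completions endpoint expects."""
    return {"model": model, "messages": messages, "temperature": temperature}

def call_local_llm(messages):
    """Hypothetical drop-in replacement for the project's OpenAI call."""
    body = json.dumps(build_chat_request(messages)).encode("utf-8")
    req = urllib.request.Request(
        f"{LOCAL_BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.loads(resp.read())
    # Same response shape as the OpenAI API, so downstream code is unchanged.
    return data["choices"][0]["message"]["content"]
```

Because the request and response bodies match the OpenAI chat format, the rest of the project's code should not need to change beyond calling this function instead of the original one.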
I want to deploy this locally using an open-source LLM. How do I do that?