datawhalechina / self-llm

"A Practical Guide to Open-Source Large Models" — quickly deploy open-source LLMs on a Linux environment; a deployment tutorial tailored for Chinese users
Apache License 2.0

BUG: the API service fails to start when following the GLM-4 example #174

Closed eternal-bug closed 2 weeks ago

eternal-bug commented 2 weeks ago

While reading self-llm/GLM-4/01-GLM-4-9B-chat FastApi 部署调用.md, I wanted to run the API code, so I executed:

python api.py

The command printed the following messages and then exited:

Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████████████████| 10/10 [00:04<00:00,  2.07it/s]
WARNING:  You must pass the application as an import string to enable 'reload' or 'workers'.

Afterwards I found a related issue: uvicorn can not start FastAPI with example settings #1495

It suggests changing this line in the code:

uvicorn.run(app, host='0.0.0.0', port=6006, workers=5)

to:

uvicorn.run("api:app", host='0.0.0.0', port=6006, workers=5)

After that it runs normally. Did the example simply forget the quotes?