bentoml / OpenLLM

Run any open-source LLMs, such as Llama and Mistral, as OpenAI-compatible API endpoints in the cloud.
https://bentoml.com
Apache License 2.0
10.12k stars · 641 forks

No such option: --model-id error #100

Closed · stoneLee81 closed 1 year ago

stoneLee81 commented 1 year ago

Describe the bug

When running the command to load chatglm or chatglm2, e.g. `openllm start chatglm --model-id thudm/chatglm2-6b` or `openllm start chatglm --model-id thudm/chatglm-6b`, it fails with:

```
Error: No such option: --model-id
```

Running `openllm start chatglm -h` to check the available options shows:

```
Usage: openllm start chatglm [OPTIONS]

chatglm is currently not available to run on your local machine because it requires GPU for inference.

Options:
  Miscellaneous options:
    -q, --quiet         Suppress all output.
    --debug, --verbose  Print out debug logs.
    --do-not-track      Do not send usage info
    -h, --help          Show this message and exit.
```

To reproduce

No response

Logs

No response

Environment

- Hardware: Apple M1 Max
- openllm: 0.1.20.dev
- Python: 3.9.7

System information (Optional)

No response

aarnphm commented 1 year ago

You can't run chatglm on CPU at the moment; it requires a GPU for inference, which is why the model-specific options (including `--model-id`) are not shown on your machine.
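For context, this behavior is consistent with the CLI gating model-specific options behind a GPU-availability check. A minimal sketch of that kind of check (hypothetical, not OpenLLM's actual implementation; `cuda_gpu_available` is an illustrative name) might look like:

```python
import shutil

def cuda_gpu_available() -> bool:
    # Rough proxy for "is a CUDA-capable GPU usable here?": a working
    # CUDA setup normally ships nvidia-smi on PATH. An Apple M1 Max
    # (as in this report) has no NVIDIA GPU, so this returns False there.
    return shutil.which("nvidia-smi") is not None

if not cuda_gpu_available():
    # On such machines, model-specific flags (e.g. --model-id) would not
    # be registered, so the parser reports "No such option: --model-id".
    print("chatglm requires a GPU for inference on this machine")
```

On a machine with a CUDA GPU, the original `openllm start chatglm --model-id ...` invocations from the report would be expected to parse normally.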