bentoml / OpenLLM

Run any open-source LLMs, such as Llama and Gemma, as OpenAI-compatible API endpoints in the cloud.
https://bentoml.com
Apache License 2.0
10.05k stars 636 forks

feat: change port number #547

Closed japri closed 1 year ago

japri commented 1 year ago

Feature request

How do I change the port number from the default 3000?

Motivation

I need to resolve a port conflict on my system.

Other

No response

aarnphm commented 1 year ago

You can already change this with `--port`. All `bentoml serve` arguments are passed directly to `openllm start`.

japri commented 1 year ago

Hi, thanks. I tried it, and changing the port works with this command: `openllm start opt --port 3333`

But unfortunately, when I open it in the browser, something goes wrong, with this error message:

Failed to open a WebSocket connection: invalid Connection header: keep-alive.

You cannot access a WebSocket server directly with a browser. You need a WebSocket client.

Never mind; I changed it to 3001 and it works like a charm, and it solves the port conflict problem.