bentoml / OpenLLM

Run any open-source LLMs, such as Llama and Mistral, as OpenAI-compatible API endpoints in the cloud.
https://bentoml.com
Apache License 2.0

Deploying an LLM on an On-Premises Server so Users Can Access It from Their Work Laptops via Web Browser #934

Open sanket038 opened 8 months ago

sanket038 commented 8 months ago

Feature request

I have searched through many websites and watched YouTube videos on how to deploy open-source LLM models locally on a Windows server and then expose them to users, so that they can interact with the LLM and ask questions from their own laptops' web browsers. I believe this could be achieved using OpenLLM; however, I am not sure whether this is already supported by the library.
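Concretely, what I'd like users to be able to do from their laptops is something like the following (a rough sketch of the client side only; it assumes an OpenLLM server is already running on the on-premises machine with its OpenAI-compatible endpoint on the default port 3000, and the hostname and model name below are placeholders):

```python
# Sketch of a client call against an on-prem OpenLLM server.
# Assumptions: the server exposes OpenLLM's OpenAI-compatible API on
# port 3000, "llm-server.internal" resolves on the corporate network,
# and "my-model" is whatever model the server was started with.
from openai import OpenAI

client = OpenAI(
    base_url="http://llm-server.internal:3000/v1",
    api_key="na",  # no real key needed for a plain internal deployment
)

response = client.chat.completions.create(
    model="my-model",
    messages=[{"role": "user", "content": "Hello from my work laptop!"}],
)
print(response.choices[0].message.content)
```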

Motivation

No response

Other

No response

VISWANATH78 commented 7 months ago

Have you found a way, @sanket038? I am also trying to figure out how to host OpenLLM on my work server and then make API calls to it from other machines. Any idea on hosting OpenLLM from the server? If so, please help me out.

euroblaze commented 7 months ago

Take a look at something like Ollama. (And let us know if that's what you're looking for.)
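Roughly, the Ollama route would look like this from a client machine (a sketch, not a tested recipe: it assumes Ollama is running on the server with OLLAMA_HOST=0.0.0.0 so other machines can reach it, that the default port 11434 is open, and that a model has already been pulled):

```python
# Sketch: calling an Ollama server's OpenAI-compatible /v1 endpoint
# from another machine on the network. "your-server" and "llama3"
# are placeholders for your host and pulled model.
from openai import OpenAI

client = OpenAI(
    base_url="http://your-server:11434/v1",
    api_key="ollama",  # the client requires a value; Ollama ignores it
)

reply = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Is the server reachable?"}],
)
print(reply.choices[0].message.content)
```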

VISWANATH78 commented 7 months ago

Do you know the steps to link my custom downloaded model with Ollama and then serve it as an API to everyone? I have deployed a chatbot UI and need a backend API that all members can access, i.e., UIs on multiple devices hitting the same server. @euroblaze, if you have Discord, please let me know so we can connect; send the invite link to newtech1106@gmail.com.
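For reference, one common way to do this with Ollama (a sketch; every path and name below is a placeholder, so adapt it to your setup): register the downloaded model file with Ollama via a Modelfile on the server, then let every device call the same HTTP API.

```python
# One-time setup on the server (shell steps, shown here as comments):
#   1. Write a Modelfile pointing at the downloaded weights, e.g.:
#        FROM /models/my-custom-model.gguf
#   2. Register it:  ollama create my-custom-model -f Modelfile
#   3. Run Ollama with OLLAMA_HOST=0.0.0.0 so other devices can connect.
#
# Any device (the chatbot UI backend included) can then POST to the
# OpenAI-compatible endpoint; plain `requests` is enough, no SDK needed.
import requests

resp = requests.post(
    "http://your-server:11434/v1/chat/completions",
    json={
        "model": "my-custom-model",
        "messages": [{"role": "user", "content": "Hello from another device"}],
    },
    timeout=60,
)
print(resp.json()["choices"][0]["message"]["content"])
```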