gorilla-llm / gorilla-cli

LLMs for your CLI
https://gorilla.cs.berkeley.edu/
Apache License 2.0
1.22k stars 73 forks

Self Hosting Inference API #6

Closed Extremys closed 11 months ago

Extremys commented 1 year ago

Hello, great CLI app! Would it be possible to release the inference API source code, if that hasn't already been done, so that the inference API can be self-hosted? Regards.

ShishirPatil commented 1 year ago

Hey @Extremys Thank you for your kind words, and yes indeed! By self-hosting, do you mean you want an endpoint you can hit for inference? We have a hosted inference endpoint you can use:

import openai  # openai-python < 1.0 interface

openai.api_key = "EMPTY"  # the hosted endpoint does not check API keys
messages = [{"role": "user", "content": "list all files in this directory"}]  # illustrative prompt

response = openai.ChatCompletion.create(
    api_base="http://34.132.127.197:8000/v1",
    model="gorilla-bash-v0",
    temperature=0.1,
    messages=messages,
)
print(response["choices"][0]["message"]["content"])

Hope this helps. Let me know if you have any other questions!

arashilmg commented 10 months ago

No, he means he wants the model plus the API implementation of it, so he can host it in his own environment.

Extremys commented 10 months ago

> No, he means he wants the model plus the API implementation of it, so he can host it in his own environment.

Yes, exactly :) along with the API service.
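For what the thread is asking, any server that speaks the OpenAI Chat Completions wire format would let the client snippet above work unchanged by pointing `api_base` at it (e.g. `http://localhost:PORT/v1`). Below is a minimal, standard-library-only sketch of such an endpoint; the `generate()` stub is hypothetical and stands in for real local inference with the Gorilla model weights (this is not the project's actual serving code, which was not released at the time of this issue):

```python
# Minimal sketch of a self-hostable, OpenAI-compatible
# /v1/chat/completions endpoint (stdlib only; generate() is a stub).
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer


def generate(messages):
    # Placeholder for real inference: a self-hosted deployment would
    # load the model here and produce a shell command for the prompt.
    return f"echo 'stub reply to: {messages[-1]['content']}'"


class ChatHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/v1/chat/completions":
            self.send_error(404)
            return
        length = int(self.headers["Content-Length"])
        req = json.loads(self.rfile.read(length))
        # Mirror the OpenAI chat.completion response shape the client expects.
        body = json.dumps({
            "object": "chat.completion",
            "model": req.get("model", "gorilla-bash-v0"),
            "choices": [{
                "index": 0,
                "message": {
                    "role": "assistant",
                    "content": generate(req["messages"]),
                },
                "finish_reason": "stop",
            }],
        }).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Silence per-request logging.
        pass


# Bind to an OS-assigned free port and serve in the background.
server = HTTPServer(("127.0.0.1", 0), ChatHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
```

With this running, the earlier client code would only need `api_base=f"http://127.0.0.1:{server.server_port}/v1"` to switch from the hosted endpoint to the self-hosted one.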