artificialcitizens / acai.so

Your AI Powered Toolkit
https://acai.so

Support for custom OpenAI base path #49

Closed · bioshazard closed this issue 1 year ago

bioshazard commented 1 year ago

I am hosting a llama-cpp-python[server] instance on localhost. I would like support for specifying a custom OpenAI base path such as http://localhost:8080.


I tried to get this working myself, but my initial attempt was complicated by the multiple places where the OpenAI instance is instantiated. Thanks!

bioshazard commented 1 year ago

I was actually able to get chat working with my local service after adding a baseURL override in the configuration of the ChatOpenAI instantiation in chat-model.ts; I'll push the change later.
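
For reference, a minimal sketch of what that kind of override can look like, assuming chat-model.ts instantiates LangChain's ChatOpenAI; the setting name, API key placeholder, and model name below are illustrative, not the actual acai.so code:

```ts
// Sketch only (not the actual chat-model.ts): route requests to a local
// OpenAI-compatible server (e.g. llama-cpp-python's /v1 endpoint) instead
// of api.openai.com by passing a baseURL through ChatOpenAI's configuration.
import { ChatOpenAI } from "langchain/chat_models/openai";

// Hypothetical setting; in practice this would come from the app's config.
const customBaseUrl = "http://localhost:8080/v1";

const chat = new ChatOpenAI({
  openAIApiKey: "sk-local-placeholder", // local servers ignore the key, but the client requires one
  modelName: "gpt-3.5-turbo",
  streaming: true,
  // Only override the endpoint when a custom base URL is configured.
  configuration: customBaseUrl ? { baseURL: customBaseUrl } : undefined,
});
```
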

bioshazard commented 1 year ago

btw @mabry1985, idk about you but I feel like this is delivered. I've moved to Acai as my daily chat driver with MistralOrca.

mabry1985 commented 1 year ago

@bioshazard yup, thanks! Nice job here 🥇