Closed: bioshazard closed this issue 1 year ago
I was actually able to get the chat working with my local service after adding a baseURL override in the configuration of the ChatOpenAI instantiation in chat-model.ts. Will push later.
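For anyone following along, here is a minimal sketch of what that kind of override might look like. This assumes LangChain JS's `ChatOpenAI`, where `configuration` accepts OpenAI client options (older versions used `basePath` instead); the actual change in chat-model.ts may differ, and the URL, model name, and key are placeholders.

```ts
import { ChatOpenAI } from "@langchain/openai";

// Sketch: point ChatOpenAI at a local OpenAI-compatible server instead of api.openai.com.
// The URL, model name, and key below are placeholders, not values from this issue.
const chatModel = new ChatOpenAI({
  modelName: "mistral-orca",             // whatever model the local server is hosting
  openAIApiKey: "sk-no-key-required",    // local servers generally ignore the key
  configuration: {
    baseURL: "http://localhost:8080/v1", // the baseURL override mentioned above
  },
});
```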
btw @mabry1985, idk about you but I feel like this is delivered. I have moved to Acai as my daily chat driver with MistralOrca.
@bioshazard yup, thanks! Nice job here 🥇
I am hosting a `llama-cpp-python[server]` instance on my localhost. I request that support be added to specify a custom base path like `http://localhost:8080`, e.g. something along the lines of the sketch below.

I tried to get this working myself, but my initial attempt seemed to be complicated by the multiple places where the OpenAI instance is instantiated. Thanks!
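For illustration only, this is roughly the kind of base-path override being requested, written against the plain `openai` TypeScript SDK. The port, path, model name, and key are assumptions rather than values from this issue; llama-cpp-python's server exposes its OpenAI-compatible API under `/v1`.

```ts
import OpenAI from "openai";

// Sketch: send chat completions to a local llama-cpp-python[server] instance.
// Everything below (URL, model name, key) is illustrative; adjust to the local setup.
const client = new OpenAI({
  baseURL: "http://localhost:8080/v1", // the custom base path this issue asks to make configurable
  apiKey: "sk-no-key-required",        // local servers usually accept any key
});

const completion = await client.chat.completions.create({
  model: "mistral-orca",
  messages: [{ role: "user", content: "Hello from a local OpenAI-compatible server!" }],
});
console.log(completion.choices[0].message.content);
```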