Closed silarsis closed 1 year ago
Thanks for the issue, @silarsis. That sounds very reasonable to me, and I agree it should be straightforward. We're going to group up first thing in the morning to discuss the best way to expose these configurations and a PR.
async-openai
ref: https://github.com/64bit/async-openai/pull/67
@silarsis The OpenAI Rust client we use is actively working on exposing this ability: https://github.com/64bit/async-openai/pull/67
If you're hoping to kick around Motorhead more immediately, please shoot me an email at james@getmetal.io and we can discuss options (e.g., pointing to the commit SHA in the PR above). In either case, I'd be happy to dig into your particular use case and see if we can help in any way.
I'm trying to build something in an enterprise environment where we have our own Azure OpenAI instances and an in-house OpenAI-equivalent API. Motorhead doesn't pass through settings like base_url and deployment_name, so I can't point it at our own APIs. I think it would be relatively straightforward to pass them into the OpenAI client constructor, but I don't know Rust, so I'm not sure how to put together a PR for this.
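For anyone sketching what such a pass-through could look like, here is a minimal, hypothetical example using only the Rust standard library. The struct name and the OPENAI_API_BASE / OPENAI_DEPLOYMENT_NAME environment variables are illustrative assumptions, not Motorhead's actual configuration surface; the idea is just to read endpoint settings from the environment and fall back to the public OpenAI API when they are unset:

```rust
/// Hypothetical endpoint configuration (names are illustrative, not
/// Motorhead's actual API) for pointing an OpenAI-compatible client at a
/// custom endpoint such as Azure OpenAI or an in-house equivalent.
#[derive(Debug, Clone)]
pub struct OpenAiEndpointConfig {
    /// Base URL of the OpenAI-compatible API.
    pub api_base: String,
    /// Azure-style deployment name, if the endpoint requires one.
    pub deployment_name: Option<String>,
}

impl OpenAiEndpointConfig {
    /// Build the config from environment variables, falling back to the
    /// public OpenAI API when no custom base URL is provided.
    pub fn from_env() -> Self {
        Self {
            api_base: std::env::var("OPENAI_API_BASE")
                .unwrap_or_else(|_| "https://api.openai.com/v1".to_string()),
            deployment_name: std::env::var("OPENAI_DEPLOYMENT_NAME").ok(),
        }
    }
}
```

A struct like this could then be handed to whatever client constructor the project uses; the actual wiring would depend on the configuration options async-openai ends up exposing in the PR above.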