Closed krrishdholakia closed 1 year ago
Tracking 20 repos that might find this helpful:
updates:
How would this be possible? When you set a URL for the OpenAI base, it expects a server:
openai.api_base = "https://localhost:500"
The proposed interface here is a function:
openai.api_base = litellm.translate_api_call(custom_api_base, translate_function)
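A rough sketch of what that proposed interface could look like. Note this is purely illustrative: `translate_api_call`, `translate_function`, and the payload fields are hypothetical names, not part of any released litellm API.

```python
# Hypothetical sketch of the proposed interface: translate_api_call pairs a
# custom base URL with a function that rewrites an OpenAI-style request
# payload into the custom provider's format. All names are illustrative.
def translate_function(openai_payload):
    # e.g. flatten OpenAI chat messages into a single prompt string
    prompt = "\n".join(m["content"] for m in openai_payload["messages"])
    return {"inputs": prompt, "max_tokens": openai_payload.get("max_tokens", 256)}

def translate_api_call(custom_api_base, translate):
    def call(openai_payload):
        body = translate(openai_payload)
        # a real implementation would POST `body` to custom_api_base here;
        # this sketch just returns what it would send
        return {"url": custom_api_base, "body": body}
    return call

call = translate_api_call("https://localhost:500", translate_function)
```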
Open to suggestions on this
Tried doing something like this:
def custom_function():
    print("custom function called")

openai.api_base = custom_function
but openai makes an HTTP POST to the base URL (i.e. it expects a live route like @app.route('/chat/completions', methods=["POST"])), so assigning a function to api_base doesn't work
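To make the constraint concrete: api_base has to point at an actual HTTP server exposing POST /chat/completions. A minimal stdlib-only sketch of such a local proxy (the route and canned response shape are illustrative; a real proxy would forward the translated request to the target provider):

```python
# Minimal sketch of a local server that openai.api_base could point at.
# Uses only the standard library; the echoed completion is a placeholder.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class ProxyHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/chat/completions":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        request_body = json.loads(self.rfile.read(length) or b"{}")
        # A real proxy would translate `request_body` for the target
        # provider here; this sketch returns a canned completion.
        reply = {
            "object": "chat.completion",
            "choices": [{"message": {"role": "assistant", "content": "ok"}}],
        }
        payload = json.dumps(reply).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("localhost", 0), ProxyHandler)  # port 0 = auto-assign
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://localhost:{server.server_port}"
```

With a server like this running, openai.api_base = base would give the OpenAI client a real endpoint to POST to.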
closing this
There are a lot of different LLM deployment providers. How do I easily replace my OpenAI base URL with their URL as a proxy? - https://github.com/petals-infra/chat.petals.dev/issues/20, https://www.banana.dev/, etc.