Closed: ianu82 closed this issue 1 month ago
Hey @ianu82, you can do this with LiteLLM.
Via the OpenAI-compatible proxy:

```shell
$ litellm --model bedrock/anthropic.claude-v2
# UVICORN: OpenAI Compatible Endpoint running on http://0.0.0.0:8000
```
Then point MindsDB's OpenAI engine at the proxy:

```sql
CREATE MODEL model_name
PREDICT column_to_be_predicted
USING
    engine = 'openai',
    model_name = 'openai_model_name',
    api_key = 'my-fake-key',
    api_base = 'http://0.0.0.0:8000';
```
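Before wiring up MindsDB, it can help to confirm the proxy answers OpenAI-style chat-completion requests. A minimal sketch, assuming the proxy is running locally on port 8000 and serves the standard `/chat/completions` route (the URL and route are assumptions, not confirmed from the thread):

```python
import json
import urllib.request

# Assumed local LiteLLM proxy endpoint; adjust host/port/route as needed.
PROXY_URL = "http://0.0.0.0:8000/chat/completions"


def build_request(prompt, model="bedrock/anthropic.claude-v2"):
    """Build an OpenAI-style chat-completion payload for the proxy."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def ask_proxy(prompt):
    """POST the payload to the proxy (requires the proxy to be running)."""
    payload = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        PROXY_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    print(ask_proxy("Hello from MindsDB?"))
```

If this round-trips, the same `api_base` should work from the MindsDB `CREATE MODEL` statement above.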
We're working on our deployed proxy right now - https://github.com/BerriAI/litellm/tree/main/openai-proxy; I'd appreciate any feedback you might have here!
Short description and motivation for the proposed feature
Support using AWS Bedrock models from within MindsDB.
Video or screenshots
No response
Describe some possible solutions
No response
Anything else?
No response