Open actus-1 opened 1 year ago
Hi @actus-1 @zorin-mv - I believe we can make this easier. I'm the maintainer of LiteLLM - we let you deploy an LLM proxy that calls 100+ LLMs (Bedrock, OpenAI, Anthropic, etc.) in one format: https://github.com/BerriAI/litellm/tree/main/openai-proxy.
If this looks useful (we're used in production), please let me know how we can help.
Bedrock request
curl http://0.0.0.0:8000/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
"model": "bedrock/anthropic.claude-instant-v1",
"messages": [{"role": "user", "content": "Say this is a test!"}],
"temperature": 0.7
}'
gpt-3.5-turbo request
curl http://0.0.0.0:8000/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
"model": "gpt-3.5-turbo",
"messages": [{"role": "user", "content": "Say this is a test!"}],
"temperature": 0.7
}'
claude-2 request
curl http://0.0.0.0:8000/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
"model": "claude-2",
"messages": [{"role": "user", "content": "Say this is a test!"}],
"temperature": 0.7
}'
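Since the proxy speaks the OpenAI chat-completions format for every backend, only the "model" field changes across the three requests above. A minimal sketch of the same calls from Python's standard library (assuming a proxy running locally at http://0.0.0.0:8000; the function names here are illustrative, not part of LiteLLM):

```python
import json
import urllib.request

PROXY_URL = "http://0.0.0.0:8000/v1/chat/completions"  # assumed local proxy

def chat_request(model: str, content: str, temperature: float = 0.7) -> dict:
    # The same OpenAI-style payload works for Bedrock, OpenAI, and
    # Anthropic models; only the "model" string selects the backend.
    return {
        "model": model,
        "messages": [{"role": "user", "content": content}],
        "temperature": temperature,
    }

def send(payload: dict) -> dict:
    # POST the JSON payload to the proxy (requires the proxy to be running).
    req = urllib.request.Request(
        PROXY_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Swapping `"gpt-3.5-turbo"` for `"bedrock/anthropic.claude-instant-v1"` or `"claude-2"` is the only change needed to switch providers.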
I would like to request support for the Anthropic Claude API when it is deployed on AWS Bedrock. This enhancement would let users leverage Claude's capabilities within anthropic-gui while relying on Bedrock's AWS infrastructure for deployment and scaling, significantly broadening the use cases and improving the usability of anthropic-gui.
Looking forward to the community's thoughts on this. Thank you!
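For reference, calling Claude directly on Bedrock (without a proxy) goes through boto3's `bedrock-runtime` client. A hedged sketch of what such a backend call could look like - the region, model ID, and helper names are assumptions for illustration, and the invoke step requires AWS credentials with Bedrock access:

```python
import json

def build_claude_request(prompt: str, max_tokens: int = 256) -> dict:
    # Bedrock's Anthropic text-completion models expect the
    # Human/Assistant prompt format and "max_tokens_to_sample".
    return {
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": max_tokens,
        "temperature": 0.7,
    }

def invoke_claude(prompt: str) -> str:
    # Assumes AWS credentials are configured and Bedrock model access
    # is enabled in the chosen region (us-east-1 here, as an example).
    import boto3
    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    response = client.invoke_model(
        modelId="anthropic.claude-instant-v1",
        contentType="application/json",
        accept="application/json",
        body=json.dumps(build_claude_request(prompt)),
    )
    return json.loads(response["body"].read())["completion"]
```

Supporting this in anthropic-gui would essentially mean swapping the direct Anthropic API call for an invocation like the one above.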