Closed — stevensu1977 closed this issue 1 year ago
Hey @stevensu1977
You can use LiteLLM to spin up an OpenAI-compatible server that translates between OpenAI and non-OpenAI calls:
https://docs.litellm.ai/docs/proxy_server
```shell
litellm --model bedrock/anthropic.claude-instant-v1
# OpenAI-compatible server running on http://0.0.0.0:8000

k8sgpt auth add --backend localai --model bedrock/anthropic.claude-instant-v1 --baseurl http://0.0.0.0:8000
```
Hi @krrishdholakia, thanks for your comments. LiteLLM is a good idea and a great project, but I think that if k8sgpt supported Amazon Bedrock natively, it could work independently, like other Kubernetes tools.
Checklist
Is this feature request related to a problem?
None
Problem Description
I use AWS EKS and want to use the Amazon Bedrock Claude model, so I forked the repo and implemented it.
Solution Description
Add Amazon Bedrock as a native AI provider in k8sgpt.
Benefits
If you use AWS EKS, you can run k8sgpt with the Amazon Bedrock AI provider, and your data will stay within your Amazon VPC.
Potential Drawbacks
No response
Additional Information
No response