Closed alesak23 closed 5 months ago
Hi @alesak23, is the `base_url` parameter of the `Anthropic` client or the `AnthropicBedrock` client what you are looking for?
It's more complicated. We call https://llmgateway.company.com/claude-3-sonnet-20240229-v1, but if I set that as the base URL with the `Anthropic` client, it calls https://llmgateway.company.com/claude-3-sonnet-20240229-v1/v1/messages. So we would have to add another endpoint, but that's probably fine.
In addition, Bedrock requires `anthropic_version` in the body, which the `Anthropic` client doesn't allow, but we can add that in our lambda.
We cannot use the `AnthropicBedrock` client, because it looks for AWS keys and there is no way around that.
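The two gateway-side fixes mentioned above could be sketched as a single body transform in the lambda (hypothetical function name; only the missing `anthropic_version` field and the Bedrock routing come from this thread — the version string shown is the value Bedrock's Claude Messages API currently expects):

```python
# Hypothetical lambda-side adapter: the Anthropic SDK POSTs to
# {base_url}/v1/messages, so the gateway endpoint accepts that path,
# then injects the anthropic_version field Bedrock requires but the
# SDK does not send.

BEDROCK_ANTHROPIC_VERSION = "bedrock-2023-05-31"  # Bedrock's required value

def to_bedrock_body(sdk_body: dict) -> dict:
    """Copy the SDK's request body and adapt it for Bedrock's InvokeModel."""
    body = dict(sdk_body)
    body.setdefault("anthropic_version", BEDROCK_ANTHROPIC_VERSION)
    # Bedrock encodes the model in the invoked endpoint, not the body.
    body.pop("model", None)
    return body
```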
Could you use the `Anthropic` client, add the `/v1/messages` path, and send `extra_body={'anthropic_version': '…'}` in your requests?
Hello, I'm working for a global corporation, and all requests to LLMs are routed through a company API. So for Claude 3 we would use, for example:
https://llmgateway.company.com/claude-3-sonnet-20240229-v1
Then it calls AWS Bedrock (and/or maybe Vertex in the future) on the backend. This works fine with `requests` in Python, but it's not compatible with this SDK.
The OpenAI library makes this configurable. Is it possible to add this option here? Thank you
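For reference, the direct call that already works could look roughly like this (a sketch using the `requests` library; the header names and auth token are placeholders, since the post doesn't show them, and the body fields are the ones Bedrock's Claude Messages API expects):

```python
import requests

GATEWAY = "https://llmgateway.company.com/claude-3-sonnet-20240229-v1"

def call_gateway(prompt: str, token: str) -> dict:
    # Plain HTTP POST straight to the company endpoint; the gateway
    # forwards the body to Bedrock behind the scenes.
    resp = requests.post(
        GATEWAY,
        headers={"Authorization": f"Bearer {token}"},  # placeholder auth scheme
        json={
            "anthropic_version": "bedrock-2023-05-31",  # required by Bedrock
            "max_tokens": 1024,
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```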