MurphyLo closed this issue 4 months ago
Hi Murphy, are you working on this feature? I need it too and would be glad to help.
You can relay the Mistral API through litellm, then set the proxy address in one-api to the local litellm port. Specifically, this uses LiteLLM's OpenAI Proxy Server. Reference link:
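The relay described above means one-api talks to a local LiteLLM proxy over the standard OpenAI-compatible endpoint. A minimal sketch of such a client request, assuming LiteLLM's proxy is running on its default port 4000 and a Mistral model name of `mistral/mistral-small` (both port and model name are assumptions, not from this thread):

```python
import json
from urllib import request

# Assumption: LiteLLM proxy started locally, e.g. `litellm --model mistral/mistral-small`,
# listening on its default port 4000.
LITELLM_BASE = "http://localhost:4000"

def build_chat_request(prompt: str, model: str = "mistral/mistral-small") -> request.Request:
    """Build an OpenAI-compatible chat completion request aimed at the local LiteLLM proxy."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{LITELLM_BASE}/v1/chat/completions",
        data=json.dumps(body).encode(),
        headers={
            "Content-Type": "application/json",
            # Placeholder key; LiteLLM only enforces keys if configured to.
            "Authorization": "Bearer sk-anything",
        },
    )

if __name__ == "__main__":
    # Requires the LiteLLM proxy to actually be running locally.
    with request.urlopen(build_chat_request("Hello")) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

In one-api, the channel's proxy address would then point at `http://localhost:4000`, so one-api never talks to Mistral directly.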
+1 for this feature.
+1 for this feature.
Definitely should support this, but I need a key for testing purposes.
I can provide one, but how do I contact you?
+1, AWS Bedrock now has Claude 3 and is very easy to use. Hope this feature can be added.
> Definitely should support this, but I need a key for testing purposes.
Are you working on it? I've got a key and I'd be glad to dive in.
+1. For AWS Bedrock support, you can refer to litellm: https://docs.litellm.ai/docs/providers/bedrock
+1. What's the current development progress? Is there a development branch? I can help with the development.
+1 for this feature.
+1
+1 for this feature.
+1 for this feature.
Routine checks
Feature description
Provide access to the models hosted on AWS Bedrock using IAM profiles and keys. Since Anthropic doesn't give out many Claude API keys, this is a viable alternative for people who want Claude.
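For reference, calling Claude on Bedrock with IAM credentials boils down to an `invoke_model` call carrying a body in the Anthropic Messages format. A minimal sketch, assuming `boto3` is installed, AWS credentials are configured, and using an example Claude 3 model id (the region and model id are illustrative, not from this issue):

```python
import json

# Example Bedrock model id for Claude 3 Sonnet (illustrative).
MODEL_ID = "anthropic.claude-3-sonnet-20240229-v1:0"

def build_claude_body(prompt: str, max_tokens: int = 256) -> dict:
    """Request body in the Anthropic Messages format that Bedrock expects."""
    return {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }

if __name__ == "__main__":
    import boto3  # requires AWS IAM credentials (profile, env vars, or instance role)

    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    resp = client.invoke_model(modelId=MODEL_ID, body=json.dumps(build_claude_body("Hello")))
    print(json.loads(resp["body"].read())["content"][0]["text"])
```

A one-api channel for Bedrock would essentially translate OpenAI-style chat requests into this body and sign them with the channel's IAM key pair instead of an Anthropic API key.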
Reference