-
I just tried the brand-new `converse` and `invokeModel` helpers from `@aws-appsync/utils/ai` to write a quick Bedrock invocation resolver. Here's some prototype code:
```typescript
import { type Context } from …
```
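Since the snippet above is truncated, here is a minimal, self-contained sketch of the request payload such a resolver might build. The message/content-block shape follows the Bedrock Converse API, but the helper name and the model ID used in the usage example are assumptions, not taken from the snippet:

```typescript
// Hypothetical sketch: a real resolver would import helpers from
// "@aws-appsync/utils/ai"; here we only model the request payload so its
// shape can be inspected without an AppSync runtime.

// Content blocks and messages as used by the Bedrock Converse API.
interface ContentBlock {
  text: string;
}

interface Message {
  role: "user" | "assistant";
  content: ContentBlock[];
}

interface ConverseRequest {
  modelId: string;
  messages: Message[];
}

// Build a single-turn Converse request for one user prompt.
export function buildConverseRequest(modelId: string, prompt: string): ConverseRequest {
  return {
    modelId,
    messages: [{ role: "user", content: [{ text: prompt }] }],
  };
}
```

In a real resolver this object would be returned from the `request` handler; the sketch only demonstrates the payload structure.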
-
Add Claude 3 Support
-
The content is as follows:
Hello! I'm Claude 3.5 Sonnet, an AI assistant developed by Anthropic. I can help you in Chinese and am particularly good at programming-related questions. Do you have a specific question I can help with?
-
**Routine checklist**
[//]: # (Delete the existing space inside the brackets and fill in an x)
+ [x] I have confirmed there is no similar existing issue
+ [x] I have confirmed I have upgraded to the latest version
+ [x] I have read the project README in full, especially the FAQ section
+ [x] I understand and am willing to follow up on this issue, helping with testing and providing feedback
+ [x] I understand and accept the above, and I understand that the maintainers' time is limited; **issues that do not follow the rules may…
-
**Describe the bug**
When using the LlamaIndex instrumentor and calling BedrockConverse, all expected spans are captured, but after switching to llama-index-llms-anthropic the LLM spans go missing.
**To Re…
-
### What happened?
When working with Aider and the new Claude Sonnet 3.5 v2 model in Bedrock, LiteLLM will sometimes hang for about 5 minutes and then give me the following error about 10 - 15 mi…
-
Type: Bug
Trying to use Claude 3.5 Sonnet has not worked for me for at least a week.
Extension version: 0.22.4
VS Code version: Code 1.95.3 (Universal) (f1a4fb101478ce6ec82fe9627c43efbf9e98c81…
-
Using Claude 3.5...
![image](https://github.com/user-attachments/assets/ce558f3e-93ea-4f64-93dd-41db7a6c3132)
![image](https://github.com/user-attachments/assets/35b0461b-c2c0-4251-8d5a-a11364be…
-
### Have you searched for similar requests?
Yes
### Is your feature request related to a problem? If so, please describe.
_No response_
### Describe the solution you'd like
As title, the goal is …
-
### The Feature
We would like a better way to handle error responses from the LiteLLM proxy. Currently we get errors like this from the API:
```json
{
  "error": {
    "code": "400",
    "message"…
```
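One way to make such responses easier to consume is to normalize them on the client. Here is a hedged sketch: the envelope shape (`{ "error": { "code", "message" } }`) is taken from the truncated example above, while the helper name `parseProxyError` is an assumption for illustration:

```typescript
// Hypothetical client-side helper for the proxy error envelope shown above.
interface ProxyError {
  code: string;
  message: string;
}

// Extract a typed ProxyError from a raw response body, or return null when
// the body is not JSON or does not match the expected envelope.
export function parseProxyError(body: string): ProxyError | null {
  try {
    const parsed = JSON.parse(body);
    const err = parsed?.error;
    if (err && typeof err.code === "string" && typeof err.message === "string") {
      return { code: err.code, message: err.message };
    }
    return null;
  } catch {
    return null;
  }
}
```

Returning `null` for unrecognized bodies lets callers fall back to the raw response instead of throwing on unexpected shapes.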