-
### Issue
I hope it will be available not only through Anthropic but also through Amazon Bedrock.
https://aws.amazon.com/blogs/aws/anthropics-claude-3-5-sonnet-model-now-available-in-amazon-bedrock-…
-
Deployed with Docker; the system is Ubuntu 22.04.5 LTS.
Using the same API backend directly in Immersive Translate, concurrency reaches a dozen or more requests per second (translation results appear chunk by chunk).
But when going through uni-api, only about one translation result per second comes out, and the log likewise shows roughly one request per second.
Speed test from the server to my local machine shows one to two hundred Mbps both up and down, with latency stable around 150 ms on a US-West optimized route.
I use the Chaitin SafeLine WAF firewall, but it is disabled for the corresponding uni-api endpoint…
-
### 📦 Deployment Method
Official installer
### 📌 Software Version
2.15.1
### 💻 System Environment
Windows, macOS
### 📌 System Version
Sonoma 14.5
### 🌐 Browser
Chrome, Safari
### 📌 Browser Version
Safari 17.5
### 🐛 Problem Description
When a custom model name duplicates an existing model name, the API format of the matching model in the model list is forcibly used. For ex…
-
### Self Checks
- [X] This is only for bug reports; if you would like to ask a question, please head to [Discussions](https://github.com/langgenius/dify/discussions/categories/general).
- [X] I have s…
-
### Opencommit Version
3.0.3
### Node Version
v18.7.0
### NPM Version
9.8.1
### What OS are you seeing the problem on?
Other Linux Distro
### What happened?
stuck on
![image](https://github…
-
### How are you running AnythingLLM?
AnythingLLM desktop app
### What happened?
As the title and followed picture indicated. I had tried the same question in ollama, qwen worked well (fig. 1). This…
-
I'm using `AnthropicBedrock` with `model=eu.anthropic.claude-3-5-sonnet-20240620-v1:0`. I want to use `max_tokens=8192`; however, there is an implicit cap of 4096 tokens.
While boto3 will throw an er…
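The reported behavior amounts to a silent clamp. A minimal sketch of that behavior (the cap value is the observed 4096, not something read from SDK or Bedrock source):

```python
# Sketch of the OBSERVED behavior, not the actual SDK/Bedrock code:
# a requested max_tokens above the cap is silently clamped.
OBSERVED_CAP = 4096  # observed effective limit for this model via Bedrock

def effective_max_tokens(requested: int, cap: int = OBSERVED_CAP) -> int:
    """Return the max_tokens the request effectively runs with."""
    return min(requested, cap)

print(effective_max_tokens(8192))  # → 4096, despite requesting 8192
```

The point of the report is that nothing fails loudly: the request succeeds but generation stops at the clamped limit.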
-
### Feature Description
Support Claude 3.5 Sonnet as a multimodal LLM in llama-index-multi-modal-llms-anthropic
### Reason
llama-index-multi-modal-llms-anthropic does not support Claude 3.5 S…
-
### What happened?
Anthropic Bedrock models don't seem to work if we pass in `max_tokens`:
```python
from litellm import completion
completion(
    "bedrock/us.anthropic.claude-3-5-sonnet-2…
```
-
### Describe the feature
Anthropic have introduced [Prompt Caching for Claude](https://www.anthropic.com/news/prompt-caching) which allows significantly faster inference. Adding support for this wo…
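For reference, prompt caching marks a reusable prompt block with `cache_control`. A payload-shape sketch based on Anthropic's announcement (model id and text are placeholders, and no network call is made):

```python
# Payload-shape sketch only; field names follow Anthropic's prompt-caching
# announcement, with placeholder model and text values.
payload = {
    "model": "claude-3-5-sonnet-20240620",
    "max_tokens": 1024,
    "system": [
        {
            "type": "text",
            "text": "<long shared context to cache>",
            "cache_control": {"type": "ephemeral"},  # marks this block as cacheable
        }
    ],
    "messages": [{"role": "user", "content": "Question about the cached context"}],
}
print(payload["system"][0]["cache_control"])
```

Subsequent requests that repeat the marked block can then reuse the cached prefix, which is where the faster inference comes from.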