-
### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain.js documentation with the integrated search.
- [X] I used the GitHub search to find a …
-
When attempting to create a new Anthropic agent with the latest model, I received this error:
⚠️ Error: The provided configuration does not result in a working agent. The following error was enco…
-
```python
from edsl import Model
import time
models_list = [['Austism/chronos-hermes-13b-v2', 'deep_infra', 0], ['BAAI/bge-base-en-v1.5', 'together', 1], ['BAAI/bge-large-en-v1.5', 'together', …
```
-
For now, Claude 3.5 Sonnet (v2) can only be used through an inference profile.
We should use `us.anthropic.claude-3-5-sonnet-20241022-v2:0` rather than `anthropic.claude-3-5-sonnet-20241022-v2:0`.
…
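The distinction above is just a geography prefix on the model ID. A minimal sketch of building the cross-region inference-profile ID, where the helper function is ours for illustration (the real call simply passes the prefixed ID wherever a Bedrock `modelId` is expected):

```python
# Sketch: building a cross-region inference profile ID for Bedrock.
# The "us." / "eu." / "apac." prefix selects the geography whose regions
# Bedrock may route the request across; `inference_profile_id` is an
# illustrative helper, not part of any SDK.

BASE_MODEL_ID = "anthropic.claude-3-5-sonnet-20241022-v2:0"

def inference_profile_id(model_id: str, geo_prefix: str = "us") -> str:
    """Prefix a Bedrock model ID with a cross-region geography code."""
    return f"{geo_prefix}.{model_id}"

profile_id = inference_profile_id(BASE_MODEL_ID)
print(profile_id)  # us.anthropic.claude-3-5-sonnet-20241022-v2:0

# With boto3, the profile ID goes where the model ID would, e.g.:
#   client = boto3.client("bedrock-runtime")
#   client.converse(modelId=profile_id, messages=[...])
```

The same helper covers the EU and APAC profiles by swapping the prefix.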
-
To get a better picture of how many tokens each message Claude sends during computer use consumes, I am trying to use the new token-counting endpoint, which is currently in beta.
This works …
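A hedged sketch of what such a count request looks like with the Anthropic Python SDK. The payload assembly below runs offline; the commented-out call assumes the SDK's beta token-counting method, whose exact namespace may differ by SDK version:

```python
# Sketch: counting tokens for a computer-use message before sending it.
# Only the request payload is built here; the network call is shown as a
# comment because it needs an API key and the beta endpoint.

def build_count_request(model: str, messages: list) -> dict:
    """Assemble parameters for a token-count call (no network I/O)."""
    return {"model": model, "messages": messages}

params = build_count_request(
    "claude-3-5-sonnet-20241022",
    [{"role": "user", "content": "Take a screenshot and describe it."}],
)

# The actual call (requires ANTHROPIC_API_KEY; method name is an
# assumption based on the beta SDK at the time):
#   import anthropic
#   client = anthropic.Anthropic()
#   result = client.beta.messages.count_tokens(**params)
#   print(result.input_tokens)
```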
-
# RFC: Optional Enclave Pattern for Computer Use Demo
- Authors: Claude (@anthropic), @jwalsh
- Reviewers: @seanjensengrey
- Status: DRAFT
- Created: 2024-11-01
## Abstract
This RFC …
-
### 🥰 Requirement description
AWS Bedrock is rolling out cross-region inference for Claude 3.5 and some Llama models. This will cause compatibility issues with the current lobe.env.
### 🧐 Proposed solution
designated the region lik…
-
### What would you like to see?
It would be helpful to support prompt caching for Claude 3.5 Sonnet. This cuts down not only on LLM costs but also on latency.
https://docs.anthropic.com/en/doc…
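For reference, the cache breakpoint is expressed as a `cache_control` field on a content block. A minimal sketch (payload construction only, no API call), assuming the Messages API shape documented at the time; the toggle helper is ours:

```python
# Sketch: marking a large, static system prompt as cacheable via
# `cache_control`. Payload only; a real request would go through the
# Anthropic SDK with the prompt-caching beta enabled.
import copy

LONG_CONTEXT = "...many thousands of tokens of reference notes..."

request = {
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "system": [
        {
            "type": "text",
            "text": LONG_CONTEXT,
            # Everything up to and including this block gets cached.
            "cache_control": {"type": "ephemeral"},
        }
    ],
    "messages": [{"role": "user", "content": "Summarize the notes above."}],
}

def without_caching(req: dict) -> dict:
    """Illustrative toggle: strip cache markers when caching is disabled."""
    r = copy.deepcopy(req)
    for block in r.get("system", []):
        block.pop("cache_control", None)
    return r
```

Stripping the field rather than branching on two payload shapes keeps the rest of the request code identical whether caching is on or off.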
-
https://docs.anthropic.com/en/docs/build-with-claude/pdf-support
> The new Claude 3.5 Sonnet (`claude-3-5-sonnet-20241022`) model now supports PDF input and understands both text and visual content…
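A sketch of what a PDF content block looks like under the base64 `document` shape that page describes; this only encodes the bytes, no API call is made:

```python
# Sketch: wrapping raw PDF bytes as a base64 `document` content block,
# assuming the document-block shape from the PDF-support docs.
import base64

def pdf_document_block(pdf_bytes: bytes) -> dict:
    """Encode PDF bytes into a Messages API document content block."""
    return {
        "type": "document",
        "source": {
            "type": "base64",
            "media_type": "application/pdf",
            "data": base64.b64encode(pdf_bytes).decode("ascii"),
        },
    }

block = pdf_document_block(b"%PDF-1.4 minimal example")

# The block would be sent alongside a text prompt in one user message:
#   {"role": "user",
#    "content": [block, {"type": "text", "text": "Summarize this PDF."}]}
```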
-
As the title says: please consider adding support for toggling Anthropic's prompt caching on and off. Especially with long notes, or multiple notes referenced in a static fashion, this would save a lot …