-
**Describe the bug**
Cannot get a complete response from Claude 3. It will almost always stop before it's finished.
**Desktop (please complete the following information):**
- OS: Windows 11
- B…
-
With the most recent version, ChatBedrock appears not to work when I call a llama3 model via chain.ainvoke({}).
```python
\langchain_aws\llms\bedrock.py", line 867, in _aprepa…
```
-
### System Info
OS version : macOS Sonoma 14.4
Python version: 3.10
The current version of pandasai being used: 2.2.6
### 🐛 Describe the bug
I copy-pasted the code from the tutorial:
```
import o…
```
-
I am trying to set up bedrock-claude-chat and everything seems fine. After running the required commands on my EC2 instance, the front-end page output appears and I am able to connect to the front-end web …
-
Hi everyone. As you know, some of Anthropic's latest Claude models are better and cheaper than OpenAI's models, so I created a new generator file to use Anthropic's models and made changes to the manager.py fil…
-
All options are for Claude v1.3; would you like to add the SotA Claude v3?
-
### Describe the bug
While using Open Interpreter, I noticed that the built-in system prompts do not function properly when the Claude model (specified via the OpenAI format) is used. In contrast, wh…
-
I was not able to see Claude 3 Opus or the other Claude LLM model selections in IntelliJ. Can you let me know what I have to do to get them listed?
[Screencast from 2024-04-10 09-11-16.web…
-
```python
from langchain_aws import BedrockLLM

llm = BedrockLLM(model_id="anthropic.claude-3-sonnet-20240229-v1:0", region_name="us-east-1")
```