-
I have two closely related issues.
## Issue 1
I get back an empty LLM response. It's unclear to me whether this is a langchain -> bedrock integration issue or entirely on the AWS side. I will try to …
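One way to narrow down where the empty response comes from is to call the model directly through boto3, bypassing LangChain, and inspect the raw output. A minimal sketch, assuming the `bedrock-runtime` Converse API is available in the account and region; the model ID, region, and prompt are placeholders:

```python
import boto3

# Call Bedrock directly, without LangChain, to see whether the raw
# response text is also empty. Model ID and region are assumptions.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{"role": "user", "content": [{"text": "Say hello in one sentence."}]}],
    inferenceConfig={"maxTokens": 256},
)

# If this prints text but the LangChain call does not, the problem is
# more likely on the integration side than on AWS.
print(response["output"]["message"]["content"][0]["text"])
print(response.get("stopReason"))
```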
-
Suggested Shell Commands:
g…
-
Aider version: 0.56.0
Python version: 3.12.5
Platform: Linux-6.10.6-amd64-x86_64-with-glibc2.40
Python implementation: CPython
Virtual environment: Yes
OS: Linux 6.10.6-amd64 (64bit)
Git version…
-
When the response has `stop_reason: "max_tokens"`, the generator prompt returns an empty string. I wonder if it should raise an error instead? This is the response body I raised from inside the gem:
```
/Us…
-
**Bug Description**
The metric `groundedness_measure_with_cot_reasons_consider_answerability` raises an error when evaluating abstention.
**To Reproduce**
```python
import boto3
from trulens.pr…
-
This will be for the `standard/llms/concrete/ShuttleAIModel`
The API key name in the test file should be:
SHUTTLEAI_API_KEY
Here is documentation:
https://docs.shuttleai.app/getting-started/…
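A minimal sketch of how the test file might read that key, assuming pytest; the test name and skip behavior are illustrative, not taken from the repo:

```python
import os

import pytest

# The test should read the key from the SHUTTLEAI_API_KEY environment
# variable and skip when it is not set, so CI without the key still passes.
API_KEY = os.getenv("SHUTTLEAI_API_KEY")


@pytest.mark.skipif(API_KEY is None, reason="SHUTTLEAI_API_KEY not set")
def test_shuttleai_model_smoke():
    # Placeholder assertion; the real test would construct the
    # ShuttleAIModel with this key and exercise a simple call.
    assert API_KEY
```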
-
### Issue
When using the OpenRouter Sonnet model, it still tells me that I have a max token output limit of 4096.
OpenRouter states that the limit is 8192 tokens.
### Version and model in…
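One way to check whether the 4096 cap is a client-side default rather than an OpenRouter limit is to request the model directly with an explicit `max_tokens`. A rough sketch, assuming litellm (which aider uses for model calls), an `OPENROUTER_API_KEY` in the environment, and a placeholder model slug:

```python
import litellm

# Ask OpenRouter's Sonnet endpoint for up to 8192 output tokens directly.
# The model slug is an assumption; adjust it to the exact OpenRouter name.
response = litellm.completion(
    model="openrouter/anthropic/claude-3.5-sonnet",
    messages=[{"role": "user", "content": "Write a long story."}],
    max_tokens=8192,
)

# If this succeeds, the 4096 limit reported in aider is coming from
# client-side model metadata rather than from OpenRouter itself.
print(response.choices[0].message.content)
print(response.usage)
```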
-
Hello there.
I am not 100% sure whether this is a mistake or whether my understanding is incorrect. When I run Notebook 5, I expect to see both the opening and closing XML tags in the output, as I have i…
-
### System Info
OS version: macOS Sonoma 14.4
Python version: 3.10
The current version of pandasai being used: 2.2.6
### 🐛 Describe the bug
I copy-pasted the code from the tutorial
```
import o…