-
Hi, when I run demo.ipynb:
import json
with open('/root/llm-reasoners/examples/CoT/blocksworld/prompts/pool_prompt_v1.json') as f:
    prompt = json.load(f)  # /root/llm-reasoners/examples
evaluator …
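In isolation, the loading step above can be written as a self-contained sketch. The temporary file and the `example_key` field below are placeholders standing in for pool_prompt_v1.json, whose actual contents are not shown in the snippet:

```python
import json
import os
import tempfile

# A temporary JSON file stands in for pool_prompt_v1.json so the
# example runs anywhere; the key name is purely illustrative.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump({"example_key": "example prompt"}, f)
    path = f.name

# Same pattern as in demo.ipynb: open the file and parse it as JSON.
with open(path) as f:
    prompt = json.load(f)

print(prompt["example_key"])  # → example prompt
os.unlink(path)
```

Note that `prompt = json.load(f)` must be indented under the `with` block, or the file handle is closed before the read.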
-
[ ] I checked the [documentation](https://docs.ragas.io/) and related resources and couldn't find an answer to my question.
**Your Question**
Getting Error: **AttributeError: 'Generation' object h…
-
[ ] I checked the [documentation](https://docs.ragas.io/) and related resources and couldn't find an answer to my question.
**Your Question**
Requesting help. I started running ragas last week and…
-
@haileyschoelkopf @lintangsutawika @baberabb
The following is a list of TODOs to implement LLM-as-a-Judge in Eval-Harness:
**TLDR**
* Splits existing `evaluate` function into `classification_e…
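The proposed split could be sketched generically as below. Everything here (`score_with_judge`, `judge_fn`, the exact-match toy judge) is a hypothetical illustration, not the actual Eval-Harness API; a real judge would call an LLM instead of comparing strings:

```python
from typing import Callable, Iterable

def score_with_judge(
    predictions: Iterable[str],
    references: Iterable[str],
    judge_fn: Callable[[str, str], float],
) -> float:
    """Average the judge's 0-1 scores over (prediction, reference) pairs."""
    scores = [judge_fn(p, r) for p, r in zip(predictions, references)]
    return sum(scores) / len(scores) if scores else 0.0

# Toy judge: exact match stands in for an LLM-as-a-judge call.
def toy_judge(pred: str, ref: str) -> float:
    return 1.0 if pred.strip() == ref.strip() else 0.0

print(score_with_judge(["4", "Paris"], ["4", "London"], toy_judge))  # → 0.5
```

Factoring the judge out as a callable is one way to keep classification-style scoring and LLM-based scoring behind the same interface.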
-
[X] I have checked the [documentation](https://docs.ragas.io/) and related resources and couldn't resolve my bug. Yes, there are no recommendations on how to fix the API connection errors, especially …
-
**Is your feature request related to a problem? Please describe.**
Sometimes prompts and inputs result in unpredictable LLM behaviour, especially at higher temperatures. This means that both the LLM …
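The temperature effect described above can be illustrated with a toy stand-in for an LLM (everything below is a made-up model, not a real API): temperature 0 always picks the top candidate, while higher temperatures sample from a wider pool, so repeated calls diverge.

```python
import random

def toy_llm(prompt: str, temperature: float, rng: random.Random) -> str:
    """Toy stand-in for an LLM: higher temperature widens the sampling pool."""
    choices = ["yes", "no", "maybe", "unsure"]  # pretend ranked candidates
    if temperature == 0.0:
        return choices[0]  # greedy decoding: always the top candidate
    k = min(len(choices), 1 + int(temperature * len(choices)))
    return rng.choice(choices[:k])  # sample among the top-k candidates

rng = random.Random(0)
greedy = {toy_llm("q", 0.0, rng) for _ in range(10)}
sampled = {toy_llm("q", 1.0, rng) for _ in range(10)}
print(len(greedy))   # → 1: temperature 0 is deterministic here
print(len(sampled))  # usually > 1: sampling introduces variation
```

This is why evaluations that need reproducible outputs often pin temperature to 0 (or fix the sampling seed).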
-
I'm opening this issue to discuss what we think the "LLM task" framework should aim to be, and how we could incrementally get there.
## What we have today
Today, what we call the "task framewo…
-
### Bug Description
I want to get the metrics from RAGAS. Whenever I download the model locally on my EC2 instance, the metrics give me the answer, but when I use a SageMaker endpoint as the LLM it throws er…
-
**Describe the bug**
I am trying to run the code in [LLM Optimization with DirectML](https://github.com/microsoft/Olive/tree/d068cf963e0ec1c5d4ecf9b87c2a1e118fe2303f/examples/directml/llm). The `requ…