explodinggradients / ragas

Evaluation framework for your Retrieval Augmented Generation (RAG) pipelines
https://docs.ragas.io

Context Precision Example #1036

Open · MatthewSH opened this issue 2 weeks ago

MatthewSH commented 2 weeks ago

[x] I checked the documentation and related resources and couldn't find an answer to my question.

Your Question

I was reviewing the context precision example on the website and noticed that answer is provided as an input even though, as far as we can tell, it is not actively used by the context precision metric. We even see a couple of spots where the answer key is overwritten with the ground_truth key.

My question is: are the examples meant to show only the minimal, required arguments, or are they a dump of everything that can be provided?

I know that using answer is going to be deprecated soon and that, according to https://github.com/explodinggradients/ragas/blob/main/src/ragas/metrics/_context_precision.py#L117, it will fall back. I was just curious whether having both in the example is intended.

EDIT: Also, if the answer version is being deprecated soon, is the plan to switch all references from answer to ground_truth?

dosubot[bot] commented 2 weeks ago

Hello @MatthewSH! I'm a bot here to help you with any issues or questions you have. I can assist with bug fixes, answer inquiries, and guide you on becoming a contributor. How can I assist you today?

The examples for context precision are meant to show the minimal, required arguments, not a dump of everything that can be provided. The provided examples focus on the necessary inputs: question, context, and answer, and demonstrate how the context was useful in arriving at the given answer [1][2].
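For reference, here is a minimal sketch of such a call, assuming the 0.1-style ragas API (evaluate plus the prebuilt context_precision metric) and an LLM configured in the usual way (e.g. OPENAI_API_KEY in the environment); the sample row below is made up for illustration:

```python
from datasets import Dataset
from ragas import evaluate
from ragas.metrics import context_precision

# Hypothetical sample row; only the columns the metric needs.
data = {
    "question": ["When was the first Super Bowl played?"],
    "contexts": [[
        "The First AFL-NFL World Championship Game was played on January 15, 1967."
    ]],
    "ground_truth": ["The first Super Bowl was played on January 15, 1967."],
}

dataset = Dataset.from_dict(data)

# Requires an LLM to be configured (e.g. OPENAI_API_KEY in the environment).
result = evaluate(dataset, metrics=[context_precision])
print(result)
```

Whether answer also needs to appear alongside ground_truth during the deprecation window is what the fallback linked above is meant to handle.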

To continue talking to Dosu, mention @dosu.