explodinggradients / ragas

Evaluation framework for your Retrieval Augmented Generation (RAG) pipelines
https://docs.ragas.io
Apache License 2.0

cannot import name 'context_relevancy' from 'ragas.metrics' #1210

Open Zabih-khan opened 3 weeks ago

Zabih-khan commented 3 weeks ago

from tqdm import tqdm
import pandas as pd
from datasets import Dataset

from ragas.metrics.critique import harmfulness
from ragas import evaluate

from ragas.metrics import (
    answer_relevancy,
    faithfulness,
    context_recall,
    context_precision,
    context_relevancy,
    answer_correctness,
    answer_similarity
)

def create_ragas_dataset(rag_pipeline, eval_dataset):
  rag_dataset = []
  for row in tqdm(eval_dataset):
    answer = rag_pipeline.invoke({"question" : row["question"]})
    rag_dataset.append(
        {"question" : row["question"],
         "answer" : answer["response"].content,
         "contexts" : [context.page_content for context in answer["context"]],
         "ground_truths" : [row["ground_truth"]]
         }
    )
  rag_df = pd.DataFrame(rag_dataset)
  rag_eval_dataset = Dataset.from_pandas(rag_df)
  return rag_eval_dataset

def evaluate_ragas_dataset(ragas_dataset):
  result = evaluate(
    ragas_dataset,
    metrics=[
        context_precision,
        faithfulness,
        answer_relevancy,
        context_recall,
        context_relevancy,
        answer_correctness,
        answer_similarity
    ],
  )
  return result

When I run the above code, it gives me this error:

ImportError: cannot import name 'context_relevancy' from 'ragas.metrics' (c:\Users\khan\anaconda3\Lib\site-packages\ragas\metrics\__init__.py)
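
For reference, a quick, generic check (standard library plus a plain dir() call, nothing version-specific) to see which metric names the installed ragas actually exports:

    from importlib.metadata import version
    import ragas.metrics

    print("ragas", version("ragas"))
    # Public names exported by ragas.metrics in this installation;
    # if 'context_relevancy' is not listed, the import above is expected to fail.
    print(sorted(name for name in dir(ragas.metrics) if not name.startswith("_")))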

dosubot[bot] commented 3 weeks ago

Hey @Zabih-khan! I'm here to help you with any bugs, questions, or contributions. Let's tackle this issue together!

The ImportError is occurring because the ragas.metrics module no longer exports context_relevancy. Remove it from the import list; the other metrics you are importing (answer_relevancy, faithfulness, context_recall, context_precision, answer_correctness, answer_similarity) are still available.

Here's the corrected import statement:

from ragas.metrics import (
    answer_relevancy,
    faithfulness,
    context_recall,
    context_precision,
    answer_correctness,
    answer_similarity
)

And update the evaluate_ragas_dataset function accordingly:

def evaluate_ragas_dataset(ragas_dataset):
  result = evaluate(
    ragas_dataset,
    metrics=[
        context_precision,
        faithfulness,
        answer_relevancy,
        context_recall,
        answer_correctness,
        answer_similarity
    ],
  )
  return result

This should resolve the import error [1].
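
For completeness, a minimal sketch of wiring the two helpers together; it assumes, as in your snippet, that rag_pipeline returns a dict with "response" and "context" keys and that eval_dataset yields rows with "question" and "ground_truth" fields:

    # Hypothetical end-to-end run of the helpers defined above.
    ragas_dataset = create_ragas_dataset(rag_pipeline, eval_dataset)
    result = evaluate_ragas_dataset(ragas_dataset)

    print(result)                    # aggregate score per metric
    scores_df = result.to_pandas()   # per-sample scores, if your ragas version supports it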

To continue talking to Dosu, mention @dosu.

Zabih-khan commented 3 weeks ago

thank you

dsporcaro commented 2 weeks ago

Where did it go? I'm using a library (langevals) that references it: https://github.com/langwatch/langevals/blob/main/evaluators/ragas/langevals_ragas/context_relevancy.py

I also found some documentation on context relevancy as a metric:

https://docs.ragas.io/en/v0.1.6/concepts/metrics/context_relevancy.html
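
Until langevals drops that import, two possible stopgaps (sketches, not official recommendations): pin ragas to a release that still documents the metric (the v0.1.6 docs linked above describe it), or guard the import so code paths that reference it degrade gracefully:

    from ragas.metrics import answer_relevancy, context_precision

    # Compatibility guard: context_relevancy only exists in older ragas 0.1.x releases.
    try:
        from ragas.metrics import context_relevancy
    except ImportError:
        context_relevancy = None  # removed upstream

    # Build the metric list, skipping anything this ragas version no longer ships.
    metrics = [m for m in (context_precision, answer_relevancy, context_relevancy) if m is not None]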

jjmachan commented 1 week ago

@dsporcaro @Zabih-khan this metric has been removed in #1111, do check out the comments there for more info 🙂

dsporcaro commented 1 week ago

Thank you. This is helpful context!


Zabih-khan commented 1 week ago

Thank you all
