aws / fmeval

Foundation Model Evaluations Library
http://aws.github.io/fmeval
Apache License 2.0

feat: add answer relevance algo #295

Open xiaoyi-cheng opened 1 week ago

xiaoyi-cheng commented 1 week ago

Issue #, if available: Add answer relevance metric under the AnswerRelevance evaluation algorithm.

Description of changes:

  1. Default dataset unclear.

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

xiaoyi-cheng commented 9 hours ago

@polaschwoebel : The idea behind these is to group new metrics that take the same input signature for evaluate_sample into one new algorithm. The qa_accuracy metrics are computed from target_output and model_output, so model_input is not needed in that algorithm. Adding this metric to qa_accuracy would require changing its input signature.
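A minimal sketch of the signature distinction being described. The class and method bodies below are illustrative placeholders, not the actual fmeval implementations; only the grouping idea (one algorithm per evaluate_sample input signature) comes from the comment above.

```python
# Hypothetical sketch: contrasting evaluate_sample input signatures.
# Class names mirror the discussion; scoring logic is a toy stand-in,
# not the real fmeval metric code.

from dataclasses import dataclass
from typing import List


@dataclass
class EvalScore:
    name: str
    value: float


class QAAccuracy:
    """Scores computed from target_output and model_output only;
    model_input does not appear in the signature."""

    def evaluate_sample(self, target_output: str, model_output: str) -> List[EvalScore]:
        # Toy exact-match score, just to illustrate the signature.
        score = 1.0 if target_output.strip() == model_output.strip() else 0.0
        return [EvalScore(name="exact_match", value=score)]


class AnswerRelevance:
    """Needs model_input as well as model_output, so it lives in its own
    algorithm class rather than being folded into QAAccuracy."""

    def evaluate_sample(self, model_input: str, model_output: str) -> List[EvalScore]:
        # Toy token-overlap score between question and answer.
        q = set(model_input.lower().split())
        a = set(model_output.lower().split())
        score = len(q & a) / len(q) if q else 0.0
        return [EvalScore(name="answer_relevance", value=score)]
```

Merging AnswerRelevance into QAAccuracy would force every caller of `QAAccuracy.evaluate_sample` to start passing model_input, which is the signature change the comment argues against.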