opea-project / GenAIEval

Evaluation, benchmark, and scorecard tooling, targeting performance (throughput and latency), accuracy on popular evaluation harnesses, safety, and hallucination
Apache License 2.0

doc: cruft at top of /evals/metrics/bleu/README.md #63

Open dbkinder opened 3 months ago

dbkinder commented 3 months ago

There's some material at the top of /evals/metrics/bleu/README.md that isn't needed in our environment and is interfering with document generation. The document should begin with its H1 heading, and everything before that heading should be deleted:

```markdown
# Metric Card for BLEU
```
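A fix like this can be scripted rather than done by hand. Below is a minimal sketch that truncates a markdown file so it starts at its first H1 heading; the helper name `strip_cruft_before_h1` is hypothetical, not part of GenAIEval, and the commented-out path is simply the README named in this issue.

```python
from pathlib import Path

def strip_cruft_before_h1(text: str) -> str:
    """Return markdown text starting at its first H1 heading.

    If no H1 is found, the text is returned unchanged. Note this naive
    scan would also match a '# ' line inside a fenced code block, so
    eyeball the result before committing it.
    """
    lines = text.splitlines(keepends=True)
    for i, line in enumerate(lines):
        if line.startswith("# "):
            return "".join(lines[i:])
    return text

# Illustrative usage against the README from this issue:
# readme = Path("evals/metrics/bleu/README.md")
# readme.write_text(strip_cruft_before_h1(readme.read_text()))
```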
chickenrae commented 1 month ago

Should this be closed?