EdinburghNLP / awesome-hallucination-detection

List of papers on hallucination detection in LLMs.
Apache License 2.0

add minicheck for hallucination evaluation #28

Closed Liyan06 closed 2 months ago

Liyan06 commented 2 months ago

Add MiniCheck as a fact-checking model.

See the fact-checking model performance leaderboard LLM-AggreFact. MiniCheck is the SOTA fact-checking model, significantly outperforming models such as SummaC and AlignScore.
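Fact-checking models like MiniCheck, SummaC, and AlignScore share a common interface: given a (document, claim) pair, they return a score indicating how well the document supports the claim. The sketch below is NOT MiniCheck — it is a deliberately naive lexical-overlap stand-in, included only to illustrate that input/output shape; the function name and threshold are illustrative assumptions.

```python
# Toy lexical-overlap baseline illustrating the (document, claim) -> score
# interface used by fact-checking models such as MiniCheck.
# This is NOT MiniCheck's method; real models use trained entailment scoring.

def support_score(document: str, claim: str) -> float:
    """Fraction of the claim's content words that appear in the document."""
    doc_words = set(document.lower().split())
    # Crude content-word filter: keep words longer than 3 characters.
    claim_words = [w for w in claim.lower().split() if len(w) > 3]
    if not claim_words:
        return 0.0
    return sum(w in doc_words for w in claim_words) / len(claim_words)

doc = "MiniCheck was added to the list of fact-checking models."
print(support_score(doc, "MiniCheck is a fact-checking model."))  # high overlap
print(support_score(doc, "MiniCheck translates French poetry."))  # low overlap
```

A real fact-checker replaces the overlap heuristic with a learned entailment model, but callers see the same pair-in, score-out contract.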

pminervini commented 2 months ago

Thank you a ton!! 🙂