citadel-ai / langcheck

Simple, Pythonic building blocks to evaluate LLM applications.
https://langcheck.readthedocs.io/en/latest/index.html
MIT License

Update prompts to output the chain-of-thought reasoning first #118

Closed: yosukehigashi closed this 4 months ago

yosukehigashi commented 4 months ago

Adds the phrase "Output your thought process first, and then provide your final answer." to the prompts (in all languages) to ensure that the LLM evaluators actually do chain-of-thought reasoning.
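The change described above can be sketched as a simple prompt-template modification. This is a hypothetical illustration, not langcheck's actual internal API; the `build_eval_prompt` function and the base prompt text are assumptions for demonstration only.

```python
# Hypothetical sketch of appending the chain-of-thought instruction to an
# evaluation prompt, as described in this PR. Not langcheck's real API.
COT_INSTRUCTION = (
    "Output your thought process first, and then provide your final answer."
)

def build_eval_prompt(base_prompt: str) -> str:
    """Append the CoT instruction so the LLM evaluator reasons before answering."""
    return f"{base_prompt}\n{COT_INSTRUCTION}"

# Example usage with a made-up base prompt:
prompt = build_eval_prompt(
    "Rate the fluency of the following text on a scale of 1 to 5."
)
print(prompt)
```

Placing the reasoning instruction in the prompt encourages the model to emit its rationale before the score, which typically makes the final answer easier to parse and more reliable.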