Arize-ai / phoenix

AI Observability & Evaluation
https://docs.arize.com/phoenix

add evals annotations to DSPy notebook #2299

Open axiomofjoy opened 6 months ago

axiomofjoy commented 6 months ago

DSPy provides its own set of evaluation metrics for evaluating compiled DSPy modules on dev sets, e.g., exact answer match and passage relevance. We can log these evaluation results to Phoenix as annotations via `log_evaluations`.
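A minimal sketch of what this could look like in the notebook. The span IDs, dev-set answers, and the exact-match scoring below are illustrative assumptions; `SpanEvaluations` and `Client.log_evaluations` are the Phoenix client API this issue refers to, and the actual call needs a running Phoenix instance, so it is shown commented out:

```python
import pandas as pd

# Hypothetical dev-set results: span IDs from Phoenix traces plus
# predicted and gold answers (placeholder data, not from the notebook).
results = pd.DataFrame(
    {
        "context.span_id": ["span-1", "span-2", "span-3"],
        "predicted": ["Paris", "42", "blue"],
        "gold": ["Paris", "43", "blue"],
    }
)

# Exact answer match, one of the DSPy-style metrics mentioned above.
results["score"] = (results["predicted"] == results["gold"]).astype(int)
results["label"] = results["score"].map({1: "correct", 0: "incorrect"})

# Phoenix expects the evaluations dataframe indexed by span ID.
evals_df = results.set_index("context.span_id")[["score", "label"]]

# Log to Phoenix as annotations (requires a running Phoenix instance):
# import phoenix as px
# from phoenix.trace import SpanEvaluations
# px.Client().log_evaluations(
#     SpanEvaluations(eval_name="exact_match", dataframe=evals_df)
# )
```

Once logged, the `exact_match` scores and labels would show up as annotations on the corresponding spans in the Phoenix UI.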

dosubot[bot] commented 6 months ago

🤖

That sounds like a great addition to the DSPy notebook! Your proposed solution to add evals annotations via log_evaluations would be a valuable enhancement. We encourage you to go ahead and create a pull request with your changes. Your contribution will be much appreciated by the community. Thank you for your willingness to improve the repository!
