openai / evals

Evals is a framework for evaluating LLMs and LLM systems, and an open-source registry of benchmarks.

Tagged Release For 2.0.0 #1456

Closed michaelAlvarino closed 5 months ago

michaelAlvarino commented 5 months ago

Describe the bug

Hey all, I saw that there was a recent release to PyPI (https://pypi.org/project/evals/) for version 2.0.0. I was wondering if it would be easy to create a tagged commit on GitHub like you did for 1.0.3 (https://github.com/openai/evals/tags). I'm not sure how difficult it is to do in hindsight, but it does make it slightly easier to navigate and browse the code at a specific release.

Thanks!
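For reference, tagging a release in hindsight is straightforward with plain git: a maintainer annotates the commit that was published and pushes the tag. The sketch below is a hypothetical illustration run against a throwaway local repository (the repo, commit, and remote name are assumptions, not the actual evals release process):

```shell
set -eu

# Work in a disposable repo purely for demonstration.
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
git -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "release 2.0.0"

# Create an annotated tag on the release commit. In a real repo you
# would pass the SHA that was actually published to PyPI instead of HEAD.
git tag -a 2.0.0 -m "Release 2.0.0" HEAD

# Confirm the tag exists locally.
git tag --list

# A maintainer would then publish it, e.g.: git push origin 2.0.0
```

Once the tag is pushed, GitHub lists it under /tags and allows browsing the tree at exactly that release.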

To Reproduce

  1. Go to https://github.com/openai/evals/tags
  2. There is no tag for 2.0.0

Code snippets

N/A

OS

N/A

Python version

N/A

Library version

openai-evals v2.0.0 and openai-evals v2.0.0.post1

etr2460 commented 5 months ago

The release is tagged now, sorry for the miss! https://github.com/openai/evals/releases/tag/2.0.0