Closed: PhilipMay closed this issue 3 months ago
A quick workaround: comment out `from openai import OpenAI` in `lighteval/src/lighteval/metrics/llm_as_judge.py` and run `pip install -e .` again :)
Yes, sure. Thanks. This is a workaround and I have applied it as well. But nevertheless, we have a bug that should be fixed.
Either the `openai` package should be installed by default, or it should not be imported by default.
Hi! Yes, we need to add it to our optional dependencies with a check; this is already in the works in #173.
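For context, here is a minimal sketch of the kind of guarded import such a check could use. The names and the error message are illustrative only, not the actual change in #173:

```python
# Illustrative sketch only, not the actual #173 change: guard the optional
# openai dependency so that importing the metric module does not fail when
# the package is missing, and raise a helpful error only when it is used.
import importlib.util

_OPENAI_AVAILABLE = importlib.util.find_spec("openai") is not None


def _get_openai_client():
    """Return an OpenAI client, or fail with an actionable message."""
    if not _OPENAI_AVAILABLE:
        raise ImportError(
            "The llm-as-judge metric needs the `openai` package. "
            "Install it with `pip install openai` (or the matching lighteval extra)."
        )
    from openai import OpenAI  # imported lazily, only when actually needed

    return OpenAI()
```

With a guard like this, `import lighteval` works without `openai` installed, and the missing dependency only surfaces when someone actually runs the llm-as-judge metric.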
Hi! Since the llm-as-judge metric is an official metric, we will be adding `openai` as a required dependency. As Clémentine said, a PR has already been opened :)
This error is annoying. Who is even using closedai when there is Prometheus for evaluation?
As we now allow using transformers models for llm-as-judge, we no longer require `openai`.
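For anyone landing here later, here is a rough illustration of what judging with a local transformers model instead of the OpenAI client can look like. This is a sketch using the plain `transformers` pipeline API, not lighteval's actual judge code, and the checkpoint name is just a placeholder:

```python
# Rough illustration only, not lighteval's implementation: use a local
# transformers model as the judge so no OpenAI account or package is needed.
from transformers import pipeline

# Any instruction-tuned chat model can serve as the judge; this checkpoint is
# only an example.
judge = pipeline("text-generation", model="HuggingFaceH4/zephyr-7b-beta")

prompt = (
    "You are a strict grader. Question: What is 2 + 2?\n"
    "Answer given: 5\n"
    "Reply with a score from 1 to 10 and a one-sentence justification."
)

# Generate the judge's verdict; return_full_text=False drops the echoed prompt.
verdict = judge(prompt, max_new_tokens=64, return_full_text=False)[0]["generated_text"]
print(verdict)
```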
When I install lighteval from main and call this command:
I get this error:
This should not happen since I do not want to use anything from OpenAI.