huggingface / lighteval

Lighteval is your all-in-one toolkit for evaluating LLMs across multiple backends
MIT License

Error: `ModuleNotFoundError: No module named 'openai'`. #175

Closed PhilipMay closed 3 months ago

PhilipMay commented 6 months ago

When I install lighteval from main and run this command:

accelerate launch --num_processes=1 run_evals_accelerate.py \
    --model_args "vonjack/Phi-3-mini-4k-instruct-LLaMAfied" \
    --tasks tasks_examples/open_llm_leaderboard_tasks.txt \
    --override_batch_size 1 \
    --use_chat_template \
    --output_dir="./evals/"

I get this error:

Traceback (most recent call last):
  File "/users/philip/code/git/lighteval_main/run_evals_accelerate.py", line 29, in <module>
    from lighteval.main_accelerate import CACHE_DIR, main
  File "/users/philip/code/git/lighteval_main/src/lighteval/main_accelerate.py", line 31, in <module>
    from lighteval.evaluator import evaluate, make_results_table
  File "/users/philip/code/git/lighteval_main/src/lighteval/evaluator.py", line 32, in <module>
    from lighteval.logging.evaluation_tracker import EvaluationTracker
  File "/users/philip/code/git/lighteval_main/src/lighteval/logging/evaluation_tracker.py", line 37, in <module>
    from lighteval.logging.info_loggers import (
  File "/users/philip/code/git/lighteval_main/src/lighteval/logging/info_loggers.py", line 34, in <module>
    from lighteval.metrics import MetricCategory
  File "/users/philip/code/git/lighteval_main/src/lighteval/metrics/__init__.py", line 25, in <module>
    from lighteval.metrics.metrics import MetricCategory, Metrics
  File "/users/philip/code/git/lighteval_main/src/lighteval/metrics/metrics.py", line 34, in <module>
    from lighteval.metrics.metrics_sample import (
  File "/users/philip/code/git/lighteval_main/src/lighteval/metrics/metrics_sample.py", line 42, in <module>
    from lighteval.metrics.llm_as_judge import JudgeOpenAI
  File "/users/philip/code/git/lighteval_main/src/lighteval/metrics/llm_as_judge.py", line 30, in <module>
    from openai import OpenAI
ModuleNotFoundError: No module named 'openai'
Traceback (most recent call last):
  File "/users/philip/miniconda3/envs/lighteval_main/bin/accelerate", line 8, in <module>
    sys.exit(main())
  File "/users/philip/miniconda3/envs/lighteval_main/lib/python3.10/site-packages/accelerate/commands/accelerate_cli.py", line 46, in main
    args.func(args)
  File "/users/philip/miniconda3/envs/lighteval_main/lib/python3.10/site-packages/accelerate/commands/launch.py", line 1075, in launch_command
    simple_launcher(args)
  File "/users/philip/miniconda3/envs/lighteval_main/lib/python3.10/site-packages/accelerate/commands/launch.py", line 681, in simple_launcher
    raise subprocess.CalledProcessError(returncode=process.returncode, cmd=cmd)
subprocess.CalledProcessError: Command '['/users/philip/miniconda3/envs/lighteval_main/bin/python', 'run_evals_accelerate.py', '--model_args', 'vonjack/Phi-3-mini-4k-instruct-LLaMAfied', '--tasks', 'tasks_examples/open_llm_leaderboard_tasks.txt', '--override_batch_size', '1', '--use_chat_template', '--output_dir=./evals/']' returned non-zero exit status 1.

This should not happen since I do not want to use anything from OpenAI.

dgolchin commented 6 months ago

A quick workaround: comment out `from openai import OpenAI` in lighteval/src/lighteval/metrics/llm_as_judge.py and run `pip install -e .` again :)
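For reference, the import that fails is the last line of the traceback above; the temporary edit just neutralizes that single line (a sketch of the relevant excerpt, not the whole file):

# src/lighteval/metrics/llm_as_judge.py (excerpt)
# Temporary workaround: comment out the unconditional import so the module
# can be loaded without the openai package. Any code path that actually uses
# the OpenAI judge will still fail until openai is installed.
# from openai import OpenAI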

PhilipMay commented 6 months ago

> A quick workaround: comment out `from openai import OpenAI` in lighteval/src/lighteval/metrics/llm_as_judge.py and run `pip install -e .` again :)

Sure, thanks. This is a workaround and I applied it as well. Nevertheless, we have a bug here that should be fixed.

Either the openai package should be installed by default, or it should not be imported unconditionally.
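A guarded import would cover the second option; a minimal sketch, assuming llm_as_judge.py keeps a JudgeOpenAI class (the constructor signature below is a placeholder, not the real one):

# Sketch of a lazy/guarded import for src/lighteval/metrics/llm_as_judge.py.
try:
    from openai import OpenAI
except ImportError:
    OpenAI = None

class JudgeOpenAI:
    def __init__(self, model: str):
        # Only fail when the judge is actually used, not at import time.
        if OpenAI is None:
            raise ImportError(
                "The llm-as-judge metric requires the `openai` package. "
                "Install it with `pip install openai`."
            )
        self.client = OpenAI()
        self.model = model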

clefourrier commented 6 months ago

Hi! Yes, we need to add it to our optional dependencies with an availability check; this is already in the works in #173.
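Such a check could look like the availability helpers used elsewhere in the Hugging Face ecosystem; a rough sketch only, since the helper name and error message here are assumptions, not necessarily what #173 implements:

# Hypothetical availability check; the exact name and location may differ in #173.
import importlib.util

def is_openai_available() -> bool:
    return importlib.util.find_spec("openai") is not None

# Code that needs the judge could then fail early with a clear message:
if not is_openai_available():
    raise ImportError("Install the `openai` package to use the llm-as-judge metric.")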

NathanHB commented 6 months ago

Hi! Since the llm-as-judge metric is an official metric, we will be adding openai as a required dependency. As Clémentine said, a PR has already been opened :)

Bachstelze commented 6 months ago

This error is annoying. Who is even using closedai when there is Prometheus for evaluation?

NathanHB commented 3 months ago

As we now allow using transformers models for llm-as-judge, we no longer require openai.
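For anyone looking for the open-model route, a rough illustration of the idea using plain transformers; this is not lighteval's actual judge API, and the model name and prompt format are placeholders:

# Illustration only: scoring an answer with a local judge model via transformers.
from transformers import pipeline

judge = pipeline("text-generation", model="HuggingFaceH4/zephyr-7b-beta")

prompt = (
    "You are a strict grader. Rate the following answer from 1 to 10.\n"
    "Question: What is the capital of France?\n"
    "Answer: Paris.\n"
    "Score:"
)
output = judge(prompt, max_new_tokens=8, do_sample=False)
print(output[0]["generated_text"])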