meta-llama / llama-models

Utilities intended for use with Llama models.

name 'ToolPromptFormat' is not defined #133

Closed. AbiriAmir closed this issue 2 months ago.

AbiriAmir commented 2 months ago

I successfully downloaded the Meta-Llama3.1-8B-Instruct model and now want to run the example script:

CHECKPOINT_DIR=~/.llama/checkpoints/Meta-Llama3.1-8B-Instruct
PYTHONPATH=$(git rev-parse --show-toplevel) torchrun models/scripts/example_chat_completion.py $CHECKPOINT_DIR

But it fails with this traceback:

W0904 15:12:36.514000 8660242240 torch/distributed/elastic/multiprocessing/redirects.py:28] NOTE: Redirects are currently not supported in Windows or MacOs.
Traceback (most recent call last):
  File "/Users/amir/Projects/llama/llama-models/models/scripts/example_chat_completion.py", line 23, in <module>
    from models.llama3.reference_impl.generation import Llama
  File "/Users/amir/Projects/llama/llama-models/models/llama3/reference_impl/generation.py", line 62, in <module>
    class Llama:
  File "/Users/amir/Projects/llama/llama-models/models/llama3/reference_impl/generation.py", line 285, in Llama
    tool_prompt_format: ToolPromptFormat = ToolPromptFormat.json,
                                           ^^^^^^^^^^^^^^^^
NameError: name 'ToolPromptFormat' is not defined
E0904 15:12:37.694000 8660242240 torch/distributed/elastic/multiprocessing/api.py:833] failed (exitcode: 1) local_rank: 0 (pid: 97606) of binary: /Users/amir/Projects/python/llama/bin/python3.12
Traceback (most recent call last):
  File "/Users/amir/Projects/python/llama/bin/torchrun", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/Users/amir/Projects/python/llama/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 348, in wrapper
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/Users/amir/Projects/python/llama/lib/python3.12/site-packages/torch/distributed/run.py", line 901, in main
    run(args)
  File "/Users/amir/Projects/python/llama/lib/python3.12/site-packages/torch/distributed/run.py", line 892, in run
    elastic_launch(
  File "/Users/amir/Projects/python/llama/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 133, in __call__
    return launch_agent(self._config, self._entrypoint, list(args))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/amir/Projects/python/llama/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 264, in launch_agent
    raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
models/scripts/example_chat_completion.py FAILED
------------------------------------------------------------
Failures:
  <NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
  time      : 2024-09-04_15:12:37
  host      : 1.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.ip6.arpa
  rank      : 0 (local_rank: 0)
  exitcode  : 1 (pid: 97606)
  error_file: <N/A>
  traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html

I would appreciate your help.

ashwinb commented 2 months ago

Sorry, this is a bug we introduced in our updates yesterday. Fix incoming.

ashwinb commented 2 months ago

Fixed by https://github.com/meta-llama/llama-models/commit/98550393ac2bc72910426f903ed9a5068e3319b8

Sorry about this.