guidance-ai / guidance

A guidance language for controlling large language models.
MIT License

[Bug] Latest Transformers disagrees with GPT2 on MacOS-ARM #965

Open riedgar-ms opened 1 month ago

riedgar-ms commented 1 month ago

The bug

It appears that the latest transformers releases (4.43.*) do not play nicely with GPT2 on MacOS-ARM. This shows up in our PR Gate with the following errors:

FAILED tests/model_integration/library/test_gen.py::test_stop_list_side_effect - RuntimeError: probability tensor contains either `inf`, `nan` or element < 0
FAILED tests/model_integration/library/test_gen.py::test_unicode - RuntimeError: probability tensor contains either `inf`, `nan` or element < 0
FAILED tests/model_integration/library/test_gen.py::test_pattern_kleene - AssertionError: assert False
 +  where False = <built-in method startswith of str object at 0x11ebb6c70>('!!!!!!!!!!')
 +    where <built-in method startswith of str object at 0x11ebb6c70> = 'stic!!!!!!!!!!'.startswith
FAILED tests/model_integration/library/test_gen.py::test_tool_call - Exception: Attempted to run a transformers model past its maximum context window size of 1024!

I have been working around this with version exclusions in setup.py, but it would be good if someone with a MacOS-ARM machine could dig into it.
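
For context, the stopgap is a platform-conditional pin along these lines (illustrative only; the exact bounds and dependency list in setup.py may differ):

```python
# Sketch of a setup.py workaround using PEP 508 environment markers.
# Cap transformers below 4.43 on ARM macOS until the GPT2 issue is understood,
# while leaving other platforms free to take the latest release.
install_requires = [
    "transformers<4.43; platform_system == 'Darwin' and platform_machine == 'arm64'",
    "transformers>=4.41; platform_system != 'Darwin' or platform_machine != 'arm64'",
]
```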

To Reproduce

See e.g.: https://github.com/guidance-ai/guidance/actions/runs/10061999864/job/27815616390
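
A minimal local sketch for anyone trying this outside CI (assumes torch with MPS support and the gpt2 checkpoint from the Hub; the failing tests go through guidance's Transformers wrapper, so this only approximates them):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Approximate what the failing tests exercise: sample from gpt2 on MPS and
# watch for "probability tensor contains either `inf`, `nan` or element < 0".
device = "mps" if torch.backends.mps.is_available() else "cpu"
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").to(device)

inputs = tokenizer("Repeat this. Repeat this. Repeat this.", return_tensors="pt").to(device)
out = model.generate(
    **inputs,
    do_sample=True,
    max_new_tokens=20,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(out[0]))
```

If that generates cleanly, the next step would be the actual tests, e.g. `python -m pytest tests/model_integration/library/test_gen.py -k "test_stop_list_side_effect or test_unicode"`.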

System info (please complete the following information):

riedgar-ms commented 1 month ago

@Harsha-Nori can't reproduce... @hudson-ai or @nopdive, could you try on your machines, please?
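
To make the runs comparable, pasting something like this alongside the result would help fill in the "System info" section above (a small sketch, nothing guidance-specific):

```python
import platform
import torch
import transformers

# Versions and hardware details worth comparing across machines.
print("platform     :", platform.platform())
print("machine      :", platform.machine())
print("python       :", platform.python_version())
print("torch        :", torch.__version__)
print("transformers :", transformers.__version__)
print("mps available:", torch.backends.mps.is_available())
```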

hudson-ai commented 1 month ago

Cannot repro on Sonoma with an M3 Max and transformers==4.43.1...