It appears that the latest versions of transformers (4.43.*) do not play nicely with GPT2 on MacOS-ARM. This is seen in our PR Gate, with errors:
FAILED tests/model_integration/library/test_gen.py::test_stop_list_side_effect - RuntimeError: probability tensor contains either `inf`, `nan` or element < 0
FAILED tests/model_integration/library/test_gen.py::test_unicode - RuntimeError: probability tensor contains either `inf`, `nan` or element < 0
FAILED tests/model_integration/library/test_gen.py::test_pattern_kleene - AssertionError: assert False
+ where False = <built-in method startswith of str object at 0x11ebb6c70>('!!!!!!!!!!')
+ where <built-in method startswith of str object at 0x11ebb6c70> = 'stic!!!!!!!!!!'.startswith
FAILED tests/model_integration/library/test_gen.py::test_tool_call - Exception: Attempted to run a transformers model past its maximum context window size of 1024!
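For context on the first two failures: the "probability tensor contains either `inf`, `nan` or element < 0" message is raised by sampling-time validation of the softmaxed logits. A minimal sketch of the kind of check that trips here (illustrative only, not torch's actual implementation) — on the affected platform the model's logits presumably come back as NaN/inf before sampling:

```python
import math

def validate_probs(probs):
    """Illustrative sketch of the validation that rejects a probability
    vector during sampling. Not torch's real code: torch.multinomial does
    an equivalent check on the tensor and raises this same message."""
    for p in probs:
        if math.isinf(p) or math.isnan(p) or p < 0:
            raise RuntimeError(
                "probability tensor contains either `inf`, `nan` or element < 0"
            )
    return True
```

So the likely root cause is NaN/inf values appearing in GPT2's output logits on MacOS-ARM under transformers 4.43.*, which then fail this check at generation time.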
I have been working around this with exclusions in setup.py, but it would be good if someone with a MacOS-ARM machine could dig into this.
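For anyone picking this up, the workaround amounts to pinning transformers below 4.43 only on Apple Silicon. A hedged sketch of how such a platform-conditional pin could be expressed (the exact version cap and helper name are assumptions, not the actual setup.py contents):

```python
import platform

def transformers_requirement() -> str:
    """Return a requirement string for transformers, capping the version
    on Apple Silicon macs. Hypothetical helper mirroring the setup.py
    exclusions described above; the <4.43 cap is an assumption."""
    if platform.system() == "Darwin" and platform.machine() == "arm64":
        return "transformers<4.43"
    return "transformers"
```

The same effect could be had declaratively with a PEP 508 environment marker, e.g. `transformers<4.43; platform_system == "Darwin" and platform_machine == "arm64"`.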
To Reproduce
See e.g.: https://github.com/guidance-ai/guidance/actions/runs/10061999864/job/27815616390
System info (please complete the following information):