When using transformers version 4.35.2, I got this error (and a similar error when quantizing llama). It seems the code targets transformers <=4.33.3; can you add support for newer transformers versions?
OmniQuant-main/models/int_falcon_layer.py", line 148, in forward
query_layer, key_layer = self.maybe_rotary(query_layer, key_layer, past_kv_length)
File "lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
TypeError: FalconRotaryEmbedding.forward() missing 1 required positional argument: 'position_ids'
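For context, the error happens because newer transformers releases added a required `position_ids` argument to `FalconRotaryEmbedding.forward`, while `int_falcon_layer.py` still calls it with the older three-argument form. A minimal sketch of a version-agnostic workaround (my own suggestion, not the repo's actual fix; the helper name `call_maybe_rotary` and the argument order for the newer signature are assumptions) would inspect the installed signature and dispatch accordingly:

```python
import inspect

def call_maybe_rotary(rotary, query, key, past_kv_length, position_ids):
    """Call Falcon's rotary embedding across transformers versions.

    In transformers <=4.33 the forward takes (query, key,
    past_key_values_length); newer versions also require a
    position_ids argument, which triggers the TypeError above
    when the old three-argument call is used.
    """
    params = inspect.signature(rotary.forward).parameters
    if "position_ids" in params:
        # Newer signature: pass position_ids explicitly (order assumed).
        return rotary(query, key, past_kv_length, position_ids)
    # Older signature: position_ids is simply dropped.
    return rotary(query, key, past_kv_length)
```

The quantized layer could then call `call_maybe_rotary(self.maybe_rotary, query_layer, key_layer, past_kv_length, position_ids)` instead of invoking `self.maybe_rotary` directly, so one code path works on both old and new transformers.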