jy-yuan / KIVI

KIVI: A Tuning-Free Asymmetric 2bit Quantization for KV Cache
https://arxiv.org/abs/2402.02750
MIT License

LlamaConfig.attention_dropout does not exist in transformers==4.35.2 #3

Closed: RalphMao closed this issue 1 month ago

RalphMao commented 1 month ago
KIVI/models/llama_kivi.py", line 25, in __init__
    self.attention_dropout = config.attention_dropout
  File "/usr/local/lib/python3.10/dist-packages/transformers/configuration_utils.py", line 262, in __getattribute__
    return super().__getattribute__(key)
AttributeError: 'LlamaConfig' object has no attribute 'attention_dropout'
jy-yuan commented 1 month ago

Sorry for the outdated dependency. The required transformers version is 4.36.2. We have updated the repository accordingly. Thanks!
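
Beyond pinning transformers==4.36.2 as the maintainer suggests, a minimal sketch of a version-tolerant alternative is to read the attribute with a fallback, since `LlamaConfig` only gained `attention_dropout` in later transformers releases (its default there is 0.0). The names `get_attention_dropout` and `DummyConfig` below are illustrative, not part of the KIVI codebase:

```python
class DummyConfig:
    """Stand-in for an older LlamaConfig that lacks attention_dropout."""
    pass


def get_attention_dropout(config, default=0.0):
    # Fall back to the default when the config predates the attribute,
    # so the module also loads under transformers 4.35.2.
    return getattr(config, "attention_dropout", default)


print(get_attention_dropout(DummyConfig()))  # → 0.0
```

Inside the model's `__init__`, the failing line would then read `self.attention_dropout = getattr(config, "attention_dropout", 0.0)` instead of accessing the attribute directly.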