Open BenjaminBossan opened 5 days ago
This import is failing with the latest transformers version:
https://github.com/NetEase-FuXi/EETQ/blob/81e0b14d64088d58ef6acd2c8f3e788d59324407/python/eetq/models/base.py#L13
I checked how others solved this and it looks like in AWQ, this import could be removed by using `save_torch_state_dict` from `huggingface_hub` instead.
https://github.com/casper-hansen/AutoAWQ/compare/v0.2.6...v0.2.7#diff-068b6780bfd41edd049c82b641c984b3f7e5278b8ef412d29cdba6a62663704aR305-R310
We will fix the bug soon after we finish our current work.