casper-hansen / AutoAWQ

AutoAWQ implements the AWQ algorithm for 4-bit quantization with a 2x speedup during inference. Documentation:
https://casper-hansen.github.io/AutoAWQ/
MIT License
1.62k stars · 191 forks

Can't import awq #559

Open · Dujianhua1008 opened this issue 1 month ago

Dujianhua1008 commented 1 month ago

With the new version, I built it, but I can't import awq.

The error looks like this:

>>> import awq
Traceback (most recent call last):
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1586, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/transformers/models/auto/processing_auto.py", line 28, in <module>
    from ...image_processing_utils import ImageProcessingMixin
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/transformers/image_processing_utils.py", line 21, in <module>
    from .image_transforms import center_crop, normalize, rescale
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/transformers/image_transforms.py", line 22, in <module>
    from .image_utils import (
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/transformers/image_utils.py", line 58, in <module>
    from torchvision.transforms import InterpolationMode
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/torchvision/__init__.py", line 10, in <module>
    from torchvision import _meta_registrations, datasets, io, models, ops, transforms, utils  # usort:skip
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/torchvision/_meta_registrations.py", line 163, in <module>
    @torch.library.register_fake("torchvision::nms")
AttributeError: module 'torch.library' has no attribute 'register_fake'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/nas/djh/kernels/AutoAWQ/awq/__init__.py", line 2, in <module>
    from awq.models.auto import AutoAWQForCausalLM
  File "/nas/djh/kernels/AutoAWQ/awq/models/__init__.py", line 1, in <module>
    from .mpt import MptAWQForCausalLM
  File "/nas/djh/kernels/AutoAWQ/awq/models/mpt.py", line 1, in <module>
    from .base import BaseAWQForCausalLM
  File "/nas/djh/kernels/AutoAWQ/awq/models/base.py", line 35, in <module>
    from transformers import (
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1577, in __getattr__
    value = getattr(module, name)
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1576, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/nas/djh/miniconda3/envs/awq/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1588, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.auto.processing_auto because of the following error (look up to see its traceback):
module 'torch.library' has no attribute 'register_fake'
>>> 
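Editor's note on the traceback: the failing frame is torchvision calling torch.library.register_fake, an API that first appeared in torch 2.4. torchvision 0.19 assumes it exists, so importing a newer torchvision on top of an older torch raises exactly this AttributeError. A minimal sketch of a pairing check follows; the version table is an assumption taken from the torchvision release matrix, so verify it against the official compatibility table before relying on it:

```python
# Sketch: check whether pinned torch/torchvision versions pair up.
# KNOWN_PAIRS is an assumption based on the torchvision release matrix.
KNOWN_PAIRS = {
    "2.4": "0.19",  # torch 2.4 introduced torch.library.register_fake
    "2.3": "0.18",
    "2.2": "0.17",
}

def minor(version: str) -> str:
    """'2.3.1' or '0.19.0+cu118' -> '2.3' / '0.19'."""
    return ".".join(version.split("+", 1)[0].split(".")[:2])

def compatible(torch_version: str, torchvision_version: str) -> bool:
    """True if the two minor versions are a known matching pair."""
    return KNOWN_PAIRS.get(minor(torch_version)) == minor(torchvision_version)

print(compatible("2.3.1", "0.19.0+cu118"))  # torch too old for this torchvision
print(compatible("2.4.0", "0.19.0+cu118"))
```

If the check fails, reinstalling both packages from the same index in one pip command usually keeps them in lockstep.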
casper-hansen commented 1 month ago

I think you may have an issue with your torch installation. Try reinstalling torch.

Ali-Flt commented 1 month ago

Got the same issue. I installed torch with conda install pytorch torchvision torchaudio pytorch-cuda=11.8 -c pytorch -c nvidia, then cloned the AutoAWQ repo and installed it from source.

Dujianhua1008 commented 1 month ago

> Got the same issue. I installed torch using conda install pytorch torchvision torchaudio pytorch-cuda=11.8 -c pytorch -c nvidia and cloned the autoawq repo and installed it from source.

Hello, I didn't know which torch version suited me, so I pip installed autoawq and then uninstalled it; after that I built awq from source, and now it works.

NamburiSrinath commented 2 weeks ago

I reinstalled torch and the issue still persists. I tried installing from source as well, but couldn't get it working.

@casper-hansen any idea how to overcome this?

autoawq_kernels==0.0.7
torch==2.3.1
torchaudio==2.4.0+cu118
torchvision==0.19.0+cu118

I am using an EC2 instance (g5.24xlarge), i.e. 4x A10 GPUs.

NVIDIA-SMI 535.183.01 Driver Version: 535.183.01 CUDA Version: 12.2
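Editor's note on the pins above: torch==2.3.1 carries no +cu118 local tag while torchvision and torchaudio do, which suggests the wheels came from different indexes; torchvision 0.19 targets torch 2.4, which would reproduce the register_fake AttributeError. A hypothetical diagnostic (not part of AutoAWQ) that flags such mixed pins:

```python
# Sketch: flag mismatched wheel sources among the pins above by comparing
# their local version tags (the part after "+", e.g. "cu118").
pins = {
    "torch": "2.3.1",             # no +cuXXX tag: wheel from a different index
    "torchaudio": "2.4.0+cu118",
    "torchvision": "0.19.0+cu118",
}

def local_tag(version: str) -> str:
    """'0.19.0+cu118' -> 'cu118'; '2.3.1' -> ''."""
    return version.split("+", 1)[1] if "+" in version else ""

tags = {name: local_tag(v) for name, v in pins.items()}
mixed_sources = len(set(tags.values())) > 1  # True here: '' vs 'cu118'
print(tags, mixed_sources)
```

Since torchvision 0.19 and torchaudio 2.4 both target torch 2.4, aligning all three from the same cu118 index (e.g. upgrading torch to a 2.4.x cu118 wheel) is the likely fix, though the exact wheel availability should be checked against the official matrix.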

Ali-Flt commented 2 weeks ago

@NamburiSrinath try this:

  1. Create a new Python env
  2. Install torch with pip
  3. Install AutoAWQ from source, but with the --no-build-isolation flag