huggingface / optimum-quanto

A pytorch quantization backend for optimum
Apache License 2.0

Verify extension behaviour in Google Colab #206

Open dacorvo opened 4 months ago

dacorvo commented 4 months ago

@kechan reported compilation failures when using quanto in Google Colab, both on CPU and GPU.

itsayellow commented 3 months ago

A test case is the Colab notebook example from the Quanto documentation: https://huggingface.co/docs/transformers/main/quantization/quanto#quanto

"Try Quanto + transformers with this notebook!"

https://colab.research.google.com/drive/16CXfVmtdQvciSh9BopZUDYcmXCDpvgrT?usp=sharing

Running the notebook with no modifications, I execute the "install the dependencies" cell and then the "You can quantize a model" cell. When I try to run that second ("quantize") cell, it throws the following error:

---------------------------------------------------------------------------

NameError                                 Traceback (most recent call last)

[/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py](https://localhost:8080/#) in _get_module(self, module_name)
   1551         try:
-> 1552             return importlib.import_module("." + module_name, self.__name__)
   1553         except Exception as e:

15 frames

NameError: name 'torch' is not defined

The above exception was the direct cause of the following exception:

RuntimeError                              Traceback (most recent call last)

[/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py](https://localhost:8080/#) in _get_module(self, module_name)
   1552             return importlib.import_module("." + module_name, self.__name__)
   1553         except Exception as e:
-> 1554             raise RuntimeError(
   1555                 f"Failed to import {self.__name__}.{module_name} because of the following error (look up to see its"
   1556                 f" traceback):\n{e}"

RuntimeError: Failed to import transformers.integrations.bitsandbytes because of the following error (look up to see its traceback):
name 'torch' is not defined

It seems like some sort of dynamic module loading is failing during the `model = AutoModelForCausalLM.from_pretrained(...` line.

itsayellow commented 3 months ago

I get the same error when running equivalent code locally too, but I assumed the tutorial Colab would at least be a clean environment.

CavidanZ commented 3 months ago

For me, running `print(model.transformer.h[0].self_attention.dense.weight)` raises `AttributeError: 'WhisperForCausalLM' object has no attribute 'transformer'`.
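That line from the notebook assumes a GPT-style layout (`model.transformer.h[...]`), which Whisper does not have; its decoder layers live under a different attribute path. Rather than guessing the path per architecture, a small sketch like this lists the `Linear` modules by their dotted names (the helper name `linear_layer_names` is mine, not from the notebook):

```python
# List the dotted names of all Linear modules (quanto's quantized linear
# layers subclass nn.Linear, so they are picked up too), instead of
# hard-coding an architecture-specific path like model.transformer.h[0].
import torch
from torch import nn


def linear_layer_names(model: nn.Module) -> list[str]:
    """Return dotted names of every nn.Linear (or subclass) in the model."""
    return [
        name
        for name, module in model.named_modules()
        if isinstance(module, nn.Linear)
    ]
```

Any returned name can then be resolved with `model.get_submodule(name)` to inspect its `.weight`, whatever the architecture.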

github-actions[bot] commented 2 months ago

This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

github-actions[bot] commented 1 month ago

This issue was closed because it has been stalled for 5 days with no activity.
