XLabs-AI / x-flux-comfyui

Apache License 2.0

Failed to Import - hqq_aten package not installed. HQQBackend.ATEN backend will not work unless you install the hqq_aten lib in hqq/kernels. #56

Closed F0xbite closed 2 months ago

F0xbite commented 2 months ago

I've followed the setup instructions, but the custom node fails to import. I couldn't find any info on the hqq_aten package or how to install it, and I'm not even certain it's the root issue. Any clues are appreciated.


hqq_aten package not installed. HQQBackend.ATEN backend will not work unless you install the hqq_aten lib in hqq/kernels.
Traceback (most recent call last):
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\import_utils.py", line 1603, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\generation\configuration_utils.py", line 48, in <module>
    from ..cache_utils import QuantizedCacheConfig
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\cache_utils.py", line 21, in <module>
    from hqq.core.quantize import Quantizer as HQQQuantizer
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\hqq\core\quantize.py", line 133, in <module>
    class HQQLinear(torch.nn.Module):
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\hqq\core\quantize.py", line 227, in HQQLinear
    def forward_pytorch_compile(self, x):
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\__init__.py", line 1705, in fn
    return compile(model,
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\__init__.py", line 1723, in compile
    return torch._dynamo.optimize(backend=backend, nopython=fullgraph, dynamic=dynamic, disable=disable)(model)
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\_dynamo\eval_frame.py", line 583, in optimize
    check_if_dynamo_supported()
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\_dynamo\eval_frame.py", line 535, in check_if_dynamo_supported
    raise RuntimeError("Windows not yet supported for torch.compile")
RuntimeError: Windows not yet supported for torch.compile

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\import_utils.py", line 1603, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\clip\modeling_clip.py", line 28, in <module>
    from ...modeling_utils import PreTrainedModel
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\modeling_utils.py", line 46, in <module>
    from .generation import GenerationConfig, GenerationMixin
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\import_utils.py", line 1593, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\import_utils.py", line 1605, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.generation.configuration_utils because of the following error (look up to see its traceback):
Windows not yet supported for torch.compile

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\comfyui\ComfyUI\nodes.py", line 1993, in load_custom_node
    module_spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\comfyui\ComfyUI\custom_nodes\x-flux-comfyui\__init__.py", line 1, in <module>
    from .nodes import NODE_CLASS_MAPPINGS, NODE_DISPLAY_NAME_MAPPINGS
  File "C:\comfyui\ComfyUI\custom_nodes\x-flux-comfyui\nodes.py", line 12, in <module>
    from .xflux.src.flux.util import (configs, load_ae, load_clip,
  File "C:\comfyui\ComfyUI\custom_nodes\x-flux-comfyui\xflux\src\flux\util.py", line 16, in <module>
    from .modules.conditioner import HFEmbedder
  File "C:\comfyui\ComfyUI\custom_nodes\x-flux-comfyui\xflux\src\flux\modules\conditioner.py", line 2, in <module>
    from transformers import (CLIPTextModel, CLIPTokenizer, T5EncoderModel,
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\import_utils.py", line 1594, in __getattr__
    value = getattr(module, name)
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\import_utils.py", line 1593, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\import_utils.py", line 1605, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.clip.modeling_clip because of the following error (look up to see its traceback):
Failed to import transformers.generation.configuration_utils because of the following error (look up to see its traceback):
Windows not yet supported for torch.compile

Cannot import C:\comfyui\ComfyUI\custom_nodes\x-flux-comfyui module for custom nodes: Failed to import transformers.models.clip.modeling_clip because of the following error (look up to see its traceback):
Failed to import transformers.generation.configuration_utils because of the following error (look up to see its traceback):
Windows not yet supported for torch.compile
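The key line in the traceback is `RuntimeError: Windows not yet supported for torch.compile`, raised while `transformers` imports `hqq`. One quick sanity check is confirming which interpreter (and which site-packages) ComfyUI is actually running under; a minimal sketch, not specific to this repo:

```python
import sys
import sysconfig

# Show which interpreter is running and where it resolves packages from.
# If this prints a system Python path (e.g. AppData\...\Python310) instead
# of ComfyUI's embedded python_embeded\python.exe, custom nodes are
# importing the wrong site-packages.
print("interpreter  :", sys.executable)
print("version      :", sys.version.split()[0])
print("site-packages:", sysconfig.get_paths()["purelib"])
```

Running this from inside ComfyUI (or pasting it at the top of `nodes.py` temporarily) makes an environment mismatch obvious at a glance.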
F0xbite commented 2 months ago

This was because of a broken ComfyUI install, my fault. I was using my system Python instead of the embedded one, so everything was basically wrong because of that. I reinstalled using the standalone Windows build and everything is fine now.
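For anyone hitting the same thing: with the standalone Windows build, both the launch and any dependency installs should go through the bundled interpreter, not the system one. Paths below are a sketch assuming the standalone bundle's default layout:

```shell
REM Install a custom node's requirements into the embedded interpreter,
REM not the system Python:
.\python_embeded\python.exe -m pip install -r .\ComfyUI\custom_nodes\x-flux-comfyui\requirements.txt

REM Launch ComfyUI with that same embedded interpreter:
.\python_embeded\python.exe -s .\ComfyUI\main.py --windows-standalone-build
```

Mixing the two interpreters is what produces import errors like the one above, since the system site-packages can carry versions of torch/transformers/hqq the node was never tested against.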