AIrjen / OneButtonPrompt

Failed to import into ComfyUI #216

Closed · F0xbite closed this 1 month ago

F0xbite commented 1 month ago

I tried my best to solve this myself by searching, but couldn't find any relevant info. Any ideas? Thanks in advance.

hqq_aten package not installed. HQQBackend.ATEN backend will not work unless you install the hqq_aten lib in hqq/kernels.
Traceback (most recent call last):
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\import_utils.py", line 1603, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\generation\configuration_utils.py", line 48, in <module>
    from ..cache_utils import QuantizedCacheConfig
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\cache_utils.py", line 21, in <module>
    from hqq.core.quantize import Quantizer as HQQQuantizer
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\hqq\core\quantize.py", line 133, in <module>
    class HQQLinear(torch.nn.Module):
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\hqq\core\quantize.py", line 227, in HQQLinear
    def forward_pytorch_compile(self, x):
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\__init__.py", line 1705, in fn
    return compile(model,
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\__init__.py", line 1723, in compile
    return torch._dynamo.optimize(backend=backend, nopython=fullgraph, dynamic=dynamic, disable=disable)(model)
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\_dynamo\eval_frame.py", line 583, in optimize
    check_if_dynamo_supported()
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\_dynamo\eval_frame.py", line 535, in check_if_dynamo_supported
    raise RuntimeError("Windows not yet supported for torch.compile")
RuntimeError: Windows not yet supported for torch.compile

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\import_utils.py", line 1603, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\t5\modeling_t5.py", line 37, in <module>
    from ...modeling_utils import PreTrainedModel
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\modeling_utils.py", line 46, in <module>
    from .generation import GenerationConfig, GenerationMixin
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\import_utils.py", line 1593, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\import_utils.py", line 1605, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.generation.configuration_utils because of the following error (look up to see its traceback):
Windows not yet supported for torch.compile

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\comfyui\ComfyUI\nodes.py", line 1993, in load_custom_node
    module_spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\comfyui\ComfyUI\custom_nodes\OneButtonPrompt\__init__.py", line 9, in <module>
    from .OneButtonPromptNodes import NODE_CLASS_MAPPINGS, NODE_DISPLAY_NAME_MAPPINGS
  File "C:\comfyui\ComfyUI\custom_nodes\OneButtonPrompt\OneButtonPromptNodes.py", line 13, in <module>
    from .build_dynamic_prompt import *
  File "C:\comfyui\ComfyUI\custom_nodes\OneButtonPrompt\build_dynamic_prompt.py", line 17, in <module>
    from .superprompter.superprompter import *
  File "C:\comfyui\ComfyUI\custom_nodes\OneButtonPrompt\superprompter\superprompter.py", line 5, in <module>
    from transformers import T5Tokenizer, T5ForConditionalGeneration
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\import_utils.py", line 1594, in __getattr__
    value = getattr(module, name)
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\import_utils.py", line 1593, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "C:\Users\Foxbite\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\import_utils.py", line 1605, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.t5.modeling_t5 because of the following error (look up to see its traceback):
Failed to import transformers.generation.configuration_utils because of the following error (look up to see its traceback):
Windows not yet supported for torch.compile

Cannot import C:\comfyui\ComfyUI\custom_nodes\OneButtonPrompt module for custom nodes: Failed to import transformers.models.t5.modeling_t5 because of the following error (look up to see its traceback):
Failed to import transformers.generation.configuration_utils because of the following error (look up to see its traceback):
Windows not yet supported for torch.compile
F0xbite commented 1 month ago

I figured it out. To put it simply, my ComfyUI install was completely broken: it was using my system Python instead of the embedded one. I reinstalled everything with the Windows standalone version, and everything is fine now.
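(For anyone landing here with the same symptom: the Windows standalone build of ComfyUI ships its own interpreter, the `python_embeded` folder in the portable layout, and its launch scripts run ComfyUI with that interpreter. Custom-node dependencies such as `transformers` therefore need to be installed with that interpreter's pip, roughly `python_embeded\python.exe -m pip install transformers`, rather than into the system Python shown in the traceback above; the exact path will depend on where the portable package was unpacked.)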