Open sjehan opened 2 months ago
The error message
module 'torch.nn' has no attribute 'RMSNorm'
appears to be a known incompatibility between the Transformers library and versions of PyTorch older than 2.4.0: torch.nn.RMSNorm was only added in PyTorch 2.4.0, so earlier builds cannot provide it.
As a solution, you can upgrade your PyTorch installation to version 2.4.0 by running the command:
pip install torch==2.4.0
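Before reinstalling, it is worth confirming which PyTorch build the failing environment actually loads. As a quick check (substitute your own python.exe path; the one below assumes the ComfyUI portable layout), run:
F:\ComfyUI_windows_portable\python_embeded\python.exe -c "import torch, torch.nn as nn; print(torch.__version__); print(hasattr(nn, 'RMSNorm'))"
If the second line prints False, that interpreter is still running a PyTorch older than 2.4.0, whatever other Python installations on the machine report.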
That was already the case.
PS F:\ComfyUI_windows_portable\python_embeded> python.exe -m pip install torch==2.4.0
Requirement already satisfied: torch==2.4.0 in f:\python312\lib\site-packages (2.4.0)
Requirement already satisfied: filelock in f:\python312\lib\site-packages (from torch==2.4.0) (3.15.4)
Requirement already satisfied: typing-extensions>=4.8.0 in f:\python312\lib\site-packages (from torch==2.4.0) (4.12.2)
Requirement already satisfied: sympy in f:\python312\lib\site-packages (from torch==2.4.0) (1.13.2)
Requirement already satisfied: networkx in f:\python312\lib\site-packages (from torch==2.4.0) (3.3)
Requirement already satisfied: jinja2 in f:\python312\lib\site-packages (from torch==2.4.0) (3.1.4)
Requirement already satisfied: fsspec in f:\python312\lib\site-packages (from torch==2.4.0) (2024.6.1)
Requirement already satisfied: setuptools in f:\python312\lib\site-packages (from torch==2.4.0) (73.0.1)
Requirement already satisfied: MarkupSafe>=2.0 in f:\python312\lib\site-packages (from jinja2->torch==2.4.0) (2.1.5)
Requirement already satisfied: mpmath<1.4,>=1.1.0 in f:\python312\lib\site-packages (from sympy->torch==2.4.0) (1.3.0)
PS F:\ComfyUI_windows_portable\python_embeded>
The Qwen2-VL code has only recently been integrated into the development version of Hugging Face Transformers (4.45.0.dev0), which is not yet available on PyPI. To pick it up, install Transformers from source with the following command:
pip install git+https://github.com/huggingface/transformers
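On a portable ComfyUI install, note that this has to be installed into the embedded interpreter rather than the system Python; assuming the portable layout shown above, that would be:
F:\ComfyUI_windows_portable\python_embeded\python.exe -m pip install git+https://github.com/huggingface/transformers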
I did that, from my embedded python folder:
F:\ComfyUI_windows_portable\python_embeded> python.exe -m pip install docopt
And after a reboot I still get the same error:
[!] error: subprocess-exited-with-error
[!]
[!] python setup.py egg_info did not run successfully.
[!] exit code: 1
[!]
[!] [6 lines of output]
[!] Traceback (most recent call last):
[!] File "
I've identified the issue with your command. When you run python.exe -m pip install torch==2.4.0, you are using the system Python interpreter rather than the interpreter in your current directory. This is why you are seeing the message indicating that torch==2.4.0 is already satisfied in f:\python312\Lib\site-packages instead of F:\ComfyUI_windows_portable\python_embeded\Lib\site-packages.
To ensure you are installing torch into the correct Python environment, you can try one of the following methods:
- Use the Python interpreter in the current directory: if you have a python.exe in your current directory, you can run it to install torch with the following command:
F:\ComfyUI_windows_portable\python_embeded\python.exe -m pip install torch==2.4.0
or
.\python.exe -m pip install torch==2.4.0
Note the .\ at the beginning, which indicates the use of the Python interpreter in the current directory.
- Set environment variables: if you want the python.exe in the command prompt to point to your portable Python environment, you can add the path to this environment to the system's PATH environment variable. This way, when you type python.exe, it will use your portable Python environment.
Either way, a quick check of which environment pip actually targets is shown below.
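For example (an illustrative check using the portable paths from this thread, not part of the node's own instructions), run the following from the F:\ComfyUI_windows_portable\python_embeded folder:
.\python.exe -m pip --version
.\python.exe -c "import sys, torch; print(sys.executable); print(torch.__version__, torch.__file__)"
The first command reports which site-packages pip installs into, and the second shows which interpreter runs and which torch build it imports; both should point at the python_embeded folder rather than f:\python312.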
Can the requirements be lowered? Installing the latest libraries may cause conflicts with other plugins
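For reference, the versions discussed above amount to roughly the following pins (an illustrative sketch only, not the node's actual requirements.txt):
torch>=2.4.0
transformers @ git+https://github.com/huggingface/transformers
Loosening them would presumably only be possible once a released Transformers version includes the Qwen2-VL code.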
I followed the exact instructions and the node import still failed. Please improve requirements.txt.
Cannot import F:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_Qwen2-VL-Instruct module for custom nodes: Failed to import transformers.models.qwen2_vl.modeling_qwen2_vl because of the following error (look up to see its traceback): Failed to import transformers.generation.utils because of the following error (look up to see its traceback): module 'torch.nn' has no attribute 'RMSNorm'
Efficiency Nodes: Attempting to add Control Net options to the 'HiRes-Fix Script' Node (comfyui_controlnet_aux add-on)...Success!
[rgthree] Loaded 42 fantastic nodes.
[rgthree] NOTE: Will NOT use rgthree's optimized recursive execution as ComfyUI has changed.
Traceback (most recent call last):
File "F:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\utils\import_utils.py", line 1659, in _get_module
return importlib.import_module("." + module_name, self.__name__)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "importlib\__init__.py", line 126, in import_module
File "", line 1204, in _gcd_import
File "", line 1176, in _find_and_load
File "", line 1147, in _find_and_load_unlocked
File "", line 690, in _load_unlocked
File "", line 940, in exec_module
File "", line 241, in _call_with_frames_removed
File "F:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\generation\utils.py", line 51, in
from ..pytorch_utils import isin_mps_friendly
File "F:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\pytorch_utils.py", line 27, in
ALL_LAYERNORM_LAYERS = [nn.LayerNorm, nn.RMSNorm]
^^^^^^^^^^
AttributeError: module 'torch.nn' has no attribute 'RMSNorm'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "F:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\utils\import_utils.py", line 1659, in _get_module
return importlib.import_module("." + module_name, self.__name__)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "importlib\__init__.py", line 126, in import_module
File "", line 1204, in _gcd_import
File "", line 1176, in _find_and_load
File "", line 1147, in _find_and_load_unlocked
File "", line 690, in _load_unlocked
File "", line 940, in exec_module
File "", line 241, in _call_with_frames_removed
File "F:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\models\blip\modeling_blip.py", line 29, in
from ...modeling_utils import PreTrainedModel
File "F:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 46, in
from .generation import GenerationConfig, GenerationMixin
File "", line 1229, in _handle_fromlist
File "F:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\utils\import_utils.py", line 1649, in getattr__
module = self._get_module(self._class_to_module[name])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\utils\import_utils.py", line 1661, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
module 'torch.nn' has no attribute 'RMSNorm'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "F:\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1993, in load_custom_node
module_spec.loader.exec_module(module)
File "", line 940, in exec_module
File "", line 241, in _call_with_frames_removed
File "F:\ComfyUI_windows_portable\ComfyUI\custom_nodes\was-node-suite-comfyui__init.py", line 1, in
from .WAS_Node_Suite import NODE_CLASS_MAPPINGS
File "F:\ComfyUI_windows_portable\ComfyUI\custom_nodes\was-node-suite-comfyui\WAS_Node_Suite.py", line 2412, in
from transformers import BlipProcessor, BlipForConditionalGeneration, BlipForQuestionAnswering
File "", line 1229, in _handle_fromlist
File "F:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\utils\import_utils.py", line 1650, in getattr
value = getattr(module, name)
^^^^^^^^^^^^^^^^^^^^^
File "F:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\utils\import_utils.py", line 1649, in getattr__
module = self._get_module(self._class_to_module[name])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\utils\import_utils.py", line 1661, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.models.blip.modeling_blip because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
module 'torch.nn' has no attribute 'RMSNorm'