unclemusclez opened this issue 1 week ago
same here
[START] Security scan
[DONE] Security scan
ComfyUI startup time: 2024-06-25 12:52:59.377751
Platform: Linux
Python version: 3.12.3 | packaged by Anaconda, Inc. | (main, May 6 2024, 19:46:43) [GCC 11.2.0]
Python executable: /root/miniconda3/bin/python
** Log path: /home/wuxm/code/ComfyUI/comfyui.log
Prestartup times for custom nodes:
  0.3 seconds: /home/wuxm/code/ComfyUI/custom_nodes/ComfyUI-Manager
Total VRAM 16039 MB, total RAM 31916 MB
pytorch version: 2.5.0.dev20240623+cu124
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 4080 SUPER : cudaMallocAsync
Using pytorch cross attention
/root/miniconda3/lib/python3.12/site-packages/kornia/feature/lightglue.py:44: FutureWarning: torch.cuda.amp.custom_fwd(args...) is deprecated. Please use torch.amp.custom_fwd(args..., device_type='cuda') instead.
  @torch.cuda.amp.custom_fwd(cast_inputs=torch.float32)
/root/miniconda3/lib/python3.12/site-packages/diffusers/models/transformers/transformer_2d.py:34: FutureWarning: Transformer2DModelOutput is deprecated and will be removed in version 1.0.0. Importing Transformer2DModelOutput from diffusers.models.transformer_2d is deprecated and this will be removed in a future version. Please use from diffusers.models.modeling_outputs import Transformer2DModelOutput instead.
  deprecate("Transformer2DModelOutput", "1.0.0", deprecation_message)
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
Traceback (most recent call last):
  File "/home/wuxm/code/ComfyUI/nodes.py", line 1906, in load_custom_node
    module_spec.loader.exec_module(module)
  File "
Cannot import /home/wuxm/code/ComfyUI/custom_nodes/ComfyUI-3D-Pack module for custom nodes: Cannot subclass _TensorBase directly
Total VRAM 16039 MB, total RAM 31916 MB
pytorch version: 2.5.0.dev20240623+cu124
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 4080 SUPER : cudaMallocAsync
Import times for custom nodes:
  0.0 seconds: /home/wuxm/code/ComfyUI/custom_nodes/websocket_image_save.py
  0.0 seconds: /home/wuxm/code/ComfyUI/custom_nodes/ComfyUI_IPAdapter_plus
  0.0 seconds: /home/wuxm/code/ComfyUI/custom_nodes/ComfyUI-AnimateDiff-Evolved
  0.0 seconds: /home/wuxm/code/ComfyUI/custom_nodes/ComfyUI-Manager
  0.0 seconds: /home/wuxm/code/ComfyUI/custom_nodes/ComfyUI-VideoHelperSuite
  0.1 seconds: /home/wuxm/code/ComfyUI/custom_nodes/ComfyUI-KJNodes
  0.4 seconds (IMPORT FAILED): /home/wuxm/code/ComfyUI/custom_nodes/ComfyUI-3D-Pack
Starting server
To see the GUI go to: http://172.17.0.2:8188
FETCH DATA from: /home/wuxm/code/ComfyUI/custom_nodes/ComfyUI-Manager/extension-node-map.json [DONE]
Any update on this?
@meanmee you may need to compile the diff-gaussian-rasterization submodule of https://github.com/graphdeco-inria/gaussian-splatting, at the pinned commit: https://github.com/graphdeco-inria/diff-gaussian-rasterization/tree/59f5f77e3ddbac3ed9db93ec2cfe99ed6c5d121d
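For reference, building that rasterizer from source so its CUDA extension matches your locally installed torch usually looks something like the sketch below. This is only a sketch: the commit hash comes from the link above, but the clone location and the assumption that `pip` points at the same environment ComfyUI runs in are mine.

```shell
# Sketch: compile diff-gaussian-rasterization against the local torch install.
# --recursive pulls in its third-party submodules; the commit matches the
# pinned tree linked above. Run this inside the Python env ComfyUI uses.
git clone --recursive https://github.com/graphdeco-inria/diff-gaussian-rasterization
cd diff-gaussian-rasterization
git checkout 59f5f77e3ddbac3ed9db93ec2cfe99ed6c5d121d
git submodule update --init --recursive

# Build and install the CUDA extension (requires a CUDA toolkit matching torch)
pip install .
```

If the build succeeds but the import error persists, the extension was likely compiled against a different torch ABI than the one ComfyUI loads; rebuilding after activating the correct environment is the usual fix.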
I'm having this same issue, but on the ROCm platform; https://github.com/ROCm/HIP-CPU/issues/60 may be related.
ComfyUI on WSL2 with ROCm