kijai / ComfyUI-CogVideoXWrapper


22GB VRAM but still got an Out of Memory error from ToraEncodeTrajectory #183

Closed. shirubei closed this issue 3 weeks ago.

shirubei commented 4 weeks ago

Hello, I have 22GB VRAM but still got an Out of Memory error.

I have these settings (sketched roughly in code below):
- fp8_transformer: enabled
- enable_sequential_cpu_offload: true
- enable_tiling (on the image encode node): true
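Roughly, and only as an illustration, those options map onto the following diffusers-level calls. This is a hedged sketch: the model id and bf16 dtype come from the attached workflow, `enable_sequential_cpu_offload()` and `enable_tiling()` are standard diffusers APIs, and the assumption that the wrapper's toggles correspond to these calls (plus its own fp8 weight patching) is an editor's guess, not taken from the wrapper's code.

```python
import torch
from diffusers import CogVideoXImageToVideoPipeline

# Model id and bf16 dtype as in the attached workflow (DownloadAndLoadCogVideoModel).
pipe = CogVideoXImageToVideoPipeline.from_pretrained(
    "THUDM/CogVideoX-5b-I2V", torch_dtype=torch.bfloat16
)

# enable_sequential_cpu_offload: keep weights in system RAM and move submodules
# to the GPU one at a time; slow, but the weights barely touch VRAM.
pipe.enable_sequential_cpu_offload()

# enable_tiling (image encode node): let the CogVideoX VAE encode/decode in
# spatial tiles instead of processing the full 720x480 frames in one pass.
pipe.vae.enable_tiling()

# fp8_transformer: the wrapper stores the transformer weights in float8 and
# upcasts per layer during the forward pass; that is wrapper-specific patching
# rather than a single diffusers call, so it is only noted here as a comment.
```

The traceback below shows the OOM happening in a separate VAE encode inside ToraEncodeTrajectory, which is what the workaround later in the thread targets.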

Any suggestion is appreciated.

# ComfyUI Error Report

## Error Details

## System Information
- **ComfyUI Version:** v0.2.2-80-g8733191
- **Arguments:** ComfyUI\main.py --windows-standalone-build
- **OS:** nt
- **Python Version:** 3.11.6 (tags/v3.11.6:8b6ee5b, Oct  2 2023, 14:57:12) [MSC v.1935 64 bit (AMD64)]
- **Embedded Python:** true
- **PyTorch Version:** 2.4.1+cu124
## Devices

- **Name:** cuda:0 NVIDIA GeForce RTX 2080 Ti : cudaMallocAsync
  - **Type:** cuda
  - **VRAM Total:** 23621861376
  - **VRAM Free:** 20508306016
  - **Torch VRAM Total:** 1879048192
  - **Torch VRAM Free:** 115599968

## Logs

2024-10-25 23:48:01,561 - root - INFO - Total VRAM 22528 MB, total RAM 65447 MB
2024-10-25 23:48:01,561 - root - INFO - pytorch version: 2.4.1+cu124
2024-10-25 23:48:01,568 - root - INFO - Set vram state to: NORMAL_VRAM
2024-10-25 23:48:01,569 - root - INFO - Device: cuda:0 NVIDIA GeForce RTX 2080 Ti : cudaMallocAsync
2024-10-25 23:48:02,752 - root - INFO - Using pytorch cross attention
2024-10-25 23:48:03,873 - root - INFO - [Prompt Server] web root: F:\stable\ComfyUI\ComfyUI\web
2024-10-25 23:48:03,875 - root - INFO - Adding extra search path animatediff_models F:/stable/sd.webui/webui\extensions/sd-webui-animatediff/model
2024-10-25 23:48:03,875 - root - INFO - Adding extra search path animatediff_motion_lora F:/stable/sd.webui/webui\models/Lora
2024-10-25 23:48:03,875 - root - INFO - Adding extra search path checkpoints F:/stable/sd.webui/webui\models/Stable-diffusion
2024-10-25 23:48:03,875 - root - INFO - Adding extra search path configs F:/stable/sd.webui/webui\models/Stable-diffusion
2024-10-25 23:48:03,877 - root - INFO - Adding extra search path vae F:/stable/sd.webui/webui\models/VAE
2024-10-25 23:48:03,877 - root - INFO - Adding extra search path loras F:/stable/sd.webui/webui\models/Lora
2024-10-25 23:48:03,877 - root - INFO - Adding extra search path loras F:/stable/sd.webui/webui\models/LyCORIS
2024-10-25 23:48:03,877 - root - INFO - Adding extra search path upscale_models F:/stable/sd.webui/webui\models/ESRGAN
2024-10-25 23:48:03,877 - root - INFO - Adding extra search path upscale_models F:/stable/sd.webui/webui\models/RealESRGAN
2024-10-25 23:48:03,877 - root - INFO - Adding extra search path upscale_models F:/stable/sd.webui/webui\models/SwinIR
2024-10-25 23:48:03,877 - root - INFO - Adding extra search path embeddings F:/stable/sd.webui/webui\embeddings
2024-10-25 23:48:03,877 - root - INFO - Adding extra search path hypernetworks F:/stable/sd.webui/webui\models/hypernetworks
2024-10-25 23:48:03,877 - root - INFO - Adding extra search path controlnet F:/stable/sd.webui/webui\models/ControlNet
2024-10-25 23:48:05,724 - ComfyUI-CogVideoXWrapper.custom_cogvideox_transformer_3d - INFO - sageattn not found, using sdpa
2024-10-25 23:48:05,729 - ComfyUI-CogVideoXWrapper.cogvideox_fun.transformer_3d - INFO - sageattn not found, using sdpa
2024-10-25 23:48:05,731 - ComfyUI-CogVideoXWrapper.cogvideox_fun.fun_pab_transformer_3d - INFO - sageattn not found, using sdpa
2024-10-25 23:48:06,763 - root - INFO - Total VRAM 22528 MB, total RAM 65447 MB
2024-10-25 23:48:06,763 - root - INFO - pytorch version: 2.4.1+cu124
2024-10-25 23:48:06,765 - root - INFO - Set vram state to: NORMAL_VRAM
2024-10-25 23:48:06,765 - root - INFO - Device: cuda:0 NVIDIA GeForce RTX 2080 Ti : cudaMallocAsync
2024-10-25 23:48:17,698 - root - INFO - Import times for custom nodes:
2024-10-25 23:48:17,698 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\rembg-comfyui-node-better
2024-10-25 23:48:17,698 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\websocket_image_save.py
2024-10-25 23:48:17,700 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\rembg-comfyui-node
2024-10-25 23:48:17,700 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\AIGODLIKE-COMFYUI-TRANSLATION
2024-10-25 23:48:17,700 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\cg-use-everywhere
2024-10-25 23:48:17,700 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Alimama-ControlNet-compatible
2024-10-25 23:48:17,700 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Text_Translation
2024-10-25 23:48:17,700 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI_SLK_joy_caption_two
2024-10-25 23:48:17,700 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI-WD14-Tagger
2024-10-25 23:48:17,700 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI-enricos-nodes
2024-10-25 23:48:17,700 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI_CatVTON_Wrapper
2024-10-25 23:48:17,700 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Text_Image-Composite
2024-10-25 23:48:17,700 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Frame-Interpolation
2024-10-25 23:48:17,700 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI-GGUF
2024-10-25 23:48:17,701 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI-CogVideoXWrapper
2024-10-25 23:48:17,701 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI_essentials
2024-10-25 23:48:17,701 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI-MingNodes
2024-10-25 23:48:17,701 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI_MiniCPM-V-2_6-int4
2024-10-25 23:48:17,701 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI-KJNodes
2024-10-25 23:48:17,701 - root - INFO - 0.0 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\comfyui_controlnet_aux
2024-10-25 23:48:17,701 - root - INFO - 0.1 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI_LayerStyle
2024-10-25 23:48:17,701 - root - INFO - 0.1 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI-VideoHelperSuite
2024-10-25 23:48:17,701 - root - INFO - 0.1 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Phi
2024-10-25 23:48:17,701 - root - INFO - 0.4 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI-CogVideoX-MZ
2024-10-25 23:48:17,701 - root - INFO - 0.4 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Manager
2024-10-25 23:48:17,701 - root - INFO - 0.5 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Easy-Use
2024-10-25 23:48:17,701 - root - INFO - 0.5 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack
2024-10-25 23:48:17,701 - root - INFO - 0.8 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\batchImg-rembg-ComfyUI-nodes
2024-10-25 23:48:17,701 - root - INFO - 1.2 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI-PuLID-Flux
2024-10-25 23:48:17,701 - root - INFO - 1.6 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI_VLM_nodes
2024-10-25 23:48:17,701 - root - INFO - 1.8 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Custom_Nodes_AlekPet
2024-10-25 23:48:17,701 - root - INFO - 5.4 seconds: F:\stable\ComfyUI\ComfyUI\custom_nodes\comfyui_LLM_party
2024-10-25 23:48:17,701 - root - INFO -
2024-10-25 23:48:17,712 - root - INFO - Starting server

2024-10-25 23:48:17,712 - root - INFO - To see the GUI go to: http://127.0.0.1:8188
2024-10-25 23:48:41,431 - root - INFO - got prompt
2024-10-25 23:49:30,033 - root - ERROR - !!! Exception during processing !!! Allocation on device
2024-10-25 23:49:30,037 - root - ERROR - Traceback (most recent call last):
  File "F:\stable\ComfyUI\ComfyUI\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "F:\stable\ComfyUI\ComfyUI\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "F:\stable\ComfyUI\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "F:\stable\ComfyUI\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "F:\stable\ComfyUI\ComfyUI\custom_nodes\ComfyUI-CogVideoXWrapper\nodes.py", line 1147, in encode
    video_flow = vae.encode(video_flow).latent_dist.sample(generator) * vae.config.scaling_factor
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\diffusers\utils\accelerate_utils.py", line 46, in wrapper
    return method(self, *args, **kwargs)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\diffusers\models\autoencoders\autoencoder_kl_cogvideox.py", line 1130, in encode
    h = self._encode(x)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\diffusers\models\autoencoders\autoencoder_kl_cogvideox.py", line 1100, in _encode
    x_intermediate = self.encoder(x_intermediate)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\diffusers\models\autoencoders\autoencoder_kl_cogvideox.py", line 733, in forward
    hidden_states = down_block(hidden_states, temb, None)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\diffusers\models\autoencoders\autoencoder_kl_cogvideox.py", line 409, in forward
    hidden_states = resnet(hidden_states, temb, zq)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\diffusers\models\autoencoders\autoencoder_kl_cogvideox.py", line 291, in forward
    hidden_states = self.conv1(hidden_states)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\diffusers\models\autoencoders\autoencoder_kl_cogvideox.py", line 144, in forward
    output = self.conv(inputs)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\accelerate\hooks.py", line 169, in new_forward
    output = module._old_forward(*args, **kwargs)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\diffusers\models\autoencoders\autoencoder_kl_cogvideox.py", line 64, in forward
    return super().forward(input)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\conv.py", line 608, in forward
    return self._conv_forward(input, self.weight, self.bias)
  File "F:\stable\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\conv.py", line 603, in _conv_forward
    return F.conv3d(
torch.OutOfMemoryError: Allocation on device

2024-10-25 23:49:30,039 - root - ERROR - Got an OOM, unloading all loaded models.
2024-10-25 23:49:30,715 - root - INFO - Prompt executed in 49.25 seconds

## Attached Workflow
Please make sure that workflow does not contain any sensitive information such as API keys or passwords.

{"last_node_id":75,"last_link_id":179,"nodes":[{"id":65,"type":"LayerUtility: JoyCaption2ExtraOptions","pos":{"0":-2982,"1":-711},"size":{"0":371.701416015625,"1":466},"flags":{},"order":0,"mode":0,"inputs":[],"outputs":[{"name":"extra_option","type":"JoyCaption2ExtraOption","links":[168],"slot_index":0,"shape":3,"label":"extra_option"}],"properties":{"Node name for S&R":"LayerUtility: JoyCaption2ExtraOptions"},"widgets_values":[true,true,true,true,true,true,true,true,true,true,true,true,true,true,true,true,true,""],"color":"rgba(38, 73, 116, 0.7)"},{"id":60,"type":"Anything Everywhere","pos":{"0":-492,"1":-1448},"size":{"0":239.40000915527344,"1":26},"flags":{},"order":8,"mode":0,"inputs":[{"name":"COGVIDEOPIPE","type":"*","link":157,"label":"COGVIDEOPIPE","color_on":""}],"outputs":[],"properties":{"Node name for S&R":"Anything Everywhere","group_restricted":0,"color_restricted":0},"widgets_values":[]},{"id":49,"type":"WD14Tagger|pysssss","pos":{"0":-2454,"1":-1561},"size":{"0":315,"1":220},"flags":{},"order":10,"mode":4,"inputs":[{"name":"image","type":"IMAGE","link":177,"label":"image"}],"outputs":[{"name":"STRING","type":"STRING","links":[171],"slot_index":0,"shape":6,"label":"STRING"}],"properties":{"Node name for S&R":"WD14Tagger|pysssss"},"widgets_values":["wd-v1-4-moat-tagger-v2",0.35,0.85,false,false,""]},{"id":72,"type":"IFRNet VFI","pos":{"0":41,"1":-1119},"size":{"0":320.0760803222656,"1":153.15753173828125},"flags":{},"order":20,"mode":0,"inputs":[{"name":"frames","type":"IMAGE","link":173,"label":"frames"},{"name":"optional_interpolation_states","type":"INTERPOLATION_STATES","link":null,"label":"optional_interpolation_states"}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[174],"slot_index":0,"shape":3,"label":"IMAGE"}],"properties":{"Node name for S&R":"IFRNet VFI"},"widgets_values":["IFRNet_L_Vimeo90K.pth",10,2,1]},{"id":73,"type":"RIFE VFI","pos":{"0":45,"1":-814},"size":{"0":319.20001220703125,"1":200.15753173828125},"flags":{},"order":21,"mode":0,"inputs":[{"name":"frames","type":"IMAGE","link":174,"label":"frames"},{"name":"optional_interpolation_states","type":"INTERPOLATION_STATES","link":null,"label":"optional_interpolation_states"}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[175],"slot_index":0,"shape":3,"label":"IMAGE"}],"properties":{"Node name for S&R":"RIFE VFI"},"widgets_values":["rife47.pth",10,2,true,true,1]},{"id":67,"type":"PreviewTextNode","pos":{"0":-2007,"1":-1659},"size":{"0":427.54217529296875,"1":189.99021911621094},"flags":{},"order":14,"mode":4,"inputs":[{"name":"text","type":"STRING","link":171,"widget":{"name":"text"},"label":"text"}],"outputs":[{"name":"STRING","type":"STRING","links":null,"shape":3,"label":"STRING"}],"properties":{"Node name for S&R":"PreviewTextNode"},"widgets_values":["","1girl, solo, looking_at_viewer, smile, open_mouth, brown_hair, shirt, brown_eyes, flower, outdoors, shoes, teeth, blurry, tree, depth_of_field, blurry_background, animal, fangs, sunlight, plant, child, nature, forest, tiger",true]},{"id":47,"type":"LayerUtility: JoyCaption2","pos":{"0":-2484,"1":-1175},"size":{"0":412.09600830078125,"1":337.8868408203125},"flags":{},"order":9,"mode":4,"inputs":[{"name":"image","type":"IMAGE","link":167,"label":"image"},{"name":"extra_options","type":"JoyCaption2ExtraOption","link":168,"label":"extra_options"}],"outputs":[{"name":"text","type":"STRING","links":[170],"slot_index":0,"shape":6,"label":"text"}],"properties":{"Node name for S&R":"LayerUtility: 
JoyCaption2"},"widgets_values":["Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2","cuda","nf4","text_model","Descriptive","any","",300,0.9,0.6,false],"color":"rgba(38, 73, 116, 0.7)"},{"id":68,"type":"PreviewTextNode","pos":{"0":-2008,"1":-1174},"size":{"0":480.1921691894531,"1":365.24322509765625},"flags":{},"order":13,"mode":4,"inputs":[{"name":"text","type":"STRING","link":170,"widget":{"name":"text"},"label":"text"}],"outputs":[{"name":"STRING","type":"STRING","links":[],"slot_index":0,"shape":3,"label":"STRING"}],"properties":{"Node name for S&R":"PreviewTextNode"},"widgets_values":["","This captivating illustration, created in a whimsical and dreamlike style, features a young girl with curly, shoulder-length brown hair and a bright smile, playfully riding on the back of a majestic orange tiger with black stripes. The lighting is warm and golden, with soft, sunlit rays illuminating the scene from above, casting a gentle glow on the characters and their surroundings.\n\nThe camera angle is a slight bird's-eye view, providing an intimate and engaging perspective on the scene. There is no watermark visible in the image.\n\nThe aesthetic quality of this illustration is very high, with a high level of detail and texture in the characters' and environment's depiction. The composition style is reminiscent of a fairy tale, with the leading lines of the tiger's path and the girl's outstretched arms creating a sense of movement and joy.\n\nThe depth of field is shallow, with the girl and the tiger in sharp focus, while the background, including the lush foliage and tree trunks, is blurred. The background is softly focused, with a subtle gradient of focus from the front to the back of the scene.\n\nThe lighting is primarily natural, with the warm sunlight casting a magical ambiance on the scene. 
The image is suitable for all ages and is sfw.",true]},{"id":54,"type":"CLIPLoader","pos":{"0":-1904,"1":-1420},"size":{"0":315,"1":82},"flags":{},"order":1,"mode":0,"inputs":[],"outputs":[{"name":"CLIP","type":"CLIP","links":[122,123],"slot_index":0,"shape":3,"label":"CLIP"}],"properties":{"Node name for S&R":"CLIPLoader"},"widgets_values":["t5xxl_fp8_e4m3fn.safetensors","sd3"]},{"id":56,"type":"CogVideoTextEncode","pos":{"0":-1492,"1":-1171},"size":{"0":400,"1":200},"flags":{},"order":7,"mode":0,"inputs":[{"name":"clip","type":"CLIP","link":123,"label":"clip"}],"outputs":[{"name":"conditioning","type":"CONDITIONING","links":[126],"slot_index":0,"shape":3,"label":"conditioning"}],"properties":{"Node name for S&R":"CogVideoTextEncode"},"widgets_values":["low resolution",1,true,true]},{"id":52,"type":"ImageResizeKJ","pos":{"0":-2363,"1":-622},"size":{"0":315,"1":266},"flags":{},"order":15,"mode":0,"inputs":[{"name":"image","type":"IMAGE","link":120,"label":"image"},{"name":"get_image_size","type":"IMAGE","link":null,"label":"get_image_size"},{"name":"width_input","type":"INT","link":149,"widget":{"name":"width_input"},"label":"width_input"},{"name":"height_input","type":"INT","link":150,"widget":{"name":"height_input"},"label":"height_input"}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[121],"slot_index":0,"shape":3,"label":"IMAGE"},{"name":"width","type":"INT","links":null,"shape":3,"label":"width"},{"name":"height","type":"INT","links":null,"shape":3,"label":"height"}],"properties":{"Node name for S&R":"ImageResizeKJ"},"widgets_values":[720,480,"nearest-exact",false,2,0,0,"disabled"]},{"id":42,"type":"CogVideoSampler","pos":{"0":-899,"1":-1119},"size":{"0":405.5999755859375,"1":390},"flags":{},"order":18,"mode":0,"inputs":[{"name":"pipeline","type":"COGVIDEOPIPE","link":null,"label":"pipeline"},{"name":"positive","type":"CONDITIONING","link":125,"label":"positive"},{"name":"negative","type":"CONDITIONING","link":126,"label":"negative"},{"name":"samples","type":"LATENT","link":null,"label":"samples"},{"name":"image_cond_latents","type":"LATENT","link":128,"label":"image_cond_latents"},{"name":"context_options","type":"COGCONTEXT","link":null,"label":"context_options"},{"name":"controlnet","type":"COGVIDECONTROLNET","link":null,"label":"controlnet"},{"name":"tora_trajectory","type":"TORAFEATURES","link":129,"label":"tora_trajectory"}],"outputs":[{"name":"cogvideo_pipe","type":"COGVIDEOPIPE","links":[131],"slot_index":0,"shape":3,"label":"cogvideo_pipe"},{"name":"samples","type":"LATENT","links":[130],"slot_index":1,"shape":3,"label":"samples"}],"properties":{"Node name for S&R":"CogVideoSampler"},"widgets_values":[480,720,49,32,6,1057724427582977,"fixed","CogVideoXDPMScheduler",1]},{"id":55,"type":"CogVideoTextEncode","pos":{"0":-1459,"1":-1449},"size":{"0":400,"1":200},"flags":{},"order":6,"mode":0,"inputs":[{"name":"clip","type":"CLIP","link":122,"label":"clip"}],"outputs":[{"name":"conditioning","type":"CONDITIONING","links":[125],"slot_index":0,"shape":3,"label":"conditioning"}],"properties":{"Node name for S&R":"CogVideoTextEncode"},"widgets_values":["1girl, solo, looking_at_viewer, riding on a tiger, smile, open_mouth, brown_hair, shirt, brown_eyes, flower, outdoors, shoes, teeth, blurry, tree, depth_of_field, blurry_background, fangs, sunlight, plant, child, nature, wondering in 
forest",1,true,true]},{"id":43,"type":"CogVideoDecode","pos":{"0":-369,"1":-1119},"size":{"0":315,"1":198},"flags":{},"order":19,"mode":0,"inputs":[{"name":"pipeline","type":"COGVIDEOPIPE","link":131,"label":"pipeline"},{"name":"samples","type":"LATENT","link":130,"label":"samples"}],"outputs":[{"name":"images","type":"IMAGE","links":[173],"slot_index":0,"shape":3,"label":"images"}],"properties":{"Node name for S&R":"CogVideoDecode"},"widgets_values":[true,240,360,0.2,0.2,true]},{"id":74,"type":"VHS_VideoCombine","pos":{"0":490,"1":-813},"size":[598.7159423828125,637.8106282552083],"flags":{},"order":22,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":175,"label":"images"},{"name":"audio","type":"AUDIO","link":null,"label":"audio"},{"name":"meta_batch","type":"VHS_BatchManager","link":null,"label":"meta_batch"},{"name":"vae","type":"VAE","link":null,"label":"vae"}],"outputs":[{"name":"Filenames","type":"VHS_FILENAMES","links":null,"shape":3,"label":"Filenames"}],"properties":{"Node name for S&R":"VHS_VideoCombine"},"widgets_values":{"frame_rate":8,"loop_count":0,"filename_prefix":"AnimateDiff","format":"image/gif","pingpong":false,"save_output":true,"videopreview":{"hidden":false,"paused":false,"params":{"filename":"AnimateDiff_00001.gif","subfolder":"","type":"output","format":"image/gif","frame_rate":8},"muted":false}}},{"id":58,"type":"GetMaskSizeAndCount","pos":{"0":-2568,"1":-56},"size":{"0":264.5999755859375,"1":86},"flags":{},"order":11,"mode":0,"inputs":[{"name":"mask","type":"MASK","link":134,"label":"mask"}],"outputs":[{"name":"mask","type":"MASK","links":null,"shape":3,"label":"mask"},{"name":"720 width","type":"INT","links":[149,161],"slot_index":1,"shape":3,"label":"720 width"},{"name":"480 height","type":"INT","links":[150,162],"slot_index":2,"shape":3,"label":"480 height"},{"name":"16 count","type":"INT","links":[163],"slot_index":3,"shape":3,"label":"16 count"}],"properties":{"Node name for S&R":"GetMaskSizeAndCount"},"widgets_values":[]},{"id":40,"type":"DownloadAndLoadCogVideoModel","pos":{"0":-898,"1":-1447},"size":{"0":315,"1":194},"flags":{},"order":2,"mode":0,"inputs":[{"name":"pab_config","type":"PAB_CONFIG","link":null,"label":"pab_config"},{"name":"block_edit","type":"TRANSFORMERBLOCKS","link":null,"label":"block_edit"},{"name":"lora","type":"COGLORA","link":null,"label":"lora"}],"outputs":[{"name":"cogvideo_pipe","type":"COGVIDEOPIPE","links":[157],"slot_index":0,"shape":3,"label":"cogvideo_pipe"}],"properties":{"Node name for S&R":"DownloadAndLoadCogVideoModel"},"widgets_values":["THUDM/CogVideoX-5b-I2V","bf16","enabled","disabled",true]},{"id":51,"type":"LoadImage","pos":{"0":-2969,"1":-1117},"size":{"0":315,"1":314},"flags":{},"order":3,"mode":0,"inputs":[],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[120,167,177],"slot_index":0,"shape":3,"label":"IMAGE"},{"name":"MASK","type":"MASK","links":null,"shape":3,"label":"MASK"}],"properties":{"Node name for 
S&R":"LoadImage"},"widgets_values":["ComfyUI_00426_half.png","image"]},{"id":50,"type":"SplineEditor","pos":{"0":-3550,"1":-112},"size":{"0":765,"1":910},"flags":{},"order":4,"mode":0,"inputs":[{"name":"bg_image","type":"IMAGE","link":null,"label":"bg_image"}],"outputs":[{"name":"mask","type":"MASK","links":[134],"slot_index":0,"shape":3,"label":"mask"},{"name":"coord_str","type":"STRING","links":[178,179],"slot_index":1,"shape":3,"label":"coord_str"},{"name":"float","type":"FLOAT","links":null,"shape":3,"label":"float"},{"name":"count","type":"INT","links":null,"shape":3,"label":"count"},{"name":"normalized_str","type":"STRING","links":null,"shape":3,"label":"normalized_str"}],"properties":{"Node name for S&R":"SplineEditor"},"widgets_values":["[{\"x\":0,\"y\":479},{\"x\":197.22999999999936,\"y\":392.0399999999987},{\"x\":417.4499999999986,\"y\":465.8499999999985}]","[{\"x\":0,\"y\":479},{\"x\":26.474985122680664,\"y\":467.3269958496094},{\"x\":53.005950927734375,\"y\":455.7828063964844},{\"x\":79.89686584472656,\"y\":445.1090393066406},{\"x\":107.28059387207031,\"y\":435.7760314941406},{\"x\":135.17478942871094,\"y\":428.1083068847656},{\"x\":163.54454040527344,\"y\":422.4590148925781},{\"x\":192.2898406982422,\"y\":419.2518310546875},{\"x\":221.20960998535156,\"y\":418.9117736816406},{\"x\":250.04505920410156,\"y\":421.2009582519531},{\"x\":278.64068603515625,\"y\":425.5801086425781},{\"x\":306.9302978515625,\"y\":431.6367492675781},{\"x\":334.8965759277344,\"y\":439.0487365722656},{\"x\":362.5581359863281,\"y\":447.53082275390625},{\"x\":390.0157775878906,\"y\":456.655029296875},{\"x\":417.45001220703125,\"y\":465.8500061035156}]",720,480,16,"path","basis",0.5,1,"list",0,1,null,null,null]},{"id":75,"type":"PreviewTextNode","pos":{"0":-2559,"1":153},"size":{"0":535.185302734375,"1":250.39987182617188},"flags":{},"order":12,"mode":0,"inputs":[{"name":"text","type":"STRING","link":178,"widget":{"name":"text"},"label":"text"}],"outputs":[{"name":"STRING","type":"STRING","links":null,"shape":3,"label":"STRING"}],"properties":{"Node name for S&R":"PreviewTextNode"},"widgets_values":["","[{\"x\": 0, \"y\": 479}, {\"x\": 26, \"y\": 467}, {\"x\": 53, \"y\": 456}, {\"x\": 80, \"y\": 445}, {\"x\": 107, \"y\": 436}, {\"x\": 135, \"y\": 428}, {\"x\": 164, \"y\": 422}, {\"x\": 192, \"y\": 419}, {\"x\": 221, \"y\": 419}, {\"x\": 250, \"y\": 421}, {\"x\": 279, \"y\": 426}, {\"x\": 307, \"y\": 432}, {\"x\": 335, \"y\": 439}, {\"x\": 363, \"y\": 448}, {\"x\": 390, \"y\": 457}, {\"x\": 417, \"y\": 466}]",true]},{"id":39,"type":"DownloadAndLoadToraModel","pos":{"0":-1819,"1":-461},"size":{"0":315.9208984375,"1":154.21656799316406},"flags":{},"order":5,"mode":0,"inputs":[],"outputs":[{"name":"tora_model","type":"TORAMODEL","links":[124],"slot_index":0,"shape":3,"label":"tora_model"}],"properties":{"Node name for 
S&R":"DownloadAndLoadToraModel"},"widgets_values":["kijai/CogVideoX-5b-Tora"]},{"id":44,"type":"ToraEncodeTrajectory","pos":{"0":-1342,"1":-332},"size":{"0":355.20001220703125,"1":222},"flags":{},"order":16,"mode":0,"inputs":[{"name":"pipeline","type":"COGVIDEOPIPE","link":null,"label":"pipeline"},{"name":"tora_model","type":"TORAMODEL","link":124,"label":"tora_model"},{"name":"coordinates","type":"STRING","link":179,"widget":{"name":"coordinates"},"label":"coordinates"},{"name":"width","type":"INT","link":161,"widget":{"name":"width"},"label":"width"},{"name":"height","type":"INT","link":162,"widget":{"name":"height"},"label":"height"},{"name":"num_frames","type":"INT","link":163,"widget":{"name":"num_frames"},"label":"num_frames"}],"outputs":[{"name":"tora_trajectory","type":"TORAFEATURES","links":[129],"slot_index":0,"shape":3,"label":"tora_trajectory"},{"name":"video_flow_images","type":"IMAGE","links":null,"shape":3,"label":"video_flow_images"}],"properties":{"Node name for S&R":"ToraEncodeTrajectory"},"widgets_values":["",720,480,49,1,0,0.1]},{"id":53,"type":"CogVideoImageEncode","pos":{"0":-1866,"1":-711},"size":{"0":315,"1":122},"flags":{},"order":17,"mode":0,"inputs":[{"name":"pipeline","type":"COGVIDEOPIPE","link":null,"label":"pipeline"},{"name":"image","type":"IMAGE","link":121,"label":"image"},{"name":"mask","type":"MASK","link":null,"label":"mask"}],"outputs":[{"name":"samples","type":"LATENT","links":[128],"slot_index":0,"shape":3,"label":"samples"}],"properties":{"Node name for S&R":"CogVideoImageEncode"},"widgets_values":[16,true]}],"links":[[120,51,0,52,0,"IMAGE"],[121,52,0,53,1,"IMAGE"],[122,54,0,55,0,"CLIP"],[123,54,0,56,0,"CLIP"],[124,39,0,44,1,"TORAMODEL"],[125,55,0,42,1,"CONDITIONING"],[126,56,0,42,2,"CONDITIONING"],[128,53,0,42,4,"LATENT"],[129,44,0,42,7,"TORAFEATURES"],[130,42,1,43,1,"LATENT"],[131,42,0,43,0,"COGVIDEOPIPE"],[134,50,0,58,0,"MASK"],[149,58,1,52,2,"INT"],[150,58,2,52,3,"INT"],[157,40,0,60,0,"COGVIDEOPIPE"],[161,58,1,44,3,"INT"],[162,58,2,44,4,"INT"],[163,58,3,44,5,"INT"],[167,51,0,47,0,"IMAGE"],[168,65,0,47,1,"JoyCaption2ExtraOption"],[170,47,0,68,0,"STRING"],[171,49,0,67,0,"STRING"],[173,43,0,72,0,"IMAGE"],[174,72,0,73,0,"IMAGE"],[175,73,0,74,0,"IMAGE"],[177,51,0,49,0,"IMAGE"],[178,50,1,75,0,"STRING"],[179,50,1,44,2,"STRING"]],"groups":[],"config":{},"extra":{"ds":{"scale":0.620921323059155,"offset":[2575.440715857113,1537.1683251769118]},"groupNodes":{}},"version":0.4}



## Additional Context
(Please add any additional context or steps to reproduce the error here)

tdrminglin commented 3 weeks ago

Just copy the "enable VAE encode tiling" code and insert it before `video_flow = vae.encode(video_flow)...`. The VRAM use dropped from 22 GB+ to 4 GB after I did this.
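A minimal sketch of that workaround, assuming the encode in nodes.py still matches the line shown in the traceback above; the tile sizes used by the wrapper's own "enable VAE encode tiling" path may differ from the diffusers defaults that a bare `enable_tiling()` call gives you.

```python
# Hypothetical patch inside ToraEncodeTrajectory's encode() in
# ComfyUI-CogVideoXWrapper/nodes.py (exact line numbers vary between versions).
# Turning on tiling makes AutoencoderKLCogVideoX encode the 720x480 flow video
# tile by tile instead of in one full-resolution conv3d pass.
vae.enable_tiling()
# vae.enable_slicing()  # optional: also split the work along the batch dimension

video_flow = vae.encode(video_flow).latent_dist.sample(generator) * vae.config.scaling_factor
```

Encoding tile by tile keeps the encoder activations small, which is consistent with the drop from 22 GB+ to about 4 GB reported above.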

shirubei commented 3 weeks ago

#190

shirubei commented 3 weeks ago

Check #190 for solution.