phineas-pta / comfy-trt-test

attempt to use TensorRT with ComfyUI
Apache License 2.0

KSampler error #10

Closed · dotada closed this 9 months ago

dotada commented 9 months ago

ok I know you and I just finished chatting in the previous issue, but here comes another one! I built an engine using these args: `--batch_min 1 --batch_opt 1 --batch_max 1 --height_min 512 --height_opt 1024 --height_max 1024 --width_min 512 --width_opt 1024 --width_max 1024 --token_count_min 75 --token_count_opt 150 --token_count_max 500 --ckpt_path ..\..\..\..\Models\StableDiffusion\anythingv5.safetensors` and the build went perfectly without a single error, but when I try to generate an image I get this error:

Error occurred when executing KSampler:

__len__() should return >= 0

File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\execution.py", line 152, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\execution.py", line 82, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\execution.py", line 75, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\nodes.py", line 1380, in sample
return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\nodes.py", line 1350, in common_ksampler
samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\comfy\sample.py", line 100, in sample
samples = sampler.sample(noise, positive_copy, negative_copy, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\comfy\samplers.py", line 713, in sample
return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\comfy\samplers.py", line 618, in sample
samples = sampler.sample(model_wrap, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\comfy\samplers.py", line 557, in sample
samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\venv\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\comfy\k_diffusion\sampling.py", line 154, in sample_euler_ancestral
denoised = model(x, sigmas[i] * s_in, **extra_args)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\venv\lib\site-packages\torch\nn\modules\module.py", line 1511, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\venv\lib\site-packages\torch\nn\modules\module.py", line 1520, in _call_impl
return forward_call(*args, **kwargs)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\comfy\samplers.py", line 281, in forward
out = self.inner_model(x, sigma, cond=cond, uncond=uncond, cond_scale=cond_scale, model_options=model_options, seed=seed)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\venv\lib\site-packages\torch\nn\modules\module.py", line 1511, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\venv\lib\site-packages\torch\nn\modules\module.py", line 1520, in _call_impl
return forward_call(*args, **kwargs)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\comfy\samplers.py", line 271, in forward
return self.apply_model(*args, **kwargs)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\comfy\samplers.py", line 268, in apply_model
out = sampling_function(self.inner_model, x, timestep, uncond, cond, cond_scale, model_options=model_options, seed=seed)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\comfy\samplers.py", line 248, in sampling_function
cond_pred, uncond_pred = calc_cond_uncond_batch(model, cond, uncond_, x, timestep, model_options)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\comfy\samplers.py", line 222, in calc_cond_uncond_batch
output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\custom_nodes\comfy-trt-test\comfy_trt\node_unet.py", line 148, in apply_model
model_output = self.diffusion_model.forward(x=xc, timesteps=t, context=context, **extra_conds).float()
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\custom_nodes\comfy-trt-test\comfy_trt\node_unet.py", line 207, in forward
self.engine.allocate_buffers(feed_dict)
File "E:\StabilityMatrix-win-x64\data\Packages\ComfyUI\custom_nodes\comfy-trt-test\comfy_trt\utilities.py", line 212, in allocate_buffers
tensor = torch.empty(tuple(shape), dtype=numpy_to_torch_dtype_dict[dtype]).to(device=device)

so uhh, have fun!
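For context, the ValueError itself comes from Python, not TensorRT: `tuple()` (and `len()`) raise `__len__() should return >= 0` whenever an object's `__len__` returns a negative number. A likely reading of the traceback is that `allocate_buffers` received a shape whose dimension count was reported as -1, which is what a failed shape query looks like when the requested input falls outside the engine's optimization profile. A minimal sketch of just the Python side (`BadDims` is a hypothetical stand-in, not part of comfy-trt-test):

```python
class BadDims:
    """Hypothetical stand-in for a TensorRT Dims whose shape query failed."""
    def __len__(self):
        return -1  # a failed query effectively reports a negative dimension count
    def __getitem__(self, i):
        raise IndexError

tuple(BadDims())  # ValueError: __len__() should return >= 0
```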

phineas-pta commented 9 months ago

very obscure error ...

can u share the comfy workflow?

dotada commented 9 months ago

workflow.json. Do be aware you'll probably be missing some custom nodes such as rgthree, One Button Prompt, WD14 Tagger, or comfy-image-saver. Quick edit: I forgot to add that I temporarily disconnected the 2K and 4K upscales when testing out TensorRT.

phineas-pta commented 9 months ago

sorry there are too many nodes & models i dont have 😂

my guess is the problem is control net (not yet supported in tensorrt, not even by the original a1111 extension)

it may be too much to ask, but if u really have a lot of free time, can u start from the basic workflow and slowly add things to see when it breaks?

dotada commented 9 months ago

Yeah, sure, I'll try later when I get home

dotada commented 9 months ago

[image: simplified workflow screenshot] that's about as simple as I can get it and it still errors

phineas-pta commented 9 months ago

hmmm it works on my pc 😂😂😂 i took the anything v5 same as u, with same batch_min, token_count_min, etc.

u have the latest comfy? what about torch version? cuda - cudnn - tensorrt version? no xformers? u see any onnx file and trt file created? on my end i have a 1.6gb onnx file + 1.6gb trt file
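For reference, all of those versions can be checked from Python; a minimal sketch, assuming the standard torch and tensorrt wheels installed in ComfyUI's venv:

```python
import importlib.util

import tensorrt
import torch

print("torch:   ", torch.__version__)              # e.g. 2.1.0+cu121
print("cuda:    ", torch.version.cuda)             # CUDA version torch was built against
print("cudnn:   ", torch.backends.cudnn.version()) # e.g. 8907 for cuDNN 8.9.x
print("tensorrt:", tensorrt.__version__)           # e.g. 9.3.0.post12.dev1
print("xformers:", importlib.util.find_spec("xformers") is not None)
```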

dotada commented 9 months ago

> hmmm it works on my pc 😂😂😂 i took the anything v5 same as u, with same batch_min, token_count_min, etc.
>
> u have the latest comfy? what about torch version? cuda - cudnn - tensorrt version? no xformers? u see any onnx file and trt file created? on my end i have a 1.6gb onnx file + 1.6gb trt file

latest comfy? yes, torch version would be 2.1.0+cu121, cuda would be 12.1.1_531.14, cudnn is 8.9.7.29_cuda12, tensorrt is 9.3.0.post12.dev1, xformers isn't installed, and both my onnx and trt files are 1.7gb

phineas-pta commented 9 months ago

hmmm all seem fine 🤔 ...

the only difference is i have comfy standalone, not in stability matrix, but it shouldn't be a problem

phineas-pta commented 9 months ago

maybe this is your case: NVIDIA/Stable-Diffusion-WebUI-TensorRT#230

try a lower token_count_max and/or a higher batch_max

no idea why it works on my pc though 😅😅😅
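One way to check whether a given prompt length or batch size actually fits the ranges baked into the engine is to read the optimization profile back out of the .trt file; a minimal sketch, assuming a recent TensorRT (8.6+/9.x) Python API and using anythingv5.trt as a placeholder filename:

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
runtime = trt.Runtime(logger)
with open("anythingv5.trt", "rb") as f:  # placeholder: path to the built engine
    engine = runtime.deserialize_cuda_engine(f.read())

for i in range(engine.num_io_tensors):
    name = engine.get_tensor_name(i)
    if engine.get_tensor_mode(name) == trt.TensorIOMode.INPUT:
        # [min, opt, max] shapes of optimization profile 0 for this input
        print(name, engine.get_tensor_profile_shape(name, 0))
```

If the shapes used at sampling time (batch, latent height/width, token count) fall outside those min/max bounds, the shape query in allocate_buffers can fail, which would line up with the error above.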

dotada commented 9 months ago

gonna try lowering token_count_max to 200 and increasing batch_max to 4

dotada commented 9 months ago

well well well then, one of those two was causing the error; the above settings seem to work

dotada commented 9 months ago

gonna close the issue and hopefully this may help someone in the future lol