ltdrdata / ComfyUI-Inspire-Pack

This repository offers various extension nodes for ComfyUI. Nodes here have different characteristics compared to those in the ComfyUI Impact Pack. The Impact Pack has become too large now...

Load Prompts From File (Inspire) uses batch size or batch count? #153

Closed CocoPuffLeo closed 2 months ago

CocoPuffLeo commented 2 months ago

Hello ltdrdata,

I am currently experimenting with your Load Prompts From File node in ComfyUI. I created some txt prompt files and have been generating images with them. When generating from a single txt file that contains multiple prompts, I noticed that clicking Queue Prompt only produces one item in the Running section, instead of queuing multiple generations behind it the way batch count does.

My main concern is: does Load Prompts From File generate images using batch size or batch count? I am planning to put a lot of different prompts in the txt file so it can generate images in sequence overnight. If the custom node uses batch size, I am worried that my PC won't have enough VRAM to handle the load and will end up crashing.

Thank you for creating this node; looking forward to your reply.

ltdrdata commented 2 months ago

Batch size is a concept that only applies to latents. In a latent, one dimension of the tensor is the batch dimension, and sampling is performed on that tensor as a single chunk.
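
To make this concrete, here is a minimal sketch in plain PyTorch (not Inspire Pack code), assuming ComfyUI's usual empty-latent layout. Batch size is just the first dimension of the latent tensor, and the sampler consumes that whole tensor in one pass, so VRAM use grows with it:

```python
import torch

# Minimal sketch, assuming the usual empty-latent layout:
# [batch_size, 4, height // 8, width // 8]. The batch size is only the
# first dimension of this tensor, and the whole tensor is sampled as
# one chunk, so memory cost scales with batch_size.
batch_size, height, width = 4, 512, 512
latent = torch.zeros([batch_size, 4, height // 8, width // 8])
print(latent.shape)  # torch.Size([4, 4, 64, 64])
```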

The prompts are simply a list of texts, so they cannot be put into batch form.
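
Conceptually they are handled one after another, roughly like the illustrative sketch below (not the node's actual implementation; reading one prompt per line here is purely for illustration, the real node uses its own file format):

```python
# Illustrative sketch only: prompts loaded from a txt file are just a
# list of strings, and each one drives its own generation in sequence,
# so there is no large batched tensor holding all of them at once.
with open("prompts.txt", "r", encoding="utf-8") as f:
    prompts = [line.strip() for line in f if line.strip()]

for prompt in prompts:
    # each prompt would be encoded and sampled independently, one after another
    print(f"generate image for: {prompt!r}")
```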