Closed CocoPuffLeo closed 2 months ago
Batch size is a concept that applies only to latents. One dimension of the latent tensor is the batch dimension, and sampling processes that whole tensor in a single chunk.
Prompts, on the other hand, are simply a list of texts and cannot be batched; they are processed one at a time.
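To make the distinction concrete, here is a minimal sketch (hypothetical function names, not ComfyUI's actual API): batch size means one tensor whose first dimension holds several latents in VRAM at once, while a prompt list is iterated sequentially, one generation at a time.

```python
def sample_batch(latents):
    # Batch size: every latent in the batch occupies VRAM simultaneously,
    # so memory use grows with the batch dimension.
    return [f"image_{i}" for i, _ in enumerate(latents)]

def sample_sequential(prompts):
    # Batch-count / prompts-from-file style: one prompt per generation,
    # so memory use stays roughly constant regardless of list length.
    results = []
    for prompt in prompts:
        results.append(f"image for {prompt!r}")
    return results

latent_batch = [[0.0] * 8 for _ in range(4)]  # batch dimension = 4
prompts = ["a cat", "a dog", "a bird"]

print(len(sample_batch(latent_batch)))    # 4 images produced together
print(len(sample_sequential(prompts)))    # 3 images produced one by one
```

Because Load Prompts From File follows the sequential pattern, a long prompt list trades time for memory rather than increasing VRAM pressure.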
Hello ltdrdata,
I am currently experimenting with your Load Prompts From File node in ComfyUI. I created some txt prompt files and have been generating images. When a single txt file contains multiple prompts, I noticed that clicking Queue Prompt produces only one entry in the Running section, instead of multiple generations queued behind it as with batch count.
My main question is: does Load Prompts From File generate images using batch size or batch count? I am planning to add a lot of different prompts to the txt file so it can generate images in sequence overnight. If the custom node uses batch size, I am worried that my PC won't have enough VRAM to handle the load and will end up crashing.
Thank you for creating this node; I look forward to your reply.