SeargeDP / ComfyUI_Searge_LLM

Custom nodes for ComfyUI that utilize a language model to generate text-to-image prompts
MIT License

Any idea about minimum requirements? #4

Open JorgeR81 opened 2 weeks ago

JorgeR81 commented 2 weeks ago

I can run Flux, but it's slow. 

I don't have experience using LLMs locally. Is there a minimum requirement to run this LLM?

I wouldn't mind running only the LLM in a separate workflow, just to get the prompts.

I have:

SeargeDP commented 2 weeks ago

Should be fine. The lowest-end machine I tried it on only has 6 GB of VRAM, and it could run both the LLM node and a simple Flux image generation in the same workflow with the Q4 GGUF models.
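
For reference, a 7B model quantized to Q4 is roughly 4-5 GB on disk, which is why it still fits alongside a simple Flux workflow on 6 GB of VRAM. If you only want to generate prompts without ComfyUI at all (as asked above), below is a minimal sketch of loading a Q4 GGUF model with llama-cpp-python. The model filename, paths, and sampling settings are placeholders, and the choice of llama-cpp-python is an assumption based on the "Q4 GGUF" wording here, not a statement about how this node is actually implemented.

```python
# Minimal sketch (not from this repo): run a Q4 GGUF model on its own to turn
# a short idea into a detailed text-to-image prompt.
# Assumptions: llama-cpp-python is installed and a Q4_K_M GGUF file exists at
# the placeholder path below.
from llama_cpp import Llama

llm = Llama(
    model_path="models/llm_gguf/Mistral-7B-Instruct-v0.3.Q4_K_M.gguf",  # placeholder path
    n_gpu_layers=-1,  # offload all layers to the GPU; lower this if VRAM is tight
    n_ctx=4096,       # context window; smaller values use less memory
)

result = llm.create_chat_completion(
    messages=[
        {"role": "system",
         "content": "Expand the user's idea into a detailed text-to-image prompt."},
        {"role": "user",
         "content": "a lighthouse on a stormy coast at dusk"},
    ],
    max_tokens=256,
    temperature=0.7,
)

print(result["choices"][0]["message"]["content"])
```

If VRAM is the bottleneck, reducing `n_gpu_layers` keeps part of the model in system RAM at the cost of speed, which can help when the LLM and Flux have to share the same card.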