cocktailpeanut / fluxgym

Dead simple FLUX LoRA training UI with LOW VRAM support

How to run the model in inference ? #35

Closed fefespn closed 3 weeks ago

fefespn commented 3 weeks ago

Hi, I am new to diffusion models, libraries, and LoRAs. How do I run the output LoRA weights with the FLUX model after training?

ForgetfulWasAria commented 3 weeks ago

You should just need to move the output model to wherever you normally put LoRAs for FLUX, then select it in your UI.

fefespn commented 3 weeks ago

Thanks for your reply. I haven't used FLUX before. How do you do it? The Hugging Face library? I tried to search the code for where the model is loaded but couldn't find it!
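If you want the Hugging Face route, here is a minimal sketch using the diffusers library (`FluxPipeline` and `load_lora_weights` are real diffusers APIs; the model ID is the official FLUX.1-dev repo, but the LoRA filename, prompt, and step count are illustrative assumptions):

```python
# Minimal sketch: load FLUX.1-dev with diffusers and apply a trained LoRA.
# Requires a GPU (or CPU offload) and access to the gated
# black-forest-labs/FLUX.1-dev repo on Hugging Face.
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    torch_dtype=torch.bfloat16,
)
pipe.enable_model_cpu_offload()  # helps on low-VRAM cards

# Path to the .safetensors file produced by training (illustrative name).
pipe.load_lora_weights("outputs/my-lora.safetensors")

image = pipe(
    "a photo of my trained subject",  # use your LoRA's trigger word here
    num_inference_steps=20,
    guidance_scale=3.5,
).images[0]
image.save("out.png")
```

Note this downloads the full FLUX.1-dev weights, so it is not a low-VRAM path by itself; `enable_model_cpu_offload()` trades speed for memory.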

forensicmike commented 3 weeks ago

> Thanks for your reply. I didn't use flux in the past, how do you do it ? huggingface library ? I tried to search in the code where does he load the model but couldn't find !

I found Forge WebUI easy to use: it's similar to A1111 but has a dedicated FLUX mode. Then you just go to the LoRA tab and click your LoRA to add it to your prompt for T2I.

One question, maybe someone else can answer: is the flux1-dev-bnb-nf4-v2 (quantized, 12 GB) checkpoint OK to use with our trained LoRA, or does it only work with the original flux1-dev?

fefespn commented 3 weeks ago

Thanks, I found the script. It's sd-scripts/flux_minimal_inference.py
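For anyone else landing here, an invocation might look roughly like this (a sketch from memory of sd-scripts' FLUX branch; flag names can differ between versions, so verify with `python sd-scripts/flux_minimal_inference.py --help` before relying on any of them; all file paths are illustrative):

```shell
# Sketch: generate an image with base FLUX plus the trained LoRA.
# Flag names and paths are assumptions; check --help for your sd-scripts version.
python sd-scripts/flux_minimal_inference.py \
  --ckpt_path models/flux1-dev.safetensors \
  --clip_l models/clip_l.safetensors \
  --t5xxl models/t5xxl_fp16.safetensors \
  --ae models/ae.safetensors \
  --lora_weights "outputs/my-lora.safetensors;1.0" \
  --prompt "a photo of my trained subject" \
  --steps 20
```

The `;1.0` suffix on the LoRA path is the multiplier syntax sd-scripts uses elsewhere; quoting it keeps the shell from treating `;` as a command separator.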

ygean commented 3 weeks ago

> thanks, I found the script. it's in sd-scripts/flux_minimal_inference.py

Hi, how much VRAM will this script use?