Hello,
I've been fine-tuning SDXL on a 24GB 4090 for a while now with good results, but I'd like to move up to a newer model like Lumina.
As I understand it, you support single-GPU fine-tuning of your Lumina_T2X_Next model, but it currently doesn't fit on a 24GB GPU. Is that correct?
This image in your README suggests you support smaller components than the standard Lumina_T2X_Next configuration uses, such as CLIP-L/G, a 0.6B DiT, and the Stable Diffusion 1.5 VAE:
If I wanted to train Lumina-T2X on my 24GB GPU, would I need you to release a specific cut-down model built from these smaller components, or could I get this working myself with what you've already released?
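For context, here's the rough arithmetic behind my question. This is only a back-of-envelope sketch assuming full fine-tuning with AdamW (bf16 weights and gradients, fp32 optimizer moments and master weights, so roughly 16 bytes per parameter before activations), and my guess of ~5B parameters for the full model versus the 0.6B DiT; both sizes are my assumptions, not taken from your docs:

```python
# Back-of-envelope GPU memory for full fine-tuning with AdamW:
# roughly 16 bytes per parameter (2 bf16 weights + 2 bf16 grads
# + 8 fp32 Adam moments + 4 fp32 master weights), before activations.
def training_mem_gib(params_billions, bytes_per_param=16):
    """Rough training memory in GiB for weights + grads + optimizer state."""
    return params_billions * 1e9 * bytes_per_param / 2**30

# ~5B for the full DiT is my assumption, not a documented figure.
print(f"full model (~5B): {training_mem_gib(5.0):.1f} GiB")  # well beyond 24 GiB
print(f"0.6B DiT:         {training_mem_gib(0.6):.1f} GiB")  # fits with headroom
```

So if the 0.6B DiT really is supported, it looks like it should fit on my card even before applying tricks like gradient checkpointing or 8-bit optimizers.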
Thanks for any information.