nickyreinert opened 1 month ago
When manually assembling the pipeline I can narrow it down to the transformer:
transformer = FluxTransformer2DModel.from_pretrained(bfl_repo, subfolder="transformer", torch_dtype=dtype, revision=revision).to('mps')
Which kind of makes sense: the transformer weights alone are about 24 GByte, which most probably leads to the memory exception:
https://huggingface.co/black-forest-labs/FLUX.1-schnell/tree/main/transformer
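For context, the ~24 GB checkpoint size lines up with a back-of-the-envelope estimate. FLUX.1 is reported to have roughly 12B parameters; at 16-bit precision each parameter takes 2 bytes (the parameter count here is an assumption for illustration, not taken from the checkpoint itself):

```python
# Back-of-the-envelope weight-memory estimate.
# Assumed: ~12 billion parameters, stored as float16/bfloat16
# (2 bytes per parameter).
params = 12e9
bytes_per_param = 2
gib = params * bytes_per_param / 1024**3
print(f"~{gib:.1f} GiB of transformer weights")  # roughly the ~24 GB seen on the Hub
```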
@nickyreinert I have recently released MFLUX, which can currently run the Schnell model on Apple Silicon using Apple's new MLX framework. With 36GB of memory it should work fine (I have personally tested it on my 32GB machine, but others have gotten it to work with 16GB as well)
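Part of why lower-memory machines can cope is quantization: mflux offers quantized weights (e.g. 8-bit and 4-bit variants), which shrink the 16-bit transformer considerably. A rough sketch of the arithmetic, again assuming ~12B parameters for illustration:

```python
# Rough weight-memory arithmetic for quantized variants.
# Assumed: ~12B parameters; bit-widths correspond to full-precision,
# 8-bit, and 4-bit quantized weights.
params = 12e9
for bits in (16, 8, 4):
    gib = params * bits / 8 / 1024**3
    print(f"{bits:>2}-bit: ~{gib:.1f} GiB")
```

At 4-bit the weights fit comfortably alongside other allocations even in 16GB of unified memory, which matches the reports above.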
@filipstrand Working like a charm! Comparison measurement: M3 Pro with 36 GB takes:

real 1m43.003s
user 0m12.328s
sys  0m48.153s
Is pipeline.safety_checker working for you?
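For what it's worth, not every diffusers pipeline exposes a safety_checker attribute (it comes from the Stable Diffusion pipelines, and the Flux pipeline does not appear to ship one), so a defensive lookup avoids an AttributeError. A minimal sketch, with `pipeline` standing in for whatever pipeline object you built:

```python
# Defensive lookup: returns None when the pipeline class does not
# define a safety_checker component, instead of raising AttributeError.
def get_safety_checker(pipeline):
    return getattr(pipeline, "safety_checker", None)

class _NoCheckerPipeline:  # stand-in for a pipeline without the attribute
    pass

print(get_safety_checker(_NoCheckerPipeline()))  # None
```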
I am trying to find the correct setup to run it on an M3 with 36GB of memory, without success. The error message is either this (when running via a Gradio UI, ref: pictero.com):
or this (running on Jupyter):
This is my pipeline config: