bghira / SimpleTuner

A general fine-tuning kit geared toward diffusion models.
GNU Affero General Public License v3.0

If you use attention masking and LyCORIS, any config targeting `FluxTransformer2DModel` will match nothing, because the class actually used is `FluxTransformer2DModelWithMasking` #874

Closed: AmericanPresidentJimmyCarter closed this issue 1 month ago

AmericanPresidentJimmyCarter commented 1 month ago
```
2024-08-25 23:19:47|[LyCORIS]-INFO: Bypass mode is enabled
2024-08-25 23:19:47|[LyCORIS]-INFO: Full matrix mode for LoKr is enabled
2024-08-25 23:19:47|[LyCORIS]-INFO: Using rank adaptation algo: lokr
2024-08-25 23:19:47|[LyCORIS]-INFO: Use Dropout value: 0.0
2024-08-25 23:19:47|[LyCORIS]-INFO: Create LyCORIS Module
2024-08-25 23:19:47|[LyCORIS]-INFO: create LyCORIS: 0 modules.
2024-08-25 23:19:47|[LyCORIS]-INFO: module type table: {}
2024-08-25 23:19:47,515 [INFO] (__main__) LyCORIS network has been initialized with 0 parameters
```
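For anyone reading later: the `create LyCORIS: 0 modules.` line happens because the preset's targets are compared against module class names as strings, and the masking variant has a different class name. A minimal sketch of that failure mode, assuming only exact class-name matching (the `count_matching_modules` helper is hypothetical, not SimpleTuner or LyCORIS source):

```python
# Illustration only (not SimpleTuner or LyCORIS source): if target matching
# compares class-name strings exactly, a preset listing "FluxTransformer2DModel"
# finds nothing when the instantiated class is the masking subclass.
import torch.nn as nn


class FluxTransformer2DModel(nn.Module):
    pass


class FluxTransformer2DModelWithMasking(FluxTransformer2DModel):
    pass


def count_matching_modules(root: nn.Module, target_class_names: set[str]) -> int:
    # Hypothetical helper: match on the class-name string, not isinstance().
    return sum(1 for m in root.modules() if m.__class__.__name__ in target_class_names)


model = FluxTransformer2DModelWithMasking()
print(count_matching_modules(model, {"FluxTransformer2DModel"}))             # 0 -> "create LyCORIS: 0 modules."
print(count_matching_modules(model, {"FluxTransformer2DModelWithMasking"}))  # 1
```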
bghira commented 1 month ago

it seems like the sky is the limit for user errors in these complex configurations.

the default approach just has users target Attention and FeedForward for this reason; it's fairly universal. (see the sketch below)
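As a rough illustration of that default, here is a minimal sketch assuming the `lycoris` package's documented `LycorisNetwork.apply_preset` / `create_lycoris` API and diffusers' `Attention` / `FeedForward` blocks; the tiny wrapper model is only a stand-in so the snippet runs on its own, not the real Flux transformer:

```python
# Sketch: target module *types* (Attention, FeedForward) instead of the
# top-level transformer class, so the preset matches regardless of whether
# the wrapper is FluxTransformer2DModel or FluxTransformer2DModelWithMasking.
import torch.nn as nn
from diffusers.models.attention import FeedForward
from diffusers.models.attention_processor import Attention
from lycoris import LycorisNetwork, create_lycoris


class TinyBlock(nn.Module):
    def __init__(self, dim: int = 64):
        super().__init__()
        self.attn = Attention(query_dim=dim, heads=2, dim_head=32)
        self.ff = FeedForward(dim)


class FluxTransformer2DModelWithMasking(nn.Module):  # stand-in for the real class
    def __init__(self):
        super().__init__()
        self.blocks = nn.ModuleList([TinyBlock() for _ in range(2)])


transformer = FluxTransformer2DModelWithMasking()

# Attention and FeedForward exist inside every transformer block, so the
# preset is independent of the wrapper class name.
LycorisNetwork.apply_preset({"target_module": ["Attention", "FeedForward"]})

lycoris_net = create_lycoris(
    transformer,
    multiplier=1.0,
    linear_dim=10000,   # large dim to request full-matrix LoKr, as in the log above
    linear_alpha=1,
    algo="lokr",
)
lycoris_net.apply_to()
print(sum(p.numel() for p in lycoris_net.parameters()))  # non-zero this time
```

In SimpleTuner itself this normally lives in the JSON file passed via `--lycoris_config` rather than in Python, with the preset under its `apply_preset` key; check the repo's LyCORIS documentation for the exact schema.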

AmericanPresidentJimmyCarter commented 1 month ago

I will close this, but the issue will be here if anyone else hits it.