kousw / experimental-consistory

MIT License

Missing call for the subject driven self attention #7

Open ValMystletainn opened 7 months ago

ValMystletainn commented 7 months ago

In the original ConsiStory paper, they propose the SDSA (subject-driven self-attention) trick to improve subject consistency. You have implemented a SelfSubjectReaderWriterMixin, changing UNet2dCondition into a modified SDUNet2dCondition.

But I cannot find any calls to methods from SelfSubjectReaderWriterMixin, so I suspect the SDSA trick is not active right now. I'm not sure about this finding, so I'm raising this issue.

kousw commented 7 months ago

Sorry for the confusion. The SDSA (Subject Driven Self Attention) mentioned in the original ConsiStory paper is implemented in the transformer block and the attention processor. This repository was customized from my other project, DreamTuner, and therefore the SelfSubjectReaderWriterMixin code is unused in this repository.
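For reference, the core idea of SDSA as described in the ConsiStory paper is that each image's queries attend not only to its own keys/values but also to the subject-masked keys/values of the other images in the batch, which is what ties the subject's appearance together across generations. The sketch below is a conceptual NumPy illustration of that mechanism, not this repository's implementation; the function name `sdsa` and the `subject_mask` argument are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sdsa(q, k, v, subject_mask):
    """Conceptual sketch of subject-driven self-attention (not the repo's code).

    q, k, v:       (B, N, d) per-image queries / keys / values.
    subject_mask:  (B, N) boolean, True where a patch belongs to the subject.

    Each image's queries attend to its own full K/V, extended with the
    subject patches of every other image in the batch, so the subject's
    features are shared across images.
    """
    B, N, d = q.shape
    out = np.empty_like(q)
    for i in range(B):
        # Start with image i's own keys/values.
        k_ext, v_ext = [k[i]], [v[i]]
        # Append subject-only keys/values from the other images.
        for j in range(B):
            if j != i:
                k_ext.append(k[j][subject_mask[j]])
                v_ext.append(v[j][subject_mask[j]])
        K = np.concatenate(k_ext, axis=0)
        V = np.concatenate(v_ext, axis=0)
        attn = softmax(q[i] @ K.T / np.sqrt(d))
        out[i] = attn @ V
    return out
```

With a batch of one, the extended K/V reduces to ordinary self-attention; in the actual repository this logic lives inside a custom attention processor rather than a standalone function.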