Open SimeonZhang opened 1 week ago
https://github.com/CompVis/stable-diffusion/blob/21f890f9da3cfbeaba8e2ac3c425ee9e998d5229/ldm/modules/attention.py#L99
As I understand it, every attention implementation in this module except `SpatialSelfAttention` sets `bias=False` on its projection layers. Why is `SpatialSelfAttention` different?

Any explanation will be greatly appreciated.
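For context, the difference being asked about can be reproduced in a few lines. This is a hedged sketch, not the actual `ldm` code: it assumes the linked line builds the q/k/v projections of `SpatialSelfAttention` as 1×1 `nn.Conv2d` layers without passing `bias=...` (so PyTorch's default `bias=True` applies), while the other attention classes use `nn.Linear(..., bias=False)`. The channel size 64 is arbitrary.

```python
import torch.nn as nn

# 1x1 conv without an explicit bias argument: PyTorch defaults to bias=True,
# so this projection carries a learnable bias term.
q_conv = nn.Conv2d(64, 64, kernel_size=1, stride=1, padding=0)

# Linear projection with an explicit bias=False, as in the other
# attention implementations in the module.
q_lin = nn.Linear(64, 64, bias=False)

print(q_conv.bias is not None)  # True  - the conv projection has a bias
print(q_lin.bias is None)       # True  - the linear projection does not
```

So the question is effectively whether the bias terms in the conv-based projections are intentional or just an artifact of relying on the `nn.Conv2d` default.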