Closed qiacheng closed 8 months ago
Remove the scaled_dot_product_flash_attention code path from attention_processor in diffusers.models so that the public torch 2.1.0 package can be used.
This op is supported in the beta OpenVINO toolkit.
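A minimal sketch of the intended replacement, assuming the change swaps the private flash-attention op for the public torch 2.x API `torch.nn.functional.scaled_dot_product_attention` (which dispatches to a flash-attention kernel internally when one is available); the `attention` helper and tensor shapes below are illustrative, not the actual diffusers code:

```python
import torch
import torch.nn.functional as F

def attention(query, key, value):
    # query/key/value shaped (batch, heads, seq_len, head_dim).
    # The public API picks an efficient backend (flash, memory-efficient,
    # or math) without calling a private aten op directly.
    return F.scaled_dot_product_attention(query, key, value)

q = torch.randn(1, 8, 16, 64)
out = attention(q, q, q)
print(out.shape)  # same shape as the query: (1, 8, 16, 64)
```

Using the public function keeps the module importable against the stock torch 2.1.0 wheel instead of requiring a build that exposes the private flash-attention op.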