Closed · chenjjcccc closed this issue 8 months ago
The GEGLU module in ppdiffusers' attention.py is missing

```python
def gelu(self, gate):
    if gate.device.type != "mps":
        return F.gelu(gate)
```

but this is not part of the code diff. Does it need to be changed?

There also appear to be no unit tests for models/attention.py.
> The GEGLU module in ppdiffusers' attention.py is missing `def gelu(self, gate): if gate.device.type != "mps": return F.gelu(gate)`, but this is not part of the code diff. Does it need to be changed?

No, it does not need to be changed.
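The `mps` branch in the torch snippet is a workaround for PyTorch's Apple Metal backend, which has no counterpart in Paddle, so the Paddle-side GEGLU can apply GELU unconditionally. A minimal framework-free sketch of the GEGLU gating pattern (numpy and `math.erf` stand in for the framework ops; the class and weight wiring here are illustrative, not the ppdiffusers implementation):

```python
import math
import numpy as np

def gelu(x):
    # Exact GELU: 0.5 * x * (1 + erf(x / sqrt(2)))
    return 0.5 * x * (1.0 + np.vectorize(math.erf)(x / math.sqrt(2.0)))

class GEGLU:
    """Toy GEGLU: project to 2*dim_out, split, gate one half with GELU."""
    def __init__(self, dim_in, dim_out, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.standard_normal((dim_in, dim_out * 2)) * 0.02

    def __call__(self, x):
        hidden, gate = np.split(x @ self.w, 2, axis=-1)
        return hidden * gelu(gate)

layer = GEGLU(4, 8)
out = layer(np.ones((2, 4)))
print(out.shape)  # (2, 8)
```

No device check is needed because nothing here is backend-specific; that is the whole reason the `mps` guard does not carry over.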
> There also appear to be no unit tests for models/attention.py.

If torch's diffusers has this unit test, port it over; if not, there is no need to add one.
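If such a test does get ported, it would typically be a shape-and-sanity check on the module's forward pass. A rough sketch, where the stand-in layer, the tanh-GELU helper, and the test names are all hypothetical placeholders rather than the actual torch diffusers test:

```python
import unittest
import numpy as np

def gelu_tanh(x):
    # tanh approximation of GELU
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

class DummyGEGLU:
    """Stand-in for the GEGLU layer under test (hypothetical wiring)."""
    def __init__(self, dim_in, dim_out):
        rng = np.random.default_rng(0)
        self.w = rng.standard_normal((dim_in, dim_out * 2)) * 0.02

    def __call__(self, x):
        hidden, gate = np.split(x @ self.w, 2, axis=-1)
        return hidden * gelu_tanh(gate)

class GEGLUShapeTest(unittest.TestCase):
    def test_output_shape_and_finiteness(self):
        layer = DummyGEGLU(4, 8)
        out = layer(np.ones((2, 3, 4)))
        self.assertEqual(out.shape, (2, 3, 8))
        self.assertTrue(np.isfinite(out).all())
```

A real port would additionally compare outputs against recorded torch reference values, which this sketch omits.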
Do the PyTorch 2.0 classes in diffusers' attention_processor.py, such as AttnAddedKVProcessor2_0, AttnProcessor2_0, and LoRAAttnProcessor2_0, need to be implemented in ppdiffusers?
Without the 2_0 classes, the newly added definitions below cannot resolve:

```python
ADDED_KV_ATTENTION_PROCESSORS = (
    AttnAddedKVProcessor,
    SlicedAttnAddedKVProcessor,
    AttnAddedKVProcessor2_0,
    XFormersAttnAddedKVProcessor,
    LoRAAttnAddedKVProcessor,
)
CROSS_ATTENTION_PROCESSORS = (
    AttnProcessor,
    AttnProcessor2_0,
    XFormersAttnProcessor,
    SlicedAttnProcessor,
    LoRAAttnProcessor,
    LoRAAttnProcessor2_0,
    LoRAXFormersAttnProcessor,
)
AttentionProcessor = Union[
    AttnProcessor,
    AttnProcessor2_0,
    XFormersAttnProcessor,
    SlicedAttnProcessor,
    AttnAddedKVProcessor,
    SlicedAttnAddedKVProcessor,
    AttnAddedKVProcessor2_0,
    XFormersAttnAddedKVProcessor,
    CustomDiffusionAttnProcessor,
    CustomDiffusionXFormersAttnProcessor,
    # deprecated
    LoRAAttnProcessor,
    LoRAAttnProcessor2_0,
    LoRAXFormersAttnProcessor,
    LoRAAttnAddedKVProcessor,
]
```
In places this produces warnings like `"AttnProcessor2_0" is not defined`.
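One common way to keep such tuples and `Union` aliases resolvable when a backend-specific class has no separate port is to alias the 2_0 name to the base implementation. This is only an illustrative pattern, not necessarily what the PR does, and the class body below is a dummy:

```python
from typing import Union

class AttnProcessor:
    """Dummy stand-in for the default attention processor."""
    def __call__(self, hidden_states):
        return hidden_states

# Illustrative fallback: with no separate torch-2.0 (SDPA) port,
# alias the 2_0 name to the base class so the tuples still resolve.
AttnProcessor2_0 = AttnProcessor

CROSS_ATTENTION_PROCESSORS = (AttnProcessor, AttnProcessor2_0)
AttentionProcessor = Union[AttnProcessor, AttnProcessor2_0]

print(isinstance(AttnProcessor2_0(), CROSS_ATTENTION_PROCESSORS))  # True
```

The alias keeps `isinstance` checks against the tuples working, since both names refer to the same class object.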
@LokeZhou Will this PR, https://github.com/PaddlePaddle/PaddleMIX/pull/322, be merged soon?
It will be merged once CI passes.
> Do the PyTorch 2.0 classes in diffusers' attention_processor.py, such as AttnAddedKVProcessor2_0, AttnProcessor2_0, and LoRAAttnProcessor2_0, need to be implemented in ppdiffusers?

If the functionality is identical, no; if the 2.0 versions introduce new functionality, then yes.
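In torch diffusers, the 2_0 processors mainly swap the explicit attention math for the fused `torch.nn.functional.scaled_dot_product_attention`, so "identical functionality" can be verified numerically against the explicit formula. A framework-free sketch of that reference computation (numpy only; any fused implementation would be compared against this):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_reference(q, k, v):
    # Explicit softmax(Q K^T / sqrt(d)) V; both code paths must match this.
    scale = 1.0 / np.sqrt(q.shape[-1])
    return softmax(q @ k.swapaxes(-1, -2) * scale) @ v

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((2, 4, 8)) for _ in range(3))
out = attention_reference(q, k, v)
print(out.shape)  # (2, 4, 8)
```

If a fused kernel matches this reference within numerical tolerance, the 2_0 class is an optimization rather than new functionality, which is the distinction the answer above draws.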
Task completed by @co63oc; closing the issue.
PaddleMIX: upgrade the attention-related code in ppdiffusers

Task description

Task background

Completion steps

What to submit:

Topic update:

Please base the upgrade on the latest stable diffusers release, 0.23.1.