Thanks for making onediff possible. I am experimenting with my own custom attention processor, which I integrate into a diffusers-based UNet via `set_attn_processor`.
However, when I compile my model with `oneflow_compile`, it always returns a `NotImplementedError`. The detailed error message is attached below:
```
ERROR: Traceback (most recent call last):
.
.
.
  File "/venv/lib/python3.10/site-packages/diffusers/models/attention.py", line 329, in forward
    attn_output = self.attn1(
  File "/venv/lib/python3.10/site-packages/oneflow/nn/graph/proxy.py", line 188, in __call__
    result = self.__block_forward(*args, **kwargs)
  File "/venv/lib/python3.10/site-packages/oneflow/nn/graph/proxy.py", line 238, in __block_forward
    result = unbound_forward_of_module_instance(self, *args, **kwargs)
  File "/venv/lib/python3.10/site-packages/infer_compiler_registry/register_diffusers/attention_processor_oflow.py", line 363, in forward
    return self.processor(
  File "/venv/lib/python3.10/site-packages/oneflow/nn/graph/proxy.py", line 188, in __call__
    result = self.__block_forward(*args, **kwargs)
  File "/venv/lib/python3.10/site-packages/oneflow/nn/graph/proxy.py", line 238, in __block_forward
    result = unbound_forward_of_module_instance(self, *args, **kwargs)
  File "/venv/lib/python3.10/site-packages/oneflow/nn/modules/module.py", line 200, in forward
    raise NotImplementedError()
NotImplementedError
```
Is there a way I can use a custom attention processor and onediff together? Thanks!
Another (perhaps important) detail: my attention processor is a subclass of `nn.Module`. Could that matter?
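If it helps, here is a dependency-free sketch of the mismatch I suspect, based on the last frames of the traceback. All classes below are illustrative stand-ins, not the real oneflow/diffusers code: a processor that only defines `__call__` works in eager mode, but fails once a graph proxy dispatches to the class's unbound `forward`, which the base module never implements.

```python
class Module:
    """Stand-in for oneflow.nn.Module: forward() is unimplemented by default."""

    def forward(self, *args, **kwargs):
        raise NotImplementedError()

    def __call__(self, *args, **kwargs):
        return self.forward(*args, **kwargs)


class CallOnlyProcessor(Module):
    """A processor that overrides __call__ but never forward(),
    as many diffusers-style attention processors do."""

    def __call__(self, attn, hidden_states):
        return hidden_states * 2  # dummy stand-in for attention math


def proxy_call(module, *args, **kwargs):
    """Stand-in for the graph proxy: it invokes the unbound forward of the
    module's class, bypassing any custom __call__ on the instance."""
    return type(module).forward(module, *args, **kwargs)


proc = CallOnlyProcessor()
print(proc(None, 3))           # eager path goes through __call__ and works
try:
    proxy_call(proc, None, 3)  # proxied path hits the base forward()
except NotImplementedError:
    print("NotImplementedError, matching the traceback above")
```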