cc: @yiyixuxu
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
I couldn't understand this issue, but I don't think it is relevant to us. apply_forward_hook is a function in accelerate - do you want to open an issue on accelerate?
Describe the bug
After I place the different models of my own DiffusionPipeline-based pipeline on different devices, I attach an AlignDevicesHook to each model (such as the VAE), so that a model's inputs are moved to the same device as its weights.
However, when I call vae.encode or vae.decode, the error Expected all tensors to be on the same device, but found at least two devices still occurs unless I manually move the input to the right device before calling the function. So I checked the source code of AlignDevicesHook and AutoencoderKL, and found that vae.encode and vae.decode are wrapped by a decorator named apply_forward_hook.
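For context, a minimal sketch of the setup that triggers this (the checkpoint name and device are assumptions for illustration, not my actual pipeline):

```python
import torch
from accelerate.hooks import AlignDevicesHook, add_hook_to_module
from diffusers import AutoencoderKL

# Hypothetical example checkpoint; any AutoencoderKL weights would do.
vae = AutoencoderKL.from_pretrained("stabilityai/sd-vae-ft-mse")
add_hook_to_module(vae, AlignDevicesHook(execution_device="cuda:0"))

sample = torch.randn(1, 3, 256, 256)  # input still lives on the CPU
# Expected: the hook moves `sample` to cuda:0 before encode runs.
# Observed: "Expected all tensors to be on the same device, ..." unless the
# input is moved manually, e.g. vae.encode(sample.to("cuda:0")).
posterior = vae.encode(sample).latent_dist
```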
I think there is a mistake in apply_forward_hook: the line self._hf_hook.pre_forward(self) should be args, kwargs = self._hf_hook.pre_forward(self, *args, **kwargs), so that the inputs the hook moves to the execution device are actually passed on to the wrapped method. After this modification, the error disappears.
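To make the proposal concrete, here is a paraphrased sketch of the decorator with the change applied (the accelerate availability and version checks of the real decorator are omitted):

```python
import functools

def apply_forward_hook(method):
    @functools.wraps(method)
    def wrapper(self, *args, **kwargs):
        if hasattr(self, "_hf_hook") and hasattr(self._hf_hook, "pre_forward"):
            # Current behavior: pre_forward runs, but its return value (the
            # inputs moved to the execution device) is discarded:
            #   self._hf_hook.pre_forward(self)
            # Proposed: keep the moved inputs and pass them to the method.
            args, kwargs = self._hf_hook.pre_forward(self, *args, **kwargs)
        return method(self, *args, **kwargs)

    return wrapper
```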
Reproduction
N/A
Logs
No response
System Info
N/A
Who can help?
No response