Closed ZZQ987 closed 4 weeks ago
more details:

```python
torch.autograd.grad(loss_hfp.requires_grad_(True), [z])
```

```
RuntimeError: One of the differentiated Tensors appears to not have been used in the graph. Set allow_unused=True if this is the desired behavior.
```

Both `loss_hfp.requires_grad_(True).grad_fn` and `z.grad_fn` are `None`, so `requires_grad` seems to be ineffective.
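For context, a minimal sketch (with hypothetical tensor names, not the project's actual code) of how this exact error arises: calling `requires_grad_(True)` on a tensor that was produced inside `torch.no_grad()` or after `.detach()` does not retroactively build a graph, so `grad_fn` stays `None` and the input appears "not used in the graph".

```python
import torch

# Hypothetical reproduction: requires_grad_(True) on an already-detached
# tensor makes it a leaf but does NOT reconnect it to z's graph.
z = torch.randn(3, requires_grad=True)
loss = (z ** 2).sum().detach()   # .detach() severs the autograd graph
loss.requires_grad_(True)        # loss is now a leaf; no path back to z
assert loss.grad_fn is None and z.grad_fn is None  # matches the report

raised = False
try:
    torch.autograd.grad(loss, [z])
except RuntimeError:
    raised = True  # "appears to not have been used in the graph"

# Fix: compute the loss with the graph intact (no detach / no torch.no_grad()
# around the forward pass), with requires_grad set before the forward pass.
(grad_z,) = torch.autograd.grad((z ** 2).sum(), [z])
```

So the thing to check is where `loss_hfp` is computed: if that forward pass runs under `torch.no_grad()` or the tensor is detached beforehand, setting `requires_grad` afterwards cannot help.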
Thank you for your attention to our work.
The `disable_hf_guidance` parameter has been deprecated in the release version and should always be set to `True`. We found that online hf sampling was somewhat slow for practical use, so in this released code version, we pre-trained the same LoRA to speed up inference while maintaining high-frequency details. You can try setting `enable_decoder_cond_lora` to `False` in the config YAML if you want to see the results without hf-guidance.
Hello, great job! However, I encountered a small issue. When I set the `disable_hf_guidance` parameter to `False` to enable hf-guidance, I received the error above.
Have you encountered a similar situation? My environment configuration is the same as the ip2p project.