Would you mind opening an issue on their model repository? 'Cause this code seems to be the official one.
@sayakpaul do you know why we make `text_encoder`/`tokenizer` optional and `text_encoder_2`/`tokenizer_2` required? I think it's more natural to have `text_encoder_2`/`tokenizer_2` optional, no?
does it make sense for them to be interchangeable? i.e.

```python
if self.tokenizer is not None and self.tokenizer_2 is not None:
    tokenizers = [self.tokenizer, self.tokenizer_2]
elif self.tokenizer is not None:
    tokenizers = [self.tokenizer]
else:
    tokenizers = [self.tokenizer_2]
```
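For completeness, the matching selection for the text encoders would presumably mirror this (a sketch only, assuming the attributes follow the same `text_encoder`/`text_encoder_2` naming used elsewhere in the pipeline; this is not the actual `encode_prompt()` code):

```python
# Sketch: pick whichever text encoders are present, mirroring the tokenizer logic above.
if self.text_encoder is not None and self.text_encoder_2 is not None:
    text_encoders = [self.text_encoder, self.text_encoder_2]
elif self.text_encoder is not None:
    text_encoders = [self.text_encoder]
else:
    text_encoders = [self.text_encoder_2]
```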
Yes. I will take a look.
@yiyixuxu I took a look. The changes you suggested warrant a library-wide rewrite of the SDXL `encode_prompt()` method in that case, I think.
I think one of the reasons why `text_encoder_2`/`tokenizer_2` wasn't optional is because of the refiner component. It doesn't have `tokenizer`/`text_encoder` but has `text_encoder_2`/`tokenizer_2`.
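For illustration, a minimal check against the public refiner checkpoint (assuming `stabilityai/stable-diffusion-xl-refiner-1.0`; this snippet is not from this thread) would show the first tokenizer/text encoder coming back as `None`:

```python
import torch
from diffusers import DiffusionPipeline

# The refiner repository only ships `tokenizer_2`/`text_encoder_2`,
# so the first tokenizer/text encoder slots load as None.
pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-refiner-1.0", torch_dtype=torch.float16
)

print(pipe.tokenizer, pipe.text_encoder)      # expected: None None
print(pipe.tokenizer_2, pipe.text_encoder_2)  # expected: loaded components
```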
Okay, it should have been `StableDiffusionInstructPix2PixPipeline` and not the SDXL variant.
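A minimal sketch of loading the HQ-Edit checkpoint with the non-SDXL pipeline (the repository id below is a placeholder assumption, not taken from this thread):

```python
import torch
from diffusers import StableDiffusionInstructPix2PixPipeline

# Hypothetical repository id; substitute the actual HQ-Edit checkpoint.
model_id = "HQ-Edit/HQ-Edit"

# Use the base InstructPix2Pix pipeline rather than the SDXL variant.
pipe = StableDiffusionInstructPix2PixPipeline.from_pretrained(
    model_id, torch_dtype=torch.float16
).to("cuda")
```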
I have the same question, has it been resolved?
Describe the bug
Reproduction
This is an official example from HQ-Edit