When I try to use this code with the SDXL model I get this:
Token indices sequence length is longer than the specified maximum sequence length for this model (176 > 77). Running this sequence through the model will result in indexing errors
The code I use is:
import torch
from compel import Compel, ReturnedEmbeddingsType

def prompt_worker_sdxl(prompt, negative_prompt, pipe_sdxl):
    # Use both SDXL tokenizers/text encoders; pooled output only from the second encoder.
    compel = Compel(
        tokenizer=[pipe_sdxl.tokenizer, pipe_sdxl.tokenizer_2],
        text_encoder=[pipe_sdxl.text_encoder, pipe_sdxl.text_encoder_2],
        returned_embeddings_type=ReturnedEmbeddingsType.PENULTIMATE_HIDDEN_STATES_NON_NORMALIZED,
        requires_pooled=[False, True],
        truncate_long_prompts=False,
    )
    with torch.no_grad():
        # Encode the positive and negative prompt as a batch of two.
        conditioning, pooled = compel([prompt, negative_prompt])
    return conditioning, pooled
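For reference, this is roughly how the returned embeddings are passed to the pipeline afterwards (a minimal sketch assuming a standard diffusers StableDiffusionXLPipeline call; the slicing follows from encoding [prompt, negative_prompt] as a batch of two, index 0 being the positive and index 1 the negative prompt):

    # Split the batched embeddings back into positive / negative halves
    # and hand them to the SDXL pipeline instead of raw prompt strings.
    image = pipe_sdxl(
        prompt_embeds=conditioning[0:1],
        pooled_prompt_embeds=pooled[0:1],
        negative_prompt_embeds=conditioning[1:2],
        negative_pooled_prompt_embeds=pooled[1:2],
    ).images[0]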
The pipelines I've tested are:
with these schedulers:
The Compel version is 2.0.2.