damian0815 / compel

A prompting enhancement library for transformers-type text embedding systems
MIT License

Error when running compel-demo-sdxl.py example #58

Closed yegorsw closed 1 year ago

yegorsw commented 1 year ago

I've attempted to run the compel-demo-sdxl.py script, but after the pipeline components are all loaded it fails with this traceback:


```
Traceback (most recent call last):
  File "/home/yegor/git/compel/compel-demo-sdxl.py", line 56, in <module>
    images = run_and_batched()
  File "/home/yegor/git/compel/compel-demo-sdxl.py", line 43, in run_and_batched
    embeds, pooled = compel([pp, np])
  File "/home/yegor/git/compel/venv/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/yegor/git/compel/venv/lib/python3.10/site-packages/compel/compel.py", line 141, in __call__
    cond_tensor = self.pad_conditioning_tensors_to_same_length(conditionings=cond_tensor)
  File "/home/yegor/git/compel/venv/lib/python3.10/site-packages/compel/compel.py", line 260, in pad_conditioning_tensors_to_same_length
    return type(self)._pad_conditioning_tensors_to_same_length(conditionings, emptystring_conditioning=emptystring_conditioning)
  File "/home/yegor/git/compel/venv/lib/python3.10/site-packages/compel/compel.py", line 227, in _pad_conditioning_tensors_to_same_length
    raise ValueError(f"All conditioning tensors must have the same batch size ({c0_shape[0]}) and number of embeddings per token ({c0_shape[1]})")
ValueError: All conditioning tensors must have the same batch size (1) and number of embeddings per token (77)
```
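For context on what is failing here: `_pad_conditioning_tensors_to_same_length` extends shorter conditioning tensors with empty-prompt chunks until all prompts in a batch share the same token length, and it raises once the tensors disagree on batch size or embedding width. A rough NumPy sketch of that padding idea (the function name, shapes, and chunking here are illustrative, not compel's actual code):

```python
import numpy as np

def pad_to_same_length(conditionings, emptystring_conditioning):
    # Illustrative sketch only: extend each shorter conditioning tensor
    # with "empty prompt" chunks along the token axis until all tensors
    # reach the length of the longest one.
    max_len = max(c.shape[1] for c in conditionings)
    padded = []
    for c in conditionings:
        while c.shape[1] < max_len:
            c = np.concatenate([c, emptystring_conditioning], axis=1)
        padded.append(c)
    return padded

# e.g. a 77-token prompt and a 154-token prompt (batch 1, dim 768)
short = np.zeros((1, 77, 768))
long = np.zeros((1, 154, 768))
empty = np.zeros((1, 77, 768))
out = pad_to_same_length([short, long], empty)
# both tensors now have shape (1, 154, 768)
```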
yegorsw commented 1 year ago

Please ignore; I was using the version of compel installed via pip3 instead of compel cloned from the git repo.
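In case anyone else hits this with the demo scripts: one way to swap the PyPI release for the current repo code (assuming `pip` points at the same venv) might be:

```shell
# remove the PyPI-installed compel, then install straight from the repo
pip uninstall -y compel
pip install git+https://github.com/damian0815/compel.git
```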