Open zesameri opened 1 year ago
Make sure the width/height is a multiple of 64 (512, 576, 640...)
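If it helps, here is a small helper that snaps an arbitrary resolution to the nearest multiple of 64 before it reaches the pipeline (the function name is my own, just a sketch):

```python
def round_to_multiple(value, base=64):
    """Snap `value` to the nearest multiple of `base` (at least `base`)."""
    return max(base, base * round(value / base))

# Snap an arbitrary requested resolution to pipeline-friendly dimensions.
width, height = round_to_multiple(500), round_to_multiple(770)
print(width, height)  # 512 768
```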
Why should the size / aspect ratio of the image be compromised...?
Not sure if this is still relevant, but I was also getting a similar error: Sizes of tensors must match except in dimension 1. Expected size 12 but got size 2 for tensor number 1 in the list
It was caused by the prompt being passed to my pipe call as a list of strings instead of a single string. E.g.:
images = pipe(
    prompt=prompt,                      # needs to be a single string
    num_images_per_prompt=args.number,  # create multiple images with this arg
    image=image_person,
    mask_image=image_mask,
    guidance_scale=7.5,
    generator=generator,
).images
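To spell out the failure mode: a list of N prompts sets the batch size to N, while num_images_per_prompt repeats a single prompt. A tiny guard (my own naming, just a sketch) can catch the list-of-strings mistake before the pipe call:

```python
def ensure_single_prompt(prompt):
    """Raise early if `prompt` is a sequence; the pipe call above expects one string."""
    if isinstance(prompt, (list, tuple)):
        raise TypeError(
            f"Expected a single prompt string, got a sequence of {len(prompt)}; "
            "use num_images_per_prompt to generate multiple images instead."
        )
    return prompt

print(ensure_single_prompt("a cat wearing a hat"))  # passes through unchanged
```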
Make sure the width/height is a multiple of 64 (512, 576, 640...)
You are a really wise man.
Make sure the width/height is a multiple of 64 (512, 576, 640...)
How do I do that?
In the pipe call, pass width=XXX and height=YYY
Make sure the width/height is a multiple of 64 (512, 576, 640...)
your comment should be the actual error message lol
Does it have to be 64? I tried 32 and it worked. What's the reason for this number?
Multiples of 8 usually work, but I would stick to 64/128 etc. It is also advised to keep the total size close to the original resolution, so don't use 1024 by 512 on a 512 by 512 model, for example, as it might introduce artifacts.
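On why the numbers 8 and 64 come up: the VAE downsamples images by a factor of 8, and the UNet (assuming three stride-2 downsampling stages, as in Stable Diffusion v1) halves the latent three more times, so 8 * 2**3 = 64 guarantees whole sizes at every level. Whether a non-64 multiple survives depends on how the model rounds; this sketch (my own helper, with stride-2 convs assumed to round up) traces the per-level sizes:

```python
def latent_sizes(pixels, vae_factor=8, unet_levels=3):
    """Spatial size at each UNet level for a given pixel dimension."""
    size = pixels // vae_factor
    sizes = [size]
    for _ in range(unet_levels):
        size = (size + 1) // 2  # stride-2 conv with padding rounds up
        sizes.append(size)
    return sizes

print(latent_sizes(512))  # [64, 32, 16, 8] -- every level halves evenly
print(latent_sizes(544))  # [68, 34, 17, 9] -- 544 is a multiple of 32, but 17 can't halve cleanly
```

Once a level lands on an odd size like 17, doubling it back on the upsampling path no longer matches the stored skip connection, which is where the mismatch can appear.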
(my code)

samples = sampler.decode(z_enc, c, t_enc, unconditional_guidance_scale=opt.scale)

Stack trace (w/o pytorch forward and grad calls):

File "/home/model-server/stable-diffusion/ldm/models/diffusion/ddim.py", line 238, in decode
    x_dec, _ = self.p_sample_ddim(x_dec, cond, ts, index=index, use_original_steps=use_original_steps)
File "/home/model-server/stable-diffusion/ldm/models/diffusion/ddim.py", line 177, in p_sample_ddim
    e_t_uncond, e_t = self.model.apply_model(x_in, t_in, c_in).chunk(2)
File "/home/model-server/stable-diffusion/ldm/models/diffusion/ddpm.py", line 987, in apply_model
    x_recon = self.model(x_noisy, t, **cond)
File "/home/model-server/stable-diffusion/ldm/models/diffusion/ddpm.py", line 1410, in forward
    out = self.diffusion_model(x, t, context=cc)
File "/home/model-server/stable-diffusion/ldm/modules/diffusionmodules/openaimodel.py", line 736, in forward
    h = th.cat([h, hs.pop()], dim=1)
RuntimeError: Sizes of tensors must match except in dimension 1. Expected size 16 but got size 15 for tensor number 1 in the list.
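The th.cat at the bottom of that trace concatenates an upsampled tensor with a stored skip connection, so the 16-vs-15 mismatch suggests an input side that is not a multiple of 64. This integer-arithmetic sketch (an illustration with assumed rounding behavior, not the actual model) shows how a 480-px side produces exactly those numbers:

```python
def trace_mismatch(pixels, vae_factor=8, levels=3):
    """Walk the assumed down/up path; return (upsampled, skip) at the first mismatch, else None."""
    size = pixels // vae_factor
    skips = [size]
    for _ in range(levels):
        size = (size + 1) // 2      # downsampling path stores skip sizes
        skips.append(size)
    skips.pop()                     # the bottleneck size is not concatenated
    for skip in reversed(skips):
        size = size * 2             # upsampling doubles the spatial size
        if size != skip:
            return size, skip       # th.cat would fail here
    return None

print(trace_mismatch(480))  # (16, 15): matches the RuntimeError above
print(trace_mismatch(512))  # None: every level lines up
```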