Closed · mikebilly closed this issue 7 months ago
I think the image's height (H) and width (W) need to be multiples of 64, or something like that.
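If that is the cause, one workaround is to round each dimension up to the next multiple of 64 and pad (or resize) the image before inference. A minimal sketch of the rounding arithmetic; the helper name next_multiple is mine, not from the repo (the padded result could then be applied with e.g. torch.nn.functional.pad):

```python
def next_multiple(n: int, base: int = 64) -> int:
    """Round n up to the nearest multiple of base."""
    return ((n + base - 1) // base) * base

# e.g. a 500x375 image would be padded/resized to 512x384
print(next_multiple(500), next_multiple(375))  # 512 384
```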
Hello, this error should be resolved now. You can experiment with different resolutions by also passing processing_res on the command line. Have fun!
Traceback (most recent call last):
  File "/content/depth-fm/inference.py", line 134, in <module>
    main(args)
  File "/content/depth-fm/inference.py", line 89, in main
    depth = model.predict_depth(im, num_steps=args.num_steps, ensemble_size=args.ensemble_size)
  File "/usr/local/lib/python3.10/dist-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/content/depth-fm/depthfm/dfm.py", line 97, in predict_depth
    return self.forward(ims, num_steps, ensemble_size)
  File "/content/depth-fm/depthfm/dfm.py", line 81, in forward
    depth_z = self.generate(x_source, num_steps=num_steps, context=context, context_ca=conditioning)
  File "/content/depth-fm/depthfm/dfm.py", line 51, in generate
    ode_results = odeint(ode_fn, z, t, **ode_kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torchdiffeq/_impl/odeint.py", line 77, in odeint
    solution = solver.integrate(t)
  File "/usr/local/lib/python3.10/dist-packages/torchdiffeq/_impl/solvers.py", line 105, in integrate
    dy, f0 = self._step_func(self.func, t0, dt, t1, y0)
  File "/usr/local/lib/python3.10/dist-packages/torchdiffeq/_impl/fixed_grid.py", line 10, in _step_func
    f0 = func(t0, y0, perturb=Perturb.NEXT if self.perturb else Perturb.NONE)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torchdiffeq/_impl/misc.py", line 189, in forward
    return self.base_func(t, y)
  File "/content/depth-fm/depthfm/dfm.py", line 34, in ode_fn
    return self.model(x=x, t=t, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/content/depth-fm/depthfm/unet/openaimodel.py", line 841, in forward
    h = th.cat([h, hs.pop()], dim=1)
RuntimeError: Sizes of tensors must match except in dimension 1. Expected size 20 but got size 19 for tensor number 1 in the list.
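The failing line is the UNet's skip connection: each downsampling stage halves the spatial size with rounding, and the upsampling path doubles it again, so any resolution not divisible by 2 to the power of the number of stages drifts by a pixel and th.cat rejects the mismatched tensors. A pure-Python sketch of that arithmetic; the ceil-halving convention and three-level depth are assumptions for illustration, not read from openaimodel.py:

```python
import math

def skip_sizes(h: int, levels: int) -> list:
    """Spatial sizes saved for skip connections on the way down (ceil-halving assumed)."""
    sizes = [h]
    for _ in range(levels):
        h = math.ceil(h / 2)
        sizes.append(h)
    return sizes

def up_sizes(sizes: list) -> list:
    """Spatial sizes produced on the way up by naive doubling of the bottleneck."""
    h = sizes[-1]
    out = []
    for _ in range(len(sizes) - 1):
        h *= 2
        out.append(h)
    return out

# 312 is divisible by 2**3: every upsampled size matches its saved skip tensor
print(skip_sizes(312, 3))               # [312, 156, 78, 39]
print(up_sizes(skip_sizes(312, 3)))     # [78, 156, 312]

# 313 is not: the upsampled sizes drift off by one, and torch.cat would fail
print(skip_sizes(313, 3))               # [313, 157, 79, 40]
print(up_sizes(skip_sizes(313, 3)))     # [80, 160, 320]
```

This is why forcing both dimensions to a multiple of the total downsampling factor (here conservatively 64, covering the VAE's 8x compression plus the UNet's stages) avoids the error.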