Traceback (most recent call last):
  File "depth_predictor.py", line 50, in predict_depth
    depth = self.depth_model(rgb, self.intrinsics.unsqueeze(0))
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "anaconda3/envs/cfgs/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".cache/torch/hub/TRI-ML_vidar_main/vidar/arch/networks/perceiver/ZeroDepthNet.py", line 765, in forward
    encoded_data = self.encode(
                   ^^^^^^^^^^^^
  File ".cache/torch/hub/TRI-ML_vidar_main/vidar/arch/networks/perceiver/ZeroDepthNet.py", line 465, in encode
    embeddings = [self.encode_embeddings(data, embeddings, scene, idx=i)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".cache/torch/hub/TRI-ML_vidar_main/vidar/arch/networks/perceiver/ZeroDepthNet.py", line 465, in <listcomp>
    embeddings = [self.encode_embeddings(data, embeddings, scene, idx=i)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".cache/torch/hub/TRI-ML_vidar_main/vidar/arch/networks/perceiver/ZeroDepthNet.py", line 498, in encode_embeddings
    embeddings = self.merge_embeddings(embeddings, sources)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".cache/torch/hub/TRI-ML_vidar_main/vidar/arch/networks/perceiver/ZeroDepthNet.py", line 305, in merge_embeddings
    cat = torch.cat(cat, -1)
          ^^^^^^^^^^^^^^^^^^
RuntimeError: Sizes of tensors must match except in dimension 2. Expected size 28830 but got size 29234 for tensor number 1 in the list.
The input image is loaded from KITTI, and self.intrinsics is a 3x3 pinhole intrinsics matrix. Why does merge_embeddings fail with a tensor size mismatch (28830 vs. 29234)?
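The failing torch.cat can be reproduced in isolation: concatenation along the last dimension requires every other dimension to agree, so the two embedding tensors being merged must have been built with different numbers of spatial positions. A minimal sketch using the sizes from the traceback (the channel width 32 is made up for illustration):

```python
import torch

# Two embedding tensors whose token counts (dim 1) disagree,
# mirroring the sizes reported in the traceback.
a = torch.randn(1, 28830, 32)
b = torch.randn(1, 29234, 32)

error = None
try:
    # cat along dim -1 requires dims 0 and 1 to match across tensors
    torch.cat([a, b], -1)
except RuntimeError as e:
    error = e
print(error)

# Once both sources describe the same number of positions, the merge works:
b_fixed = torch.randn(1, 28830, 32)
merged = torch.cat([a, b_fixed], -1)
print(merged.shape)  # torch.Size([1, 28830, 64])
```

This points at the two embedding sources inside merge_embeddings (image features and the camera-ray embeddings derived from the intrinsics) disagreeing on the input's spatial resolution.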
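A common cause with perceiver-style depth networks is feeding a raw-resolution KITTI frame (e.g. 1242x375) when the model expects its input resized to a fixed resolution: the image-feature embeddings and the ray embeddings computed from the intrinsics then disagree on the number of spatial positions. If the image is resized, the intrinsics must be rescaled to match. A sketch under stated assumptions: scale_intrinsics is a hypothetical helper, 384x640 is an assumed target resolution (not confirmed from the vidar repo), and the intrinsics values are typical KITTI numbers:

```python
import torch
import torch.nn.functional as F

def scale_intrinsics(K: torch.Tensor, old_hw, new_hw) -> torch.Tensor:
    """Rescale a 3x3 pinhole intrinsics matrix for a resized image."""
    sy = new_hw[0] / old_hw[0]
    sx = new_hw[1] / old_hw[1]
    K = K.clone()
    K[0, 0] *= sx  # fx scales with width
    K[0, 2] *= sx  # cx scales with width
    K[1, 1] *= sy  # fy scales with height
    K[1, 2] *= sy  # cy scales with height
    return K

# Raw KITTI frame: 375x1242 (HxW); assumed network input: 384x640.
rgb = torch.randn(1, 3, 375, 1242)
K = torch.tensor([[721.5, 0.0, 609.6],
                  [0.0, 721.5, 172.9],
                  [0.0, 0.0, 1.0]])

rgb_resized = F.interpolate(rgb, size=(384, 640),
                            mode="bilinear", align_corners=False)
K_resized = scale_intrinsics(K, (375, 1242), (384, 640))
print(rgb_resized.shape)
```

Resizing the image without rescaling K (or vice versa) leaves the two embedding sources at inconsistent resolutions, which is exactly the kind of mismatch the traceback reports.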