Open nullx1337 opened 6 months ago
Hi, it seems that the image hasn't been preprocessed, and the depth image has been incorrectly loaded from the RGB image path, leading to mismatched dimensions. Please run `preprocess_image.py` to obtain the depth and normal images.
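To catch this failure mode early, one can validate the depth map before training starts. This is a hedged sketch (not part of the DreamCraft3D codebase): it assumes the depth map should be a single-channel array with the same height and width as the RGB image, which is what the mismatch described above violates.

```python
import numpy as np

def assert_depth_matches(rgb: np.ndarray, depth: np.ndarray) -> None:
    # Guard against a depth map accidentally loaded from the RGB path:
    # a proper depth map is single-channel (2-D) and has the same
    # height and width as the RGB image.
    if depth.ndim != 2:
        raise ValueError(f"expected single-channel depth, got shape {depth.shape}")
    if depth.shape != rgb.shape[:2]:
        raise ValueError(f"depth {depth.shape} != RGB {rgb.shape[:2]}")

# Dummy arrays standing in for loaded images.
rgb = np.zeros((64, 64, 3), dtype=np.uint8)
assert_depth_matches(rgb, np.zeros((64, 64)))  # passes: shapes agree
```

Passing the RGB array itself as `depth` would raise immediately, instead of failing later inside the loss computation.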
@MrTornado24 I have the same error.
RuntimeError: Predictions and targets are expected to have the same shape, but got torch.Size([3638]) and torch.Size([3638, 4]).
I first tried with the provided mushroom example and everything went fine. Now I'm trying with my own image and it doesn't work.
I have run `preprocess_image.py`. The first image is my input; the three following images are what the code created. But when I then proceed, the size error occurs.
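The shape mismatch in the error above can be reproduced outside the training code. This is a minimal sketch (not DreamCraft3D's actual loss code, and the 2 × 1819 dimensions are hypothetical): a single-channel target flattens to `(N,)`, while a 4-channel RGBA image loaded from the wrong path flattens to `(N, 4)`, mirroring `torch.Size([3638])` vs `torch.Size([3638, 4])`.

```python
import numpy as np

# Hypothetical dimensions chosen so that h * w = 3638.
h, w = 2, 1819
depth = np.zeros((h, w))     # expected: single-channel depth map
rgba = np.zeros((h, w, 4))   # actually loaded: 4-channel RGBA image

targets = depth.reshape(-1)      # shape (3638,)
preds = rgba.reshape(-1, 4)      # shape (3638, 4)
print(targets.shape, preds.shape)
```

Any loss that compares `preds` against `targets` element-wise will reject this pair, which is exactly what the RuntimeError reports.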
@nullx1337 Did you find a way around this eventually?
Update: I found what was wrong. I used my original image as the image path throughout, but after preprocessing, the image path should point to one of the three resulting files, the xxxxxx_rgba.png. A small but costly mistake; I looked past that the whole time, sorry. In your case, it seems your path points to "/root/DreamCraft3D/inputimges/falcon.jpg", while it should be "/root/DreamCraft3D/inputimges/falcon_rgba.png", but you may have figured that out yourself already.
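The fix above can be sketched as a small path helper. This is an illustrative snippet, not part of the repo; it assumes the `<stem>_rgba.png` naming convention mentioned in this thread.

```python
from pathlib import Path

def rgba_path(image_path: str) -> Path:
    # After preprocess_image.py runs, point the image path at the
    # generated "<stem>_rgba.png" file, not at the original image.
    p = Path(image_path)
    return p.with_name(p.stem + "_rgba.png")

print(rgba_path("/root/DreamCraft3D/inputimges/falcon.jpg"))
# e.g. /root/DreamCraft3D/inputimges/falcon_rgba.png (on POSIX)
```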
Hey, I'm trying to set up the model on Ubuntu 22.04 with an RTX 4090.
After running
with the following image:
I get this output, ending in this error:
RuntimeError: Predictions and targets are expected to have the same shape, but got torch.Size([16384]) and torch.Size([16384, 3]).
Here is the whole log of the process: