vict0rsch closed this issue 3 years ago
This is due to task d.
A weird shape, `torch.Size([1, 1, 1000, 1000, 4])`, gives the same error:

```python
import torch
import torch.nn.functional as F

F.interpolate(torch.randn(1, 1, 1000, 1000, 4), size=(100, 100), mode="nearest")
# RuntimeError: It is expected output_size equals to 3, but got size 2
```
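The error makes sense once you notice the input is 5-D: PyTorch treats a 5-D tensor as volumetric data `(N, C, D, H, W)`, so `mode="nearest"` expects a 3-element `size`. A minimal sketch of one possible fix, assuming the trailing dimension of 4 is actually a channel axis that should be moved next to the batch dimension (not something the thread confirms):

```python
import torch
import torch.nn.functional as F

# Tensor loaded with an extra singleton dim and channels last: (1, 1, H, W, C).
# Passed as-is, PyTorch reads it as 5-D volumetric data (N, C, D, H, W),
# which is why a 2-element size raises the RuntimeError above.
x = torch.randn(1, 1, 1000, 1000, 4)

# Drop the singleton dim and move the trailing channels to the channel
# axis, giving a standard (N, C, H, W) image tensor.
x_4d = x.squeeze(1).permute(0, 3, 1, 2)  # (1, 4, 1000, 1000)
out = F.interpolate(x_4d, size=(100, 100), mode="nearest")
print(out.shape)  # torch.Size([1, 4, 100, 100])
```

Whether squeezing/permuting is the right fix depends on what that last dimension actually encodes, which is what the comments below try to pin down.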
Why do we have this input shape?
> Why do we have this input shape?
I was going to ask the same question...
Ok, I think I know what is happening. What did you use for the depth in WD? If you used MegaDepth predictions, then that's where the problem comes from. Right now, all simulated depth data is read as though it were coming from the Unity simulator, as 3-channel images.
Ok, it's a shame we don't have WD depth, as it's what brought @tianyu-z's best performance (on the beheaded omnigan :p).
@vict0rsch Sorry for the confusion: in the #12 experiment I didn't include the WD data. To make everything clear, when you open the link here, you will see two parts of the form. The experiments under `opt.lr` were not trained on the WD data; the experiments under `decoder` were trained on WD data.