-
Hello,
I hope everything is OK. I am trying to use the pretrained PSMNet model to make a prediction, but I got an error, as shown below:
Traceback (most recent call last):
File "predict.…
-
Run all the steps on multiple computers to see if everything works.
Ideally both CPU and GPU should work fine.
Check the behaviour under different NVIDIA driver and CUDA versions.
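For the cross-machine checks above, a small environment report makes it easy to compare setups. A minimal sketch (my own helper, not part of the repository; the `torch` import is guarded so the script also runs on machines without PyTorch):

```python
import platform
import sys

def environment_report():
    """Collect basic platform info, plus CUDA details when PyTorch is available."""
    report = {
        "os": platform.system(),
        "machine": platform.machine(),
        "python": sys.version.split()[0],
    }
    try:
        import torch  # optional: only present where PyTorch is installed
        report["torch"] = torch.__version__
        report["cuda_available"] = torch.cuda.is_available()
        if torch.cuda.is_available():
            report["cuda_version"] = torch.version.cuda
            report["gpu"] = torch.cuda.get_device_name(0)
    except ImportError:
        report["torch"] = None
    return report

if __name__ == "__main__":
    for key, value in environment_report().items():
        print(f"{key}: {value}")
```

Running this on each test machine and diffing the output quickly shows which driver/CUDA combination a failure correlates with.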
-
-
# Description
It seems the authors provide a pre-trained model on the KITTI dataset [here](https://github.com/JiaRenChang/PSMNet), so hopefully there are no problems there. We will start with im…
-
Hi!
Thanks for sharing such excellent work.
I am curious about the FLOPs of the different networks. Which libraries or tools did you use for the FLOPs calculation?
Is the FLOPs calculation code a…
-
This issue is related to #27; in case you didn't notice the question there, I am opening a new one.
I have set 1x3x576x960 as the input and count only conv layers (including both 2D and 3D conv and transpose co…
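For reference, conv-layer FLOPs can be counted directly from the layer shapes. A minimal sketch (my own helper, not the repository's counting code; here FLOPs are counted as 2 × multiply-adds, and the example layer shown is a hypothetical first conv, not necessarily the network's actual one):

```python
def conv2d_flops(c_in, c_out, k, h_in, w_in, stride=1, padding=0, groups=1):
    """FLOPs of a 2D convolution, counted as 2 * multiply-adds (MACs)."""
    h_out = (h_in + 2 * padding - k) // stride + 1
    w_out = (w_in + 2 * padding - k) // stride + 1
    macs = (c_in // groups) * c_out * k * k * h_out * w_out
    return 2 * macs

# Example: a 3x3 conv with 32 output channels applied to a 1x3x576x960 input.
flops = conv2d_flops(c_in=3, c_out=32, k=3, h_in=576, w_in=960, stride=1, padding=1)
print(f"{flops / 1e9:.2f} GFLOPs")  # → 0.96 GFLOPs
```

Summing this over every 2D/3D conv and transposed conv in the network (a 3D conv just gains one extra kernel and output dimension in the product) gives the total; tools such as thop or ptflops automate the same bookkeeping.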
-
When I trained with your simplified version, it produced poor performance:
![1108200878](https://user-images.githubusercontent.com/42108203/71723369-63da9500-2e67-11ea-9b7d-8a5488e665db.jpg)
-
such as PSMNet, GANet?
-
Thank you for your great work! I just tried your code and added --count_time to check the model's speed. However, I found there is no torch.cuda.synchronize() after the model runs. S…
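For context on this issue: CUDA kernels launch asynchronously, so without `torch.cuda.synchronize()` the timer can stop before the GPU has actually finished, and the measured time comes out too small. A hedged sketch of a timing helper (my own wrapper, not the repository's `--count_time` code) that synchronizes only when running on a GPU:

```python
import time
import torch

def timed_forward(model, inputs, device="cpu"):
    """Time one forward pass; synchronize so async CUDA kernels are counted."""
    model = model.to(device).eval()
    inputs = inputs.to(device)
    with torch.no_grad():
        if device != "cpu":
            torch.cuda.synchronize()  # drain pending kernels before starting the clock
        start = time.perf_counter()
        output = model(inputs)
        if device != "cpu":
            torch.cuda.synchronize()  # wait until the forward pass really finishes
        elapsed = time.perf_counter() - start
    return output, elapsed

# Usage with a tiny stand-in model (the real stereo network would go here):
model = torch.nn.Conv2d(3, 8, 3, padding=1)
out, seconds = timed_forward(model, torch.randn(1, 3, 64, 64))
print(f"forward pass: {seconds * 1000:.2f} ms")
```

On CPU the synchronize calls are skipped, since CPU ops are already synchronous.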
-
Hi, thanks for your code and paper. I have a question about the disparity before the loss calculation. In the paper you said you first upsample the disparity to the original resolution, then use it for t…
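One general point behind this question: when a disparity map is spatially upsampled by a factor `s`, the disparity values themselves must also be multiplied by `s`, because disparity is a horizontal offset measured in pixels at the map's own resolution. A minimal NumPy sketch of that scaling (an illustration of the general point, not the repository's implementation, which may interpolate the cost volume instead):

```python
import numpy as np

def upsample_disparity(disp, scale):
    """Nearest-neighbour upsample of a disparity map; the values are rescaled
    too, because disparity is a horizontal offset measured in pixels."""
    up = disp.repeat(scale, axis=0).repeat(scale, axis=1)  # spatial upsampling
    return up * scale                                      # rescale the offsets

quarter = np.array([[2.0, 4.0],
                    [6.0, 8.0]])  # quarter-resolution disparities
full = upsample_disparity(quarter, scale=4)
print(full.shape)   # (8, 8)
print(full[0, 0])   # 8.0  (2.0 * 4)
```

A 2-pixel offset at quarter resolution corresponds to an 8-pixel offset at full resolution, which is why the loss against full-resolution ground truth needs the rescaled values.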