skrya opened this issue 3 months ago
In the README.md file, I list the command to test on the KITTI test set and also save the dense depth maps for submission. The command is

```shell
python test.py gpus=[0] name=BP_KITTI ++chpt=BP_KITTI \
net=PMP num_workers=4 \
data=KITTI data.testset.mode=test data.testset.height=352 \
test_batch_size=1 metric=RMSE ++save=true
```
You should find a "results" directory containing the depth completion results after the code runs.
Thanks for the response. I am curious to know how I can obtain the depth completion output given a single random sparse depth map (along with the corresponding image) such as the one shown above. I want to try it out beyond KITTI.
The simplest way is to save your RGB image and sparse depth map in the same format as KITTI (and under the same directory layout), so that you can use the above command directly without rewriting a dataloader.
Got it. Thanks!
Can you please let me know whether I can get depth completion results with just a sparse depth map, without the corresponding RGB image, intrinsics, or LiDAR points available?
Our method is for image-guided depth completion, in which an RGB image is required. There are some papers working on LiDAR-only depth completion, which may be suitable for you.
Thanks for your response.
Dear Authors,
Thanks for the amazing work!!
Can you please guide me on how I can achieve depth completion for a KITTI image that looks like the one below.![000380_up](https://github.com/kakaxi314/BP-Net/assets/7305388/758699b5-c009-4d4b-9ddb-72f80b2149b1)
Best!