Recently I have been rushing a paper on 360 multi-viewpoint reconstruction with varying camera baselines for CVPR 2025. Please email me after that; at that point I will try to give you a new pretrained model along with test code, which predicts much better depth than PanoGRF.
My email is zhengchenecho@gmail.com
Thank you for your quick response. If I continue to work on this line of research, I may contact you via email in a few weeks. I wish you smooth writing and a successful submission.
Great work! Thank you for open-sourcing this impressive research.
I noticed that in your paper you mention testing depth estimation on the Matterport3D dataset with good results, but the data appears to be generated from a Habitat environment. I'm interested in testing multi-view stereo depth estimation on real-world data, specifically by running directly on panoramic images from the Matterport3D 360 dataset or the Stanford2D3D dataset. These datasets provide color panoramic images, real depth values, and true camera poses, which meet the model's input requirements.
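For context, this is roughly the per-view sample my loading script assembles (a minimal sketch only; the file paths, the `load_view` helper, and especially the `depth_scale` factor that converts the stored 16-bit depth to meters are my own assumptions about the dataset format, not anything taken from your code):

```python
import numpy as np
import cv2  # only used to read the equirectangular images


def load_view(rgb_path, depth_path, pose_path, depth_scale):
    """Assemble one panoramic view: RGB, metric depth, and a 4x4 pose.

    depth_scale converts the raw stored depth to meters (a dataset-specific
    assumption, e.g. 1/4000 or 1/512 depending on the dataset).
    """
    rgb = cv2.cvtColor(cv2.imread(rgb_path), cv2.COLOR_BGR2RGB) / 255.0
    depth = cv2.imread(depth_path, cv2.IMREAD_UNCHANGED).astype(np.float32) * depth_scale
    pose = np.loadtxt(pose_path).reshape(4, 4)  # assumed camera-to-world
    return {"rgb": rgb, "depth": depth, "pose": pose}
```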
I've written a script to load these datasets based on your code and attempted testing on the 18 test scenes of the Matterport3D 360 dataset. However, the results didn't meet my expectations: there seems to be a discrepancy between the scale of the predicted depth and the ground-truth scale. I've attached the evaluation metrics, the ground-truth depth, and the predicted-depth visualizations below for your reference.
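To narrow things down, I check whether the mismatch is explained by a single global factor before computing the metrics (a minimal sketch; `scale_check`, `pred_depth`, and `gt_depth` are just my own names, not anything from the PanoGRF code):

```python
import numpy as np


def scale_check(pred_depth, gt_depth, min_d=0.1, max_d=10.0):
    """Compare depth error before and after a global median-scale alignment.

    pred_depth, gt_depth: HxW arrays for one panorama (placeholder names).
    """
    mask = (gt_depth > min_d) & (gt_depth < max_d) & np.isfinite(pred_depth)
    pred, gt = pred_depth[mask], gt_depth[mask]

    scale = np.median(gt) / np.median(pred)  # single global scale factor
    abs_rel_raw = np.mean(np.abs(pred - gt) / gt)
    abs_rel_aligned = np.mean(np.abs(pred * scale - gt) / gt)

    print(f"median scale (gt/pred): {scale:.3f}")
    print(f"AbsRel raw: {abs_rel_raw:.3f} | AbsRel after scaling: {abs_rel_aligned:.3f}")
    return scale
```

If AbsRel drops sharply after the median alignment, the prediction would be consistent up to one global scale, which to me would point at a units or pose-translation issue rather than a general failure of the model.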
I suspect the issue might be due to the following reasons:
Pose reading errors: I thought there might be issues with how I read the pose data, but I have carefully checked the projection in the cost volume and the coordinate-transformation code, and I believe my input pose data is correct.
Variable baseline in real datasets: I noticed that in these real datasets the camera baseline isn't fixed for each test pair, and I recall you mentioned the baseline issue in your paper. I'm wondering if the model struggles to adapt to varying baselines. (A small sanity check covering both points is sketched after this list.)
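As the sanity check mentioned above, I also dump the baselines implied by the poses I feed into the cost volume (a rough sketch, assuming the poses are 4x4 camera-to-world matrices; `report_baselines` and `poses` are my own names):

```python
import numpy as np


def report_baselines(poses):
    """Print pairwise camera-to-camera distances for the source/target views.

    poses: list of 4x4 camera-to-world matrices (assumed convention; if the
    dataset stores world-to-camera, invert with np.linalg.inv first).
    """
    centers = [p[:3, 3] for p in poses]  # camera centers in the world frame
    for i in range(len(centers)):
        for j in range(i + 1, len(centers)):
            baseline = np.linalg.norm(centers[i] - centers[j])
            print(f"baseline between view {i} and view {j}: {baseline:.3f} m")
```

If the printed baselines are far from the baseline(s) used when rendering the Habitat training data, that would also support reason 2.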
I would greatly appreciate any guidance or suggestions you might have regarding this issue.
Thank you again for your excellent work!