manurare / 360monodepth

Code release for 360monodepth. With our framework we achieve monocular depth estimation for high resolution 360° images based on aligning and blending perspective depth maps.
https://manurare.github.io/360monodepth/
MIT License

Question about the depth image in Replica360 dataset #12

Closed fangchuan closed 1 year ago

fangchuan commented 1 year ago

Hi 360monodepth team, awesome work! I followed the method mentioned in your work to produce my custom Replica360 dataset. The panoramic RGB image looks fine, but the depth image appears to be truncated, likely because of an inappropriate depth scale or depth image format. Specifically, I use the matryodshka-replica360 code to render panoramic depth images; since the depth image can be naturally unprojected onto a unit sphere, we can get the whole room point cloud at once.

./ReplicaSDK/ReplicaRendererDataset /media/ziqianbai/BACKPACK_DATA1/Replica_all/replica_v1/room_0/mesh.ply /media/ziqianbai/BACKPACK_DATA1/Replica_all/replica_v1/room_0/textures/ /media/ziqianbai/BACKPACK_DATA1/Replica_all/replica_v1/room_0/glass.sur ../glob/train/room_0_6dof.txt y /media/ziqianbai/BACKPACK_DATA1/Replica_all/replica_for_panonerf/room_0/ 1024 512 ../glob/pro2pos.txt

But the point cloud seems imprecise; the depth appears to be truncated, likely because of an inappropriate depth scale or depth image format (see the attached image).

The code I used to unproject the equirectangular depth image:

import numpy as np
import open3d as o3d
from PIL import Image, ImageOps


def test_spherical_depth():
    def get_unit_spherical_map():
        # Per-pixel unit ray directions for a 512x1024 equirectangular image.
        h = 512
        w = 1024
        # Polar angle (latitude), sampled at pixel centres.
        Theta = np.arange(h).reshape(h, 1) * np.pi / h + np.pi / h / 2
        Theta = np.repeat(Theta, w, axis=1)
        # Azimuth (longitude), sampled at pixel centres and shifted to [-pi, pi).
        Phi = np.arange(w).reshape(1, w) * 2 * np.pi / w + np.pi / w - np.pi
        Phi = -np.repeat(Phi, h, axis=0)

        X = np.expand_dims(np.sin(Theta) * np.sin(Phi), 2)
        Y = np.expand_dims(np.cos(Theta), 2)
        Z = np.expand_dims(np.sin(Theta) * np.cos(Phi), 2)
        unit_map = np.concatenate([X, Z, Y], axis=2)

        return unit_map

    depth_img_filepath = '/media/ziqianbai/BACKPACK_DATA1/Replica_all/replica_for_panonerf/room_0/room_0_0000_pos12.png'
    raw_depth_img = Image.open(depth_img_filepath)
    depth_img = ImageOps.grayscale(raw_depth_img)
    depth_img = np.asarray(depth_img)
    # depth_img = np.asarray(Image.open(depth_img_filepath))
    # Scale the depth values, then scale each unit ray by its depth.
    depth_img = np.expand_dims((depth_img * 16.0), axis=2)
    pointcloud = depth_img * get_unit_spherical_map()

    o3d_pointcloud = o3d.geometry.PointCloud()
    o3d_pointcloud.points = o3d.utility.Vector3dVector(pointcloud.reshape(-1, 3))
    o3d.io.write_point_cloud('/media/ziqianbai/BACKPACK_DATA1/Replica_all/replica_for_panonerf/room_0/room_0_0000_pcl_2.ply', o3d_pointcloud)
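
As a sanity check on the depth format, the sketch below reads the same PNG without the 8-bit grayscale conversion and prints the raw value range; the assumed 16-bit encoding and the millimetre-to-metre scale factor are only my guesses, not something confirmed by the renderer:

# Minimal sketch: inspect the raw depth values before any 8-bit conversion.
# Assumes the renderer writes a single-channel 16-bit PNG; the scale factor
# below is an assumption and may need adjusting to the renderer's depth units.
import numpy as np
from PIL import Image

depth_img_filepath = '/media/ziqianbai/BACKPACK_DATA1/Replica_all/replica_for_panonerf/room_0/room_0_0000_pos12.png'

raw = Image.open(depth_img_filepath)
print(raw.mode)  # e.g. 'I' or 'I;16' for a 16-bit PNG, 'L' for 8-bit

depth = np.asarray(raw).astype(np.float32)
print(depth.min(), depth.max())  # a max of 255 would suggest an 8-bit read

# depth_metric = depth / 1000.0  # assumed millimetre-to-metre scaling, unverified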

I suspect you may have run into similar problems. Could you help me figure out the cause of this behaviour? Thanks in advance! @cr333 @manurare

manurare commented 1 year ago

Hello,

I am not familiar with how Replica was rendered in MatryODShka. In 360monodepth we rendered both RGB and depth using the official Replica repo.

fangchuan commented 1 year ago

> Hello,
>
> I am not familiar with how Replica was rendered in MatryODShka. In 360monodepth we rendered both RGB and depth using the official Replica repo.

Does the official Replica repo offer a method to render panoramic depth images? I asked about MatryODShka because that work was cited in your paper.

manurare commented 1 year ago

You are right. We created our own renderer based on the official one with custom shaders for panorama rendering. You can find it here. Sorry if it is a bit messy.

fangchuan commented 1 year ago

> You are right. We created our own renderer based on the official one with custom shaders for panorama rendering. You can find it here. Sorry if it is a bit messy.

Great, thanks for your help!