droid-dataset / droid

Distributed Robot Interaction Dataset.
https://droid-dataset.github.io/droid/

How to get world coordinate point cloud #15

Open HaoyiZhu opened 5 months ago

HaoyiZhu commented 5 months ago

Hi, thanks for the amazing work!

I'm wondering if you could kindly provide a demo script showing how to obtain a multi-view aligned point cloud in world coordinates?

I have tried, but the multi-view point clouds do not align well. I'm not sure whether the camera poses are inaccurate.

I read the intrinsics directly from your SVO reader script, and I convert the extrinsics to a matrix using the following function:

import numpy as np
from scipy.spatial.transform import Rotation

def ext2mat(ext):
    # ext = [x, y, z, roll, pitch, yaw]; build a 4x4 homogeneous cam2base transform
    extrinsic = np.eye(4)
    extrinsic[:3, :3] = Rotation.from_euler("xyz", ext[3:]).as_matrix()
    extrinsic[:3, 3] = ext[:3]
    return extrinsic

The results do not look good:

[Image: misaligned multi-view point clouds]

Thanks!
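For reference, here is a minimal sketch of the back-projection I am doing with those matrices (the depth map, intrinsics matrix, and helper name are placeholders of my own, not the exact reader API):

import numpy as np

def depth_to_world(depth, K, cam2base):
    # Back-project a depth map (H, W) in meters to an (N, 3) point cloud in the
    # robot base frame, given intrinsics K (3x3) and the 4x4 cam2base from ext2mat.
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.reshape(-1)
    x = (u.reshape(-1) - K[0, 2]) * z / K[0, 0]
    y = (v.reshape(-1) - K[1, 2]) * z / K[1, 1]
    pts_cam = np.stack([x, y, z, np.ones_like(z)], axis=0)  # (4, N) in camera frame
    pts_world = (cam2base @ pts_cam)[:3].T                  # (N, 3) in base frame
    return pts_world[z > 0]                                  # drop invalid depth

I then concatenate the per-camera clouds in the base frame; in principle they should overlap if the extrinsics were accurate.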

AlexanderKhazatsky commented 5 months ago

Some of the camera views are not perfectly calibrated. We're soon releasing metadata that states which views are calibrated within an error threshold and which are not, along with calibration values refined for multi-view alignment.

ElectronicElephant commented 3 months ago

Hi @AlexanderKhazatsky, I'm just wondering how the metadata release is coming along. Is there any ETA for it? Many thanks for your effort!

AlexanderKhazatsky commented 3 months ago

Sorry for the delay on this; we're just waiting for some TRI legal permissions to go through so they can share everything with us. We expect that to take one more week, and processing everything into RLDS format to take an additional week.

In the meantime, we can probably share the updated calibration values, if that is sufficient for your purposes?

AlexanderKhazatsky commented 3 months ago

But the depth won't be ready for a little bit longer...

jiaqchen commented 2 months ago

Hi @AlexanderKhazatsky, if I'm understanding correctly, it would be much appreciated if you could share the updated camera intrinsic/extrinsic calibration values! I've been visualizing some of the data, and some of it seems to have wonky camera extrinsics. Thanks!

AlexanderKhazatsky commented 2 months ago

What I can share is some cam2cam transformations, as well as some code for confirming which cam2base values are consistent, but we are still running it on the dataset to assess which trajectories are usable. So my guess is that it's in your best interest to wait for us to finish that up, unless you want to help with the process to speed it up!
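For anyone who wants to sanity-check their own episodes in the meantime, here is a rough sketch of that kind of consistency check, assuming you have cam2base matrices for two cameras and an independently measured cam2cam transform (the tolerances and function name here are placeholders, not our actual tooling):

import numpy as np

def cam2base_consistent(cam2base_a, cam2base_b, cam2cam_ab,
                        trans_tol=0.02, rot_tol_deg=2.0):
    # Relative a->b transform implied by the two cam2base matrices: a -> base -> b
    implied_ab = np.linalg.inv(cam2base_b) @ cam2base_a
    # Residual between the implied and the measured cam2cam transform
    err = np.linalg.inv(cam2cam_ab) @ implied_ab
    trans_err = np.linalg.norm(err[:3, 3])
    # Rotation error angle from the trace of the residual rotation matrix
    cos_angle = np.clip((np.trace(err[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    rot_err_deg = np.degrees(np.arccos(cos_angle))
    return trans_err < trans_tol and rot_err_deg < rot_tol_deg

If the relative transform implied by the two cam2base matrices disagrees with the measured cam2cam transform, at least one of the extrinsics is likely off.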