PJLab-ADG / neuralsim

neuralsim: 3D surface reconstruction and simulation based on 3D neural rendering.
MIT License

lidar points xyz in debug scene #24

Open alanxu89 opened 1 year ago

alanxu89 commented 1 year ago

If lidar rays_o and rays_d are both already defined in world coordinates, why do we still need to do a transformation here? If I don't comment out this line, the result for my own data looks very wrong in the debug scene:

https://github.com/PJLab-ADG/neuralsim/blob/19b5b33113d09676bc72dca7c94b640c73d99710/app/resources/scenes.py#L963

Geniussh commented 12 months ago

I believe the author made a mistake in the README. The coordinate system used to save the lidar npz files is still the local (per-frame) coordinate system, not the global frame. In fact, I cannot find any code that transforms rays_o and rays_d to world coordinates in preprocess.py: https://github.com/PJLab-ADG/neuralsim/blob/19b5b33113d09676bc72dca7c94b640c73d99710/dataio/autonomous_driving/waymo/preprocess.py#L334-L364 nor in waymo_dataset.py, even though the README above says they are in world coordinates.

Someone also confirmed that the lidar data should be saved in local coordinates in https://github.com/PJLab-ADG/neuralsim/issues/17#issuecomment-1705827360.

And the unit test code for lidar data in waymo_dataset.py also confirms that the saved data are in local coordinates: https://github.com/PJLab-ADG/neuralsim/blob/19b5b33113d09676bc72dca7c94b640c73d99710/dataio/autonomous_driving/waymo/waymo_dataset.py#L670-L672

But after reading a clearer explanation in https://github.com/PJLab-ADG/neuralsim/issues/11#issuecomment-1702160756, I think the issue is this: when the lidar data are stored in the LiDAR coordinate frame, l2w must be provided as a non-identity matrix; but when the lidar data are already stored in world coordinates, l2w should just be the identity. So in your case, maybe you should try setting your l2w to identity, since your data are already in world coordinates.
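To make the distinction concrete, here is a minimal sketch of how a lidar-to-world pose `l2w` is typically applied to rays (this is not the repo's actual code; `transform_rays` is a hypothetical helper): origins are transformed as points (rotation plus translation), while directions are transformed as vectors (rotation only). With `l2w` set to the identity, both come back unchanged, which is why data already saved in world coordinates needs an identity `l2w`.

```python
import numpy as np

def transform_rays(rays_o: np.ndarray, rays_d: np.ndarray, l2w: np.ndarray):
    """Transform lidar rays into the world frame with a 4x4 pose matrix.

    rays_o: (N, 3) ray origins, rays_d: (N, 3) ray directions,
    l2w: (4, 4) lidar-to-world transform.
    """
    R, t = l2w[:3, :3], l2w[:3, 3]
    # Origins are points: rotate, then translate.
    rays_o_w = rays_o @ R.T + t
    # Directions are free vectors: rotate only, no translation.
    rays_d_w = rays_d @ R.T
    return rays_o_w, rays_d_w

rays_o = np.array([[1.0, 0.0, 0.0]])
rays_d = np.array([[0.0, 0.0, 1.0]])

# Data already in world coordinates: identity l2w is a no-op.
o_id, d_id = transform_rays(rays_o, rays_d, np.eye(4))

# Data in the LiDAR frame: a non-identity l2w shifts the origins
# but leaves the (rotated) directions free of any translation.
l2w = np.eye(4)
l2w[:3, 3] = [5.0, 0.0, 0.0]
o_w, d_w = transform_rays(rays_o, rays_d, l2w)
```

Applying a non-identity `l2w` to rays that are already in world coordinates transforms them twice, which would explain the wrong debug-scene result reported above.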