georghess / neurad-studio

[CVPR2024] NeuRAD: Neural Rendering for Autonomous Driving
https://research.zenseact.com/publications/neurad/
Apache License 2.0

Lidar rendering for neuradest #20

Closed amoghskanda closed 5 months ago

amoghskanda commented 6 months ago

Hey! Thanks for making this open-source. I used neuradest for training, and I was editing actor trajectories by adding shifts to actors and rendering interpolated videos. The paper mentions that lidar simulation is possible for NeuRAD. Any insights on how to get lidar rendering for neuradest?

alchemz commented 6 months ago

@amoghskanda I opened a request for this here: https://github.com/georghess/neurad-studio/issues/21. Do you know how to do actor edits in this neurad codebase? Thanks for your help!

amoghskanda commented 6 months ago

@georghess @atonderski how do I get lidar rendering for neurad? I went through the render script and there is no option to output lidar data. The paper mentions that lidar simulation is possible but says nothing about how it's done. Thanks

alchemz commented 6 months ago

[Screenshot 2024-05-16 at 9:58:06 PM]

@amoghskanda There is a lidar rendering option in the viewer. I played with it, but nothing got rendered. Have you tried it?

alchemz commented 6 months ago

I got the lidar rendering working in the viewer, but honestly it looks a bit funky.

[Screenshot 2024-05-16 at 10:05:38 PM]

amoghskanda commented 6 months ago

I haven't tried it because I wasn't able to render an interpolated trajectory after actor edits using the viewer. The viewer only helped with visualization, and in renders made through the viewer only the camera sensor moved along the camera path; the rest of the actors stayed static. I want to render lidar from the codebase and was asking if that's possible. Anyway, I will check out the viewer's lidar render.

amoghskanda commented 6 months ago

> I got the lidar rendering working in the viewer, but honestly it looks a bit funky.
>
> [Screenshot 2024-05-16 at 10:05:38 PM]

Is this a screenshot from the viewer, or were you able to download it?

alchemz commented 6 months ago

This is a screenshot of the viewer. I looked through the codebase; only the viewer supports lidar rendering.

There is a lidar renderer that takes input from the viewer controls and renders lidar; potentially this could be rewritten to take inputs from the code and render lidar. https://github.com/georghess/neurad-studio/blob/12745d34426eb732259be35cb7bb4fb04c4df7a1/nerfstudio/viewer/render_state_machine.py#L361-L366
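
Something like the following hypothetical sketch is what I have in mind. None of these names are the real neurad-studio API; the `model(origins, directions)` call and the `"depth"`/`"ray_drop_prob"` output keys are just stand-ins for whatever the viewer's lidar path calls internally:

```python
# Hypothetical sketch only -- these names are NOT the real neurad-studio API;
# they stand in for whatever render_state_machine.py's lidar path uses.
import torch

def render_lidar_frame(model, origins, directions, drop_threshold=0.5):
    """Query a trained model along lidar rays and return a point cloud.

    origins, directions: (N, 3) tensors describing the lidar rays for one
    frame, e.g. taken from the dataset's lidar poses.
    """
    with torch.no_grad():
        # Assumed to return per-ray expected depth and ray drop probability.
        outputs = model(origins, directions)
    depth = outputs["depth"]              # (N, 1), assumed output key
    drop_prob = outputs["ray_drop_prob"]  # (N,), assumed output key
    points = origins + directions * depth  # lift each ray to a 3D point
    return points[drop_prob < drop_threshold]
```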

amoghskanda commented 6 months ago

Have you tried it?

alchemz commented 6 months ago

No. I have not. I want to get some clarity from the authors first.

FCInter commented 6 months ago

> No. I have not. I want to get some clarity from the authors first.

How did you get your rendering results? After training, I ran run_viewer.py and it only showed the training images and the lidar point cloud; there was no rendered output.

georghess commented 5 months ago

We've only implemented a lidar renderer in the viewer currently, not as a render command. For our experiments, we saved point clouds at eval time by using the predicted depth and thresholding with the ray drop probability. I'll add a lidar renderer to our to-do list, but I'm not sure when it will be done. So, if you implement one, feel free to open a PR :)

@alchemz In what sense do the point clouds look funky? The image looks like you are rendering PCA of features rather than RGB, if that is what you are referring to.

amoghskanda commented 5 months ago

> We've only implemented a lidar renderer in the viewer currently, not as a render command. For our experiments, we saved point clouds at eval time by using the predicted depth and thresholding with the ray drop probability. I'll add a lidar renderer to our to-do list, but I'm not sure when it will be done. So, if you implement one, feel free to open a PR :)
>
> @alchemz In what sense do the point clouds look funky? The image looks like you are rendering PCA of features rather than RGB, if that is what you are referring to.

Essentially, what I was trying to do is render lidar point clouds for actors with modified trajectories. I saved the depth and intensity data as .npy files and tried to render lidar point clouds from that, but I'm not sure if that's correct.

georghess commented 5 months ago

To clarify, our point clouds were generated as `pc = origin + direction * expected_depth` and then masked with `pc = pc[existence_prob > thres]`.
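
As a minimal numpy sketch of that recipe, applied e.g. to per-ray arrays saved as .npy files as above (the file names, the `existence_prob = 1 - ray_drop_prob` relation, and the 0.5 threshold are illustrative assumptions, not values from our experiments):

```python
import numpy as np

# Assumed inputs: lidar ray origins/directions plus the model's predicted
# expected depth and ray drop probability, saved per ray at eval time.
origins = np.load("origins.npy")              # (N, 3)
directions = np.load("directions.npy")        # (N, 3), unit-norm ray directions
expected_depth = np.load("depth.npy")         # (N, 1)
ray_drop_prob = np.load("ray_drop_prob.npy")  # (N,)

# pc = origin + direction * expected_depth
pc = origins + directions * expected_depth

# Masking: pc = pc[existence_prob > thres]. Assuming the existence probability
# is the complement of the ray drop probability; the 0.5 threshold is arbitrary.
existence_prob = 1.0 - ray_drop_prob
pc = pc[existence_prob > 0.5]
```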

carlinds commented 5 months ago

Hi,

I've now added an argument for rendering point clouds in the render script. However, I would view it more as an example. Feel free to open a pull request if you implement something more sophisticated. :)

Crescent-Saturn commented 5 months ago

> Hi,
>
> I've now added an argument for rendering point clouds in the render script. However, I would view it more as an example. Feel free to open a pull request if you implement something more sophisticated. :)

Hi @carlinds, I found the script useful for rendering the point clouds.

In addition, I wonder if you would accept a small modification (PR) that also plots the ground-truth lidar points in the script for comparison?

That would be very helpful. Thanks!

georghess commented 5 months ago

@Crescent-Saturn Go ahead and open a PR for any useful features :)

Crescent-Saturn commented 5 months ago

> @Crescent-Saturn Go ahead and open a PR for any useful features :)

Yeah, in fact, there will be a PR for a Waymo dataparser soon, which contains this modification as well. :rocket: