Closed by amoghskanda 5 months ago
@amoghskanda I opened a request here: https://github.com/georghess/neurad-studio/issues/21. Do you know how to do actor edits in this neurad codebase? Thanks for your help!
@georghess @atonderski how do I get the lidar rendering for neurad? I went through the render file and there's no option to model the lidar data. The paper mentions lidar simulation is possible but nothing about how it's done. Thanks
@amoghskanda There is a lidar rendering option under the viewer. I played with it, but nothing got rendered. Have you tried it?
I got the lidar rendering working in the viewer, but honestly it looks a bit funky.
I haven't tried it because I wasn't able to render an interpolated trajectory after actor edits using the viewer. The viewer only helped with visualization, and in the render produced through the viewer, only the camera sensor was moving along the camera path; the rest of the actors were static. I want to render lidar from the codebase and was asking if that's possible. Anyway, I'll check out the viewer's lidar render.
> I got the lidar rendering working in the viewer, but honestly it looks a bit funky.
is this a screenshot from the viewer or were you able to download it?
This is a screenshot of the viewer. I looked through the codebase; only the viewer supports lidar rendering.
There is a LidarRenderer that takes input from the viewer controls and renders lidar; potentially this could be rewritten to take inputs from the code and render lidar:
https://github.com/georghess/neurad-studio/blob/12745d34426eb732259be35cb7bb4fb04c4df7a1/nerfstudio/viewer/render_state_machine.py#L361-L366
> Have you tried it?
No. I have not. I want to get some clarity from the authors first.
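(For anyone attempting the rewrite suggested above, here is a minimal sketch of querying a trained model along arbitrary lidar rays outside the viewer. It assumes the model follows nerfstudio's usual interface, i.e. it can be called on a RayBundle and returns an outputs dict with a "depth" entry; neurad's actual lidar path may differ, see the render_state_machine.py permalink above.)

```python
# Sketch: render expected depth along arbitrary lidar rays, bypassing the viewer.
# Assumes a nerfstudio-style model: callable on a RayBundle, returning a dict
# with a "depth" entry. Neurad's real lidar renderer may differ from this.
import torch
from nerfstudio.cameras.rays import RayBundle

@torch.no_grad()
def render_lidar_depth(model, origins: torch.Tensor, directions: torch.Tensor) -> torch.Tensor:
    """origins/directions: (N, 3) world-frame tensors, directions unit-norm."""
    bundle = RayBundle(
        origins=origins,
        directions=directions,
        pixel_area=torch.ones_like(origins[..., :1]),  # dummy; lidar rays have no pixel footprint
    )
    outputs = model(bundle)  # Model.forward applies the collider, then get_outputs
    return outputs["depth"].squeeze(-1)  # (N,) expected depth per ray
```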
Hey, how did you get your rendering results? After training, I ran run_viewer.py, but it only showed the training images and the lidar point cloud, with no rendered output.
We've only implemented a lidar renderer in the viewer currently, and not as a render command. For our experiments, we saved point clouds at eval by using the predicted depth and thresholding using the ray drop probability. I'll add a lidar renderer to our to-dos, but I'm not sure when it will be done. So, if you implement one, feel free to open a PR :)
@alchemz In what sense do the point clouds look funky? The image looks like you are rendering PCA of features rather than RGB, if that is what you are referring to.
Essentially, what I was trying to do is render lidar point clouds for actors with modified trajectories. I saved the depth and intensity data as .npy files and tried to render lidar point clouds from that, but I'm not sure if that's correct.
To clarify, our point clouds were generated as pc = origin + direction * expected_depth, and then masked as pc = pc[existence_prob > thresh].
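(A minimal numpy sketch of that reconstruction and masking step; the file names and the 0.5 threshold below are illustrative assumptions, not values from the repo:)

```python
import numpy as np

# Per-ray quantities, e.g. saved at eval time as .npy files:
#   origins:        (N, 3) ray origins in world coordinates
#   directions:     (N, 3) unit ray directions
#   expected_depth: (N,)   rendered depth per ray
#   ray_drop_prob:  (N,)   predicted probability that the ray returns nothing
origins = np.load("origins.npy")
directions = np.load("directions.npy")
expected_depth = np.load("depth.npy")
ray_drop_prob = np.load("ray_drop_prob.npy")

# pc = origin + direction * expected_depth
pc = origins + directions * expected_depth[:, None]

# Keep only points whose existence probability clears the threshold,
# i.e. pc = pc[existence_prob > thresh] with existence_prob = 1 - ray_drop_prob.
thresh = 0.5  # illustrative value
pc = pc[(1.0 - ray_drop_prob) > thresh]
```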
Hi,
I've now added an argument for rendering point clouds in the render script. However, I would view it more as an example. Feel free to open a pull request if you implement something more sophisticated. :)
Hi @carlinds, I found it useful to render the point clouds with the script.
In addition, I wonder if you would accept a small modification (PR) that also plots the GT lidar points for comparison in the script?
That would be very helpful. Thanks!
@Crescent-Saturn Go ahead and open a PR for any useful features :)
Yeah, in fact, there will be a PR for a Waymo dataparser soon which contains this modification as well. :rocket:
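(For reference, such a GT comparison could be as simple as overlaying the two clouds in a bird's-eye-view scatter. A sketch, assuming both clouds are (N, 3) numpy arrays in the same world frame; `pred_pc`/`gt_pc` are illustrative names:)

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_bev_comparison(pred_pc: np.ndarray, gt_pc: np.ndarray, out_path: str = "bev.png"):
    """Overlay rendered and ground-truth lidar points in bird's-eye view."""
    fig, ax = plt.subplots(figsize=(8, 8))
    ax.scatter(gt_pc[:, 0], gt_pc[:, 1], s=0.5, c="gray", label="gt lidar")
    ax.scatter(pred_pc[:, 0], pred_pc[:, 1], s=0.5, c="red", label="rendered")
    ax.set_aspect("equal")
    ax.set_xlabel("x [m]")
    ax.set_ylabel("y [m]")
    ax.legend(markerscale=10)
    fig.savefig(out_path, dpi=200)
```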
Hey! Thanks for making this open-source. I used neuradest for training, and I was editing actor trajectories by adding shifts to actors and rendering interpolated videos. The paper mentioned that lidar simulation is possible for neurad. Any insights on how to get lidar rendering for neuradest?