nv-tlabs / NKSR

[CVPR 2023 Highlight] Neural Kernel Surface Reconstruction
https://research.nvidia.com/labs/toronto-ai/NKSR

Question on the renderer #11

Closed AmmonZ closed 1 year ago

AmmonZ commented 1 year ago

Great work! I want to ask how you rendered the demos shown in the presentation slides, e.g. the dynamic scenes with zoom-in, the static object renders, and the sliced views with implicit fields.

heiwang1997 commented 1 year ago

Hi, thanks for your interest in our work. To render the scenes, we used the presets from here.

You can refer to the following code for rendering:

from pycg.isometry import Isometry
from pycg import vis, render, image, color

def demo_render_bunny():
    # Build a scene with +Y as the up axis and add the bunny mesh.
    scene = render.Scene(up_axis='+Y').add_object(bunny_y)
    scene.quick_camera(w=600, h=600, plane_angle=280.0)
    scene.preview(use_new_api=True)

    # Render using NKSR style (with a ground plane).
    render.ThemeNKSR(need_plane=True).apply_to(scene)
    nksr_rendering = scene.render_blender()
    # Sharpen the alpha channel, then composite over a solid background.
    nksr_rendering = image.alpha_compositing(
        image.gamma_transform(nksr_rendering, alpha_only=True, gamma=3.0),
        image.solid(nksr_rendering.shape[1], nksr_rendering.shape[0]))

    image.show(nksr_rendering)

if __name__ == '__main__':
    # Load example meshes (chair_z is the chair rotated from +Y-up to +Z-up).
    bunny_y = vis.from_file("assets/bunny.obj")
    chair_y = vis.from_file("assets/chair.ply")
    chair_z = Isometry.from_axis_angle('+X', 90.0) @ chair_y
    demo_render_bunny()
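In case the compositing step is unclear, here is a minimal NumPy sketch of what gamma-adjusting an alpha channel and compositing over a solid white background amount to. This is an illustrative assumption about what `image.gamma_transform(..., alpha_only=True)` and `image.alpha_compositing(..., image.solid(...))` compute, not the actual `pycg.image` implementation:

```python
import numpy as np

def gamma_alpha(rgba, gamma):
    # Raise only the alpha channel to the given power, leaving RGB untouched
    # (assumed behavior of gamma_transform with alpha_only=True).
    out = rgba.copy()
    out[..., 3] = out[..., 3] ** gamma
    return out

def alpha_over_solid(rgba, bg=(1.0, 1.0, 1.0)):
    # Standard "over" compositing of an RGBA image onto an opaque solid color:
    # out = fg * alpha + bg * (1 - alpha).
    a = rgba[..., 3:4]
    return rgba[..., :3] * a + np.asarray(bg) * (1.0 - a)

# Toy 4x4 RGBA rendering: dark gray foreground at 50% alpha.
rendering = np.zeros((4, 4, 4))
rendering[..., :3] = 0.2
rendering[..., 3] = 0.5

# gamma=3.0 pushes alpha 0.5 down to 0.125, so the background dominates.
composited = alpha_over_solid(gamma_alpha(rendering, 3.0))
```

With gamma > 1, semi-transparent pixels become more transparent, which is why the rendered object appears to fade softly into the solid backdrop.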

For the large scenes we used NVIDIA Omniverse. You can refer to https://www.nvidia.com/en-us/omniverse/ for more information.

heiwang1997 commented 1 year ago

Closing due to inactivity. Feel free to re-open if you still have a problem.