wchang22 / ReSTIR_DR

Source Code for SIGGRAPH 2023 Paper "Parameter-space ReSTIR for Differentiable and Inverse Rendering"
https://weschang.com/publications/restir-dr/

Any advice on how to use this for inverse rendering of talking heads? #3

Open oijoijcoiejoijce opened 1 year ago

oijoijcoiejoijce commented 1 year ago

Super cool work! An earlier paper ("Deferred Neural Rendering: Image Synthesis using Neural Textures") did this for talking heads. Any insights on how to use your code/methodology to render photorealistic portraits?

wchang22 commented 1 year ago

Hi, thanks for your interest in this work! Our method is a general strategy for accelerating inverse rendering, although the current implementation is limited to materials in physically-based pipelines.

If you simply want to, for example, use it to recover texture maps for a static face for view synthesis, this Mitsuba implementation should work (although you'll need multiple views, and I have not released multi-view support in this public version yet).
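
For reference, a minimal sketch of what such a texture-recovery loop might look like using the standard Mitsuba 3 AD workflow (this is plain Mitsuba 3, not the ReSTIR-DR-specific integrator; the scene file, parameter key, and view setup are hypothetical placeholders):

```python
import drjit as dr
import mitsuba as mi

mi.set_variant('cuda_ad_rgb')  # or 'llvm_ad_rgb' on CPU

# Hypothetical scene with several sensors (one per captured view)
scene = mi.load_file('face_scene.xml')
refs = [mi.TensorXf(mi.Bitmap(f'view_{i}.exr')) for i in range(8)]  # reference images

params = mi.traverse(scene)
key = 'face.bsdf.reflectance.data'  # hypothetical texture parameter key

opt = mi.ad.Adam(lr=0.02)
opt[key] = params[key]
params.update(opt)

for it in range(200):
    loss = mi.Float(0.0)
    for i, ref in enumerate(refs):
        # Render view i with gradient tracking enabled through `params`
        img = mi.render(scene, params, sensor=i, spp=8, seed=it)
        loss += dr.mean(dr.sqr(img - ref))
    # Backpropagate the image loss to the texture texels and take an Adam step
    dr.backward(loss)
    opt.step()
    params.update(opt)
```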

If instead you want to use the approach from the paper you mentioned, our algorithm should work on the neural textures as well, although it may require some additional implementation work. The key idea is that our method accelerates inverse rendering by reusing derivatives of the loss with respect to texels; in that paper, these would be the derivatives of the loss with respect to the neural feature texels.
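
To make the reuse target concrete, here is a hypothetical sketch in a deferred-neural-rendering-style setup (names and the shading placeholder are illustrative, not from this repo): the per-texel gradient of the image loss with respect to the learned feature map is exactly the quantity that would be estimated and reused.

```python
import torch

H, W, C = 512, 512, 16  # neural texture resolution and feature channels
neural_texture = torch.zeros(H, W, C, requires_grad=True)

def render_and_shade(tex):
    # Placeholder for: sample the texture at rasterized UVs, then run the
    # neural shading network to produce an image (as in deferred neural rendering).
    sampled = tex.mean(dim=(0, 1))          # stand-in for UV sampling + network
    return sampled.expand(H, W, C)[..., :3]

target = torch.rand(H, W, 3)
loss = torch.nn.functional.mse_loss(render_and_shade(neural_texture), target)
loss.backward()

# neural_texture.grad[i, j, :] is dL/d(texel ij) -- the per-texel derivative
# identified above as the reuse target for the resampling strategy.
print(neural_texture.grad.shape)            # torch.Size([512, 512, 16])
```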