Closed: nitchith closed this issue 1 year ago
Why do you want to do this? The only thing I can say with confidence is that the code works as-is. If you make that change I'm not sure what will happen.
I just figured out the problem with using normalized directions here: using normalized directions makes some of the sampled points fall in front of the near plane and leaves very few sampled points near the far plane.
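A quick numeric check of the first claim (the camera setup is a hypothetical example, not the repo's ray generation): with a normalized off-axis direction and `t = near`, the sample's depth `|z|` comes out smaller than `near`, i.e. the point lands in front of the near plane.

```python
import numpy as np

near = 2.0
d = np.array([0.5, 0.5, -1.0])      # camera-space ray, z component -1
d_norm = d / np.linalg.norm(d)      # normalized direction

pt = near * d_norm                  # sample at t = near, origin at 0

# The point is at distance `near` from the origin, but its depth
# along -z is less than `near`, so it is in front of the near plane.
assert np.isclose(np.linalg.norm(pt), near)
assert -pt[2] < near
```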
@nitchith Hi nitchith, I don't know if you have seen the latest discussion about this, so I'll copy my comment from that issue:
Yes, it is true that `r = o + t*d` is shown in the paper, but here `t` is the distance (or travel time) along the ray, not along the -z axis (as shown in the picture in sillsill77's first reply). If you want to use that equation (where `d` is a normalized direction vector and `t` is a linear interpolation between the start and end distances), then `t` cannot simply be computed by interpolating between the near and far planes. If you still compute `t` through linear interpolation of near and far, the near and far surfaces of the sampling volume are actually parts of two spheres.
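The two-spheres point can be demonstrated numerically (the pinhole setup below is an illustrative assumption, not the repo's exact ray generation): at a fixed `t`, unnormalized directions with z component -1 put all samples on a plane, while normalized directions put them on a sphere of radius `t`.

```python
import numpy as np

o = np.zeros(3)                      # camera origin
dirs = np.array([[0.0, 0.0, -1.0],   # center ray
                 [0.5, 0.0, -1.0],   # off-axis ray
                 [0.5, 0.5, -1.0]])  # corner ray

t = 2.0  # one linearly interpolated sample between near and far

# Unnormalized directions (3rd component -1): r = o + t * d
pts_unnorm = o + t * dirs
# All samples share the same z => they lie on the plane z = -t
assert np.allclose(pts_unnorm[:, 2], -t)

# Normalized directions: r = o + t * d/|d|
d_norm = dirs / np.linalg.norm(dirs, axis=1, keepdims=True)
pts_norm = o + t * d_norm
# All samples are at distance t from the origin => they lie on a sphere
assert np.allclose(np.linalg.norm(pts_norm - o, axis=1), t)
# Their z coordinates differ across rays => no longer a plane
assert not np.allclose(pts_norm[:, 2], pts_norm[0, 2])
```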
If you want, you can refer to nerfstudio's implementation:

but the code in the repo actually shows us `r = o + z_vals * rays_d`, and `rays_d` is computed from `dirs`: you can see that `dirs` holds the view-direction vectors in camera space, and `rays_d` is the corresponding value in world space. So if we think about the problem in camera space, the 3rd dim of `dirs` is -1, which means the equation `r = o + z_vals * rays_d` just samples points linearly along the -z axis, from the near plane to the far plane.
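A minimal sketch of that convention (the pixel grid, focal length, and camera-to-world transform are made-up toy values, not the repo's): because every camera-space direction has z component -1, the sample at a given `z_val` lies exactly on the plane `z = -z_val`, so linearly spaced `z_vals` give plane-to-plane sampling between near and far.

```python
import numpy as np

H, W, focal = 4, 4, 3.0
i, j = np.meshgrid(np.arange(W, dtype=np.float32),
                   np.arange(H, dtype=np.float32), indexing='xy')
# dirs in camera space: the 3rd dim is -1 for every pixel
dirs = np.stack([(i - W * .5) / focal,
                 -(j - H * .5) / focal,
                 -np.ones_like(i)], axis=-1)

c2w = np.eye(3)                      # toy camera-to-world rotation
rays_o = np.zeros(3)                 # toy camera origin
rays_d = dirs @ c2w.T                # world-space directions

near, far = 2.0, 6.0
z_vals = np.linspace(near, far, 5)   # linear interpolation of depth

# r = o + z_vals * rays_d : one sample point per ray and per depth
pts = rays_o + z_vals[None, None, :, None] * rays_d[..., None, :]

# Since dirs[..., 2] == -1, all samples at a given z_val share z = -z_val:
# sampling is linear along -z, from the near plane to the far plane.
assert pts.shape == (H, W, 5, 3)
assert np.allclose(pts[..., 2], -z_vals)
```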
Hi,
While sampling points along rays, the code uses `rays.directions` for the direction vectors instead of `rays.viewdirs`:

https://github.com/google/mipnerf/blob/84c969e0a623edd183b75693aed72a7e7c22902d/internal/models.py#L70-L81

Original NeRF uses normalized direction vectors for the sampling points. Can you clarify whether we need to replace `rays.directions` with `rays.viewdirs`?
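For context on the relationship between the two fields: in NeRF-style codebases, `viewdirs` is conventionally the normalized copy of `directions` (the field names match the repo; the numeric values below are made up for illustration).

```python
import numpy as np

# A made-up unnormalized camera-space direction (z component -1)
directions = np.array([[0.5, -0.25, -1.0]])
# viewdirs: the unit-length version used for view-dependent color
viewdirs = directions / np.linalg.norm(directions, axis=-1, keepdims=True)

# viewdirs has unit length; directions keeps its camera-space scale
assert np.allclose(np.linalg.norm(viewdirs, axis=-1), 1.0)
assert np.linalg.norm(directions) > 1.0
```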