Closed: qsobad closed this issue 4 years ago

I wonder why alpha_out is not rectified by a ReLU, which Fig. 7 of the paper shows. In the network definition it is produced with no activation:

alpha_out = dense(1, act=None)(outputs)

The ReLU is applied to alpha, just in a different code location: https://github.com/bmild/nerf/blob/8edde335d2b18188769850b03c45515352d66b31/run_nerf.py#L110
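For reference, a minimal sketch of what happens at that location, assuming the linked line is the raw-to-alpha conversion inside the rendering code (a hedged reconstruction, not a verbatim copy of the repo):

```python
import tensorflow as tf

# Sketch of the raw-to-alpha conversion used during volume rendering:
# the density predicted by the network is left unactivated in the model
# definition and rectified here instead. The expression follows the
# raw2alpha pattern in run_nerf.py; variable names are illustrative.
def raw2alpha(raw_alpha, dists, act_fn=tf.nn.relu):
    # ReLU clamps the raw density to be non-negative, then the exponential
    # transmittance model converts it to an opacity in [0, 1).
    return 1.0 - tf.exp(-act_fn(raw_alpha) * dists)
```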
Ahh, I see. It's the same if I add the ReLU right after the linear layer, right?
Yep, it's the same.
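For anyone reading later, a quick sketch of that equivalence (hypothetical tensors and layer names, written against tf.keras rather than the repo's own dense helper):

```python
import tensorflow as tf

x = tf.random.normal([1024, 256])  # stand-in for features from the MLP trunk

# Variant A: linear layer with no activation, ReLU applied afterwards
# (this mirrors how the raw alpha is handled in the repo).
dense_a = tf.keras.layers.Dense(1, activation=None)
alpha_a = tf.nn.relu(dense_a(x))

# Variant B: the same linear layer with ReLU baked in as its activation.
dense_b = tf.keras.layers.Dense(1, activation="relu")
dense_b.build(x.shape)
dense_b.set_weights(dense_a.get_weights())  # copy weights for a fair comparison
alpha_b = dense_b(x)

# Both variants produce identical rectified densities.
assert float(tf.reduce_max(tf.abs(alpha_a - alpha_b))) < 1e-6
```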
Thanks, man!