When I installed nerfacc with torch 2.0.1, `render_transmittance_from_density` crashed. Reading the code, I found that the crash is caused by the implementation of `render_transmittance_from_density`, listed below:
```python
def render_transmittance_from_density(
    t_starts: Tensor,
    t_ends: Tensor,
    sigmas: Tensor,
    packed_info: Optional[Tensor] = None,
    ray_indices: Optional[Tensor] = None,
    n_rays: Optional[int] = None,
    prefix_trans: Optional[Tensor] = None,
) -> Tuple[Tensor, Tensor]:
    if not is_cub_available() and packed_info is None:
        # Convert ray indices to packed info
        packed_info = pack_info(ray_indices, n_rays)
        ray_indices = None
    sigmas_dt = sigmas * (t_ends - t_starts)
    alphas = 1.0 - torch.exp(-sigmas_dt)
    trans = torch.exp(
        -exclusive_sum(sigmas_dt, packed_info=packed_info, indices=ray_indices)
    )
    if prefix_trans is not None:
        trans = trans * prefix_trans
    return trans, alphas
```
When I call it with only the required arguments `t_starts`, `t_ends`, and `sigmas`, the code steps into `packed_info = pack_info(ray_indices, n_rays)` even though `ray_indices` is `None`, and an error occurs.
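The control-flow problem can be reproduced without the library itself. In the sketch below, `is_cub_available` and `pack_info` are stand-ins for the real nerfacc functions (their bodies are my assumption, not the library code); only the guard mirrors the snippet above:

```python
from typing import Optional


def is_cub_available() -> bool:
    # Stub: pretend CUB is unavailable, e.g. a CPU-only install.
    return False


def pack_info(ray_indices, n_rays: Optional[int]):
    # Stand-in: the real pack_info expects a tensor of ray indices;
    # receiving None is what triggers the crash.
    if ray_indices is None:
        raise TypeError("pack_info received ray_indices=None")


def buggy_branch(packed_info=None, ray_indices=None, n_rays=None):
    # Mirrors the guard in render_transmittance_from_density: it only
    # checks packed_info, so it runs even when ray_indices is None.
    if not is_cub_available() and packed_info is None:
        packed_info = pack_info(ray_indices, n_rays)  # crashes here
    return packed_info


try:
    # Only the "required" arguments: no packed_info, no ray_indices.
    buggy_branch()
except TypeError as e:
    print("crash reproduced:", e)
```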
It seems that the latest code has a bug?
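One possible direction for a fix (a sketch only, not an official patch; `should_pack` is a hypothetical helper name) is to gate the conversion on `ray_indices` as well, so the all-rays path with only the required tensors skips `pack_info`:

```python
def should_pack(packed_info, ray_indices) -> bool:
    # Hypothetical helper: converting to packed_info is only needed
    # and possible when no packed_info was given AND ray_indices is
    # present. (Assumes the CPU fallback path, i.e. CUB unavailable.)
    return packed_info is None and ray_indices is not None


# With only the required tensors (no packed_info, no ray_indices),
# the conversion is skipped instead of crashing:
print(should_pack(None, None))       # False: skip pack_info
print(should_pack(None, [0, 0, 1]))  # True: safe to call pack_info
```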