Hi Dr. Yue,
I am new to this area and have completed the previous steps. When running your 'train_gaussianhead.py', this error occurs: "CUDA error: an illegal memory access was encountered". I then localized the problem to "CameraModule.py", line 189, in render_gaussian.
Next I checked the inputs of render_gaussian using
"""
print(f"\nmeans3D shape: {means3D.shape}, means2D shape: {means2D.shape}")
print(f"\ncolors_precomp shape: {colors_precomp[b].shape}, opacities shape: {opacity[b].shape}")
print(f"\nscales shape: {scales[b].shape}, rotations shape: {rotations[b].shape}")
"""
and found this:
"""
means3D shape: torch.Size([143961, 3]), means2D shape: torch.Size([143961, 3])
colors_precomp shape: torch.Size([143961, 32]), opacities shape: torch.Size([143961, 1])
scales shape: torch.Size([143961, 3]), rotations shape: torch.Size([143961, 4])
means3D sample: tensor([[ 0.0186, -0.0408, -0.1247],
        [ 0.0231, -0.0348, -0.1314]], device='cuda:0', grad_fn=)
means2D sample: tensor([[0., 0., 0.],
        [0., 0., 0.]], device='cuda:0', grad_fn=)
colors_precomp sample: tensor([[nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan],
        [nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]], device='cuda:0', grad_fn=)
opacities sample: tensor([[nan],
        [nan]], device='cuda:0', grad_fn=)
scales sample: tensor([[nan, nan, nan],
        [nan, nan, nan]], device='cuda:0', grad_fn=)
rotations sample: tensor([[0., nan, nan, nan],
        [0., nan, nan, nan]], device='cuda:0', grad_fn=)
"""
There are lots of NaN values. Is this normal?
Could you please help me with a solution? Thanks a lot.
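In case it is useful for debugging: a small check like the one below could be run on each input right before the render_gaussian call, so the first non-finite tensor is reported with a clear Python error instead of crashing inside the CUDA rasterizer. This is just a sketch; the variable names in the commented usage (`means3D`, `scales`, etc.) are the ones from my print statements above, not part of the repo.

```python
import torch

def assert_finite(name, t):
    """Raise a descriptive error if a tensor contains NaN or Inf values."""
    bad = ~torch.isfinite(t)
    if bad.any():
        raise ValueError(
            f"{name}: {int(bad.sum())} non-finite values, shape {tuple(t.shape)}"
        )

# Hypothetical usage just before render_gaussian:
# for name, t in [("means3D", means3D), ("scales", scales[b]),
#                 ("rotations", rotations[b]), ("opacity", opacity[b]),
#                 ("colors_precomp", colors_precomp[b])]:
#     assert_finite(name, t)
```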
I met the same error. Training meshhead again, or rolling back to an earlier meshhead checkpoint, worked for me.
I have no idea about the cause of this error; it happens sometimes.
:D
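If it comes back, one way to find which operation first produces the NaNs is PyTorch's built-in anomaly detection. A minimal, self-contained sketch (not from this repo; the `sqrt` of a negative number just stands in for whatever op goes bad during training):

```python
import torch

# Anomaly mode makes backward() raise on the first op whose gradient
# turns NaN, with a traceback pointing at the forward call that produced
# it. It is slow, so enable it only while debugging.
with torch.autograd.detect_anomaly():
    x = torch.tensor([-1.0], requires_grad=True)
    y = torch.sqrt(x)          # sqrt of a negative value -> NaN
    try:
        y.sum().backward()     # raises, naming SqrtBackward as the culprit
        caught = ""
    except RuntimeError as e:
        caught = str(e)
```

Wrapping one training iteration in `torch.autograd.detect_anomaly()` should point at the layer that first emits NaN gradients.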