yanxian-ll / GS-SR

GS-SR: Gaussian Splatting for Surface Reconstruction

Loss = NaN error during training #3

Open LaFeuilleMorte opened 3 weeks ago

LaFeuilleMorte commented 3 weeks ago

Hi, I'm using your octree-pgsr to train my dataset, but I encountered a loss = NaN error during training. My config was:

```
python train.py octree-pgsr \
    --source-path {My_DATASET} --output-path {OUTPUT} \
    --scene.dataloader.resolution 1 \
    --scene.dataloader.resolution-scales 1 2 4 8 \
    --scene.dataloader.device "cpu" \
    --trainer.iterations 60000 \
    --trainer.save-iterations 10000 20000 30000 40000 50000 60000 \
    --trainer.test-iterations 10000 20000 30000 40000 50000 60000
```
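For anyone else hitting the same error, one way to localize the NaN is to guard the loss before the backward pass. This is only a sketch against a generic PyTorch training loop, not GS-SR's actual trainer; `training_step`, `model`, `optimizer`, and `batch` below are placeholder names:

```
import torch

def training_step(model, optimizer, batch, iteration):
    """One generic training step with a NaN guard (illustrative placeholder, not GS-SR code)."""
    optimizer.zero_grad(set_to_none=True)
    loss = model(batch)  # placeholder for the total training loss

    # Fail fast on the first non-finite loss so the bad iteration is easy to identify.
    if not torch.isfinite(loss):
        raise RuntimeError(f"Non-finite loss {loss.item()} at iteration {iteration}")

    loss.backward()
    optimizer.step()
    return loss.item()
```

Enabling `torch.autograd.set_detect_anomaly(True)` additionally reports which backward operation first produced the NaN, at the cost of noticeably slower training.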

yanxian-ll commented 3 weeks ago

I am having difficulty reproducing this issue. When I ran the experiment with the same commands as you, I did not encounter the loss = NaN error. Could you provide a subset of the data you used?

LaFeuilleMorte commented 3 weeks ago

[Uploading livingroom_3_ds10_sequential.zip…]() Here's my subdataset.

yanxian-ll commented 3 weeks ago

The download link doesn't seem to work.

LaFeuilleMorte commented 3 weeks ago

Emm, I had some problems uploading the file. Maybe you can send me your email so I can email it to you.

yanxian-ll commented 3 weeks ago

Here is my email: yanxiancsu@csu.edu.cn