ForMyCat / SparseGS

Codebase for SparseGS paper
Other

About reg_loss #11

Closed gotoHappy closed 5 months ago

gotoHappy commented 5 months ago
if pick_warp_cam:
    reg_Ll1 = mask_l1_loss(warp_image, reg_gt_image, reg_mask)
    reg_loss = (1.0 - opt.lambda_dssim) * reg_Ll1 + opt.lambda_dssim * (1.0 - ssim(warp_image, reg_gt_image))
    loss += dataset.lambda_reg * reg_loss
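
For context, a minimal sketch of what a masked L1 term like `mask_l1_loss` could compute (this is my hypothetical stand-in for illustration, not the repo's actual helper; it averages the absolute error over masked-in pixels only):

```python
import numpy as np

def mask_l1_loss(pred, gt, mask):
    # Hypothetical sketch: mean absolute error over masked pixels only.
    # pred, gt: (H, W, 3) float arrays; mask: (H, W) boolean array.
    diff = np.abs(pred - gt)    # per-pixel absolute error
    masked = diff[mask]         # keep only valid (masked-in) pixels
    return masked.mean() if masked.size else 0.0

# The result would then be blended with an SSIM term as in the snippet above:
# reg_loss = (1 - lambda_dssim) * reg_Ll1 + lambda_dssim * (1 - ssim(...))
```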

Thank you for your work! I noticed that there is a 'reg_loss' in the code you provided in 'train.py', but I couldn't find a corresponding explanation in your paper (perhaps I didn't read it carefully enough). Could you please explain this loss function to me?

Moreover, the following output results seem to be related to it. Can you explain these results?

Warping 0 to -0.025 dp min: 2.99281 dp max: 39.00119 [13/06 00:33:59]
Warping 1 to 0.975 dp min: 2.12648 dp max: 31.35087 [13/06 00:34:45]
Warping 2 to 1.975 dp min: 2.1191 dp max: 17.63157 [13/06 00:35:31]
...
ForMyCat commented 5 months ago

Sorry for the naming. reg_loss is the image reprojection loss mentioned in the paper.

gotoHappy commented 5 months ago

Ah, I see!

Warping 0 to -0.025 dp min: 2.99281 dp max: 39.00119 [13/06 00:33:59]
Warping 1 to 0.975 dp min: 2.12648 dp max: 31.35087 [13/06 00:34:45]
Warping 2 to 1.975 dp min: 2.1191 dp max: 17.63157 [13/06 00:35:31]
...

Can you explain these outputs during the training process? Generating them seems to take a lot of time, and BaseGS does not produce them. Thank you very much for your patient answers.

ForMyCat commented 5 months ago

"Warping 0 to -0.025 dp min: 2.99281 dp max: 39.00119" means warping the image from camera ID 0 around the average up axis by 2.5 degrees to generate a pseudo training image with camera ID -0.025. dp_min and dp_max are the minimum and maximum rendered depth values from camera 0, in COLMAP units. Basically, we scale the monocular depth estimates with these rendered min/max values to align them with COLMAP units, then warp the training images using the scaled depth maps to generate pseudo ground-truth images at the target camera poses.
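
In spirit, the min/max alignment step could look like the following simple linear rescaling (a sketch under my own assumptions; the repo's actual scaling scheme may differ):

```python
import numpy as np

def align_mono_depth(mono_depth, dp_min, dp_max):
    # Hypothetical sketch: linearly rescale a monocular depth map so its
    # range matches the rendered min/max depths (dp_min, dp_max) from the
    # Gaussian model, i.e. bring it into COLMAP units before warping.
    d_lo, d_hi = mono_depth.min(), mono_depth.max()
    normalized = (mono_depth - d_lo) / (d_hi - d_lo)  # map to [0, 1]
    return dp_min + normalized * (dp_max - dp_min)    # map to [dp_min, dp_max]
```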

It is slow because this process is implemented with NumPy (on the CPU). We may switch to PyTorch and add GPU support in a future revision. If it takes too long for you, you can disable it by setting lambda_reg = 0. Thank you for your interest in our paper! If it helps with your project, please consider citing us. :-)


gotoHappy commented 5 months ago

Thanks for your answer. I had this question because I hadn't seen your updated paper. Now I understand!