wzhshi / SCSNet

The training code and some pre-trained models for my CVPR 2019 paper "Scalable Convolutional Neural Network for Image Compressed Sensing".

How does the training work? #3

Open stephenllh opened 3 years ago

stephenllh commented 3 years ago

I want to ask a question to clarify certain training details. In the paper, you mention: "Suppose the reconstruction network has T initial reconstructions and T final reconstructions, we have 2T objectives to minimize." Does that mean the "initial reconstruction network" and the "deep reconstruction network" are trained with separate MSE losses, even though the input of the deep network is the output of the initial one?

wzhshi commented 3 years ago

They are trained jointly. Please see my code for more details.
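A minimal, framework-free sketch of what "trained jointly" means here: the 2T MSE objectives (T initial reconstructions plus T final reconstructions) are summed into a single scalar loss that is minimized as one objective. The function and variable names below are illustrative, not taken from the author's code, and the reconstructions are reduced to scalars to keep the sketch self-contained.

```python
def joint_loss(initial_recons, final_recons, target):
    """Sum the 2T MSE objectives (T initial + T final) into one scalar.

    `initial_recons` and `final_recons` are hypothetical lists of T
    reconstructions each; scalars stand in for images here.
    """
    def mse(x, t):
        return (x - t) ** 2

    total = 0.0
    for x_init, x_final in zip(initial_recons, final_recons):
        total += mse(x_init, target) + mse(x_final, target)
    return total

# Example with T = 3 scalar "reconstructions" against target 1.0
loss = joint_loss([0.9, 0.8, 0.7], [0.95, 0.9, 0.85], 1.0)
```

Because all 2T terms are summed before backpropagation, a single optimizer step updates every sub-network at once, rather than training the initial and deep stages in separate phases.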

stephenllh commented 3 years ago

Then does the gradient of the "deep reconstruction network" flow to the "initial reconstruction block"?

wzhshi commented 3 years ago

Yes. When optimizing the sampling network and the initial reconstruction network, the gradient flowing back from the deep reconstruction network's loss is accumulated with the gradient of the initial reconstruction network's own loss.
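A toy illustration of this accumulation, with hand-computed gradients on a two-stage scalar model. All names and numbers are made up for the sketch; the point is only that the initial stage's parameter `w1` receives gradient contributions from both losses, because the deep stage consumes the initial stage's output.

```python
# Toy two-stage pipeline: initial reconstruction feeds the deep stage.
s, t = 1.0, 1.0          # measurement and ground truth (scalars)
w1, w2 = 0.5, 0.8        # initial-stage and deep-stage parameters

x1 = w1 * s              # initial reconstruction
x2 = w2 * x1             # deep reconstruction (consumes x1)

# d/dw1 of (x1 - t)^2: gradient from the initial loss alone
grad_from_initial = 2 * (x1 - t) * s
# d/dw1 of (x2 - t)^2: gradient flowing back through the deep stage
grad_from_deep = 2 * (x2 - t) * w2 * s
# Joint training accumulates both contributions on w1
grad_w1 = grad_from_initial + grad_from_deep
```

In an autodiff framework, calling backward on the summed loss performs exactly this accumulation automatically, since gradients on a shared parameter add up across the terms that depend on it.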

stephenllh commented 3 years ago

Thank you for taking the time to reply. It is very helpful.