Closed alqurri77 closed 2 years ago
Hi,
Thanks for the acknowledgement.
There is nothing wrong here. In TensorFlow/Keras we can apply multiple loss functions to multiple outputs.
Basically, each loss function in the list (`loss=[iou_loss, bce_loss, ssim_loss]`, line 288) is applied to a model output.
Search for how to apply multiple loss functions to multiple outputs, and you will understand the flow.
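To make the flow concrete, here is a toy sketch (plain Python, not Keras source) of what happens when `loss` is a list in `model.compile`: each loss function is paired with a model output, each pair produces a scalar, and the scalars are summed into the total training loss. The `mse` stand-in and all values below are illustrative, not from the repository.

```python
# Toy stand-in for a per-output loss such as iou_loss / bce_loss / ssim_loss.
def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def total_loss(losses, y_trues, y_preds):
    """Sketch of Keras behavior: apply one loss per output,
    then sum the per-output loss values into one scalar."""
    return sum(fn(t, p) for fn, t, p in zip(losses, y_trues, y_preds))

# three outputs, three targets, one loss function per output
losses = [mse, mse, mse]
y_true = [[1.0], [2.0], [3.0]]
y_pred = [[1.5], [2.0], [2.0]]
print(total_loss(losses, y_true, y_pred))  # 0.25 + 0.0 + 1.0 = 1.25
```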
Thank you... just one more question: is there a difference between this and averaging in the model like this:
nestnet_output_all = keras.layers.Average()([d1, d2, d3, d4, e5])
Assuming d1 is the loss calculated at stage 1, and so on... This is still different: Keras takes the sum of all the loss values, whereas you are taking the average, so the result will differ.
Thank you ... but does it practically make a difference? In both cases it will just minimize the output.
It does. It changes the magnitude of the loss value, which is then propagated into the network via backpropagation. That is why researchers use different weight values to increase or decrease the effect of particular loss terms.
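The magnitude point can be shown with plain numbers: averaging instead of summing rescales the total (and therefore every gradient) by 1/n, and per-loss weights change the relative influence of each term. In Keras such weights would be passed via the `loss_weights` argument of `model.compile`; the values below are hypothetical.

```python
# Illustration only (not Keras): per-output loss values for a 3-output model.
losses = [0.25, 0.0, 1.0]

total_sum = sum(losses)                 # Keras default behavior: 1.25
total_avg = sum(losses) / len(losses)   # Average()-style combination

# averaging is just the sum scaled by 1/n, so gradients shrink by 1/n too
assert abs(total_avg - total_sum / len(losses)) < 1e-12

# hypothetical loss_weights: down-weight the third loss to halve its effect
weights = [1.0, 1.0, 0.5]
total_weighted = sum(w * l for w, l in zip(weights, losses))
print(total_sum, total_avg, total_weighted)  # 1.25, ~0.4167, 0.75
```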
Hi;
First, thank you very much for implementing this in TensorFlow. I am a bit confused: I notice the model returns 7 outputs, as expected:
model = models.Model(inputs=[x_in], outputs=[d_stage_1, d_stage_2, d_stage_3, d_stage_4, d_stage_5, d_stage_6, bridge])
However, the loss function takes only 2 inputs; I was expecting 7:
def ssim_loss(y_true, y_pred):
Because in the BASNet implementation in PyTorch:
I think I'm missing something here.
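A sketch of why the 2-argument signature is enough, assuming standard Keras behavior: the framework calls the loss once per (target, output) pair, so the function never needs to see all 7 outputs at once. `ssim_like_loss` below is a toy stand-in, not the repository's real `ssim_loss`.

```python
import numpy as np

def ssim_like_loss(y_true, y_pred):
    # toy stand-in for ssim_loss; real SSIM is more involved
    return float(np.mean(np.abs(y_true - y_pred)))

# three model outputs and their matching ground-truth targets
outputs = [np.array([0.2]), np.array([0.7]), np.array([0.9])]
targets = [np.array([0.0]), np.array([1.0]), np.array([1.0])]

# Keras-style dispatch: the 2-argument loss is invoked once per pair
per_output = [ssim_like_loss(t, p) for t, p in zip(targets, outputs)]
total = sum(per_output)  # and the per-output values are summed
print(per_output, total)
```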