Hey,
Given the tensors y_true and y_pred with shape [1, 256, 256, 1], class_loglosses is a tensor of shape [1], so class_weights has no effect when the following line executes: weighted_bce = K.sum(class_loglosses * K.constant(class_weights))
I wonder if the weighted loss is mistaken here. Or could it be replaced with K.weighted_binary_crossentropy with pos_weight for the positive samples? Any advice?
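For concreteness, here is a minimal numpy sketch of a per-pixel positive-weighted BCE, following the pos_weight formula that tf.nn.weighted_cross_entropy_with_logits uses, but applied to probabilities rather than logits. The function name, the eps clipping value, and the example tensors are all illustrative assumptions, not code from the original loss:

```python
import numpy as np

def weighted_bce(y_true, y_pred, pos_weight=1.0, eps=1e-7):
    """Per-pixel binary cross-entropy with a weight on positive pixels.

    Sketch of the pos_weight idea (cf. tf.nn.weighted_cross_entropy_with_logits),
    applied here to probabilities y_pred in (0, 1). pos_weight > 1 penalizes
    missed positives more heavily; pos_weight and eps are illustrative choices.
    """
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    # Positive pixels are scaled by pos_weight; negatives keep weight 1,
    # so the weighting happens per pixel, before any reduction to shape [1].
    per_pixel = -(pos_weight * y_true * np.log(y_pred)
                  + (1.0 - y_true) * np.log(1.0 - y_pred))
    # Reduce over all pixels, e.g. a [1, 256, 256, 1] mask.
    return per_pixel.mean()

# Toy mask: one positive pixel, uniform prediction of 0.3.
y_true = np.zeros((1, 4, 4, 1))
y_true[0, 0, 0, 0] = 1.0
y_pred = np.full((1, 4, 4, 1), 0.3)

plain = weighted_bce(y_true, y_pred, pos_weight=1.0)   # ordinary BCE
heavy = weighted_bce(y_true, y_pred, pos_weight=5.0)   # positives weighted 5x
```

With pos_weight=1.0 this reduces to plain BCE; raising pos_weight increases the loss here because the single positive pixel is under-predicted. The key contrast with the quoted line is that the weight multiplies the per-pixel log-loss before reduction, instead of scaling an already-reduced shape-[1] tensor.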