I am wondering how to use the sparse_categorical_crossentropy loss function and how to set the loss_shape parameter properly.
My model is defined as follows:
from keras.layers import Input, Conv2D, Reshape, Activation
from keras.models import Model

input_shape = [128, 128]
nb_labels = 3

def get_model(input_shape=input_shape, nb_labels=3):
    nb_rows, nb_cols = input_shape
    inputs = Input((nb_rows, nb_cols, 3))
    ...
    ...
    ...
    conv10 = Conv2D(nb_labels, (1, 1), activation='linear')(conv9_2)  # per-pixel class scores
    x = Reshape((nb_rows * nb_cols, nb_labels))(conv10)  # flatten spatial dims: one row per pixel
    x = Activation('softmax')(x)  # softmax over the nb_labels classes for each pixel
    outputs = x
    model = Model(inputs=inputs, outputs=outputs)
    return model
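For context, here is a minimal standalone sketch of the shapes I think are involved (assuming Keras with the TensorFlow backend; the random y_pred/y_true tensors are only placeholders for checking shapes, not meaningful data):

# My assumption: the model above outputs (batch, nb_rows * nb_cols, nb_labels)
# softmax scores, so sparse labels would need one integer class index per
# pixel, i.e. shape (batch, nb_rows * nb_cols, 1).
import numpy as np
from keras import backend as K

nb_rows, nb_cols, nb_labels = 128, 128, 3
y_pred = K.constant(np.random.rand(2, nb_rows * nb_cols, nb_labels))
y_true = K.constant(np.random.randint(0, nb_labels, size=(2, nb_rows * nb_cols, 1)))

loss = K.sparse_categorical_crossentropy(y_true, y_pred)
print(K.eval(loss).shape)  # one loss value per pixel: (2, nb_rows * nb_cols)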
In the main function, I set:
loss_fn = binary_crossentropy_with_logits
#loss_shape = (target_size[0] * target_size[1] * nb_labels,) # does not work!?
loss_shape = (target_size[0] * target_size[1], 1) # does work
Is that setting correct? Why does loss_shape = (target_size[0] * target_size[1] * nb_labels,) not work?
Usually, when I use the sparse_categorical_crossentropy loss, I prepare the masks with y_train = y_train.reshape((-1, 1)) so that there is one integer class index per pixel, but I am not sure where to add this in the SegDataGenerator. Any help is appreciated.
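My guess is that (target_size[0] * target_size[1] * nb_labels,) would only match flattened one-hot targets (as used with categorical_crossentropy), while sparse labels are a single index per pixel, but I would like to confirm that. As for where the reshape would go, here is roughly what I have in mind: a hypothetical wrapper around whatever (images, masks) batches the generator yields. The reshape_labels helper is just something I made up for illustration; it is not part of SegDataGenerator's API.

# Hypothetical helper (not SegDataGenerator's API): wrap any generator that
# yields (images, integer masks) so each mask of shape (batch, nb_rows, nb_cols)
# becomes (batch, nb_rows * nb_cols, 1), matching
# loss_shape = (target_size[0] * target_size[1], 1).
def reshape_labels(generator, nb_rows, nb_cols):
    for x_batch, y_batch in generator:
        yield x_batch, y_batch.reshape((y_batch.shape[0], nb_rows * nb_cols, 1))

In other words, I would pass reshape_labels(train_generator, *target_size) to fit_generator instead of the raw generator, if that is the right approach.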