Closed ljw919 closed 1 year ago
I want to know: does your label mask only contain 0 or 1, like a one-hot encoding? And can we integrate a cross-entropy loss into the model,
so that through the reverse diffusion process we get a probability mask (values in 0-1), and then use torch.argmax to recover the label?
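For reference, here is a minimal sketch of what that would look like in PyTorch: per-pixel class logits go through a cross-entropy loss during training, and at inference a softmax gives a probability mask in [0, 1] that torch.argmax collapses into a hard label map. All shapes and names here are illustrative assumptions, not the repo's actual code.

```python
import torch
import torch.nn.functional as F

# Hypothetical per-pixel logits from the denoising network.
# Shape: (batch, num_classes, H, W). Values are random placeholders.
batch, num_classes, H, W = 2, 3, 8, 8
logits = torch.randn(batch, num_classes, H, W)

# Ground-truth label map with integer class indices, shape (batch, H, W).
target = torch.randint(0, num_classes, (batch, H, W))

# Cross-entropy loss over the class dimension (dim=1).
loss = F.cross_entropy(logits, target)

# Probability mask in [0, 1] via softmax, then argmax for hard labels.
probs = logits.softmax(dim=1)       # (batch, num_classes, H, W)
pred_labels = probs.argmax(dim=1)   # (batch, H, W), integer class ids
```

Note that `F.cross_entropy` expects raw logits and integer targets; it applies log-softmax internally, so the softmax here is only needed to inspect the probability mask.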
Yes, the label mask only contains 0 and 1.