Closed doheon114 closed 2 days ago
This loss works for binary and multilabel classification (predicting multiple independent binary variables at once). It shouldn't be confused with the cross-entropy losses, which perform multiclass classification (predicting which single class a sample falls into).
If you're still confused, this post may provide further clarification: https://discuss.pytorch.org/t/using-bcewithlogisloss-for-multi-label-classification/67011
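To make this concrete, here is a minimal sketch (shapes and values are illustrative, not taken from the repository) showing how `binary_cross_entropy_with_logits` handles `num_labels >= 2`: each column of the logits is treated as its own independent binary problem, with a sigmoid applied per logit.

```python
import torch
import torch.nn.functional as F

batch_size, num_labels = 4, 3
# Raw model outputs: one logit per label, no sigmoid applied yet.
logits = torch.randn(batch_size, num_labels)
# Independent 0/1 targets, one per label (multilabel, not one-hot multiclass).
targets = torch.randint(0, 2, (batch_size, num_labels)).float()

# Applies a sigmoid to every logit and computes binary cross-entropy
# element-wise, then averages over all batch x label entries.
loss = F.binary_cross_entropy_with_logits(logits, targets)

# Equivalent manual computation, to show what happens per element:
probs = torch.sigmoid(logits)
manual = -(targets * torch.log(probs)
           + (1 - targets) * torch.log(1 - probs)).mean()
assert torch.allclose(loss, manual, atol=1e-6)
```

So with `num_labels = 3`, the loss is just the average of three independent binary classification losses, which is why it works regardless of how many labels you have.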
Ah, so you're performing binary classification independently for each label. Thank you!
Hello, I'm finding the GitHub repository you provided very useful. However, I have a question. I noticed that you use the criterion binary_cross_entropy_with_logits quite often, but as far as I know, this criterion is generally used only for binary classification. Yet in your code it works even when num_labels is set to 2 or more. Could you explain how this works? Thank you!