Open basimhennawi opened 4 years ago
I have faced this issue too. I don't know the solution, but I ran it in an Ubuntu 16.04 Docker container with the reduced model and it worked. I still don't know the reason!
Hi @basimhennawi and @giriannamalai, I'm trying to use multi-label classification as well and wanted to know if you had figured out the problem. Do the class probabilities add up to 1 even when you use loss=ova? Could you share the command you used to train the classifier?
So, is there any solution to this issue?
+1
Hi, I followed the Multi-Label Classification documentation from fastText to apply it to my free-text dataset, which looks like this after processing/labelling:
I set up a notebook instance on AWS SageMaker and trained the model. For simplicity, let's say there are 5 labels (choice, fast-delivery, good-prices, bad-prices, nothing). The problem is that when I predict some text with k set to -1 to get all labels, the label probabilities always sum to 100%, for example:
I expect something like:
and then I could set the threshold to greater than 50% so that only the 2 matching labels (choice and fast-delivery) are returned,
instead I got something like:
which means that if a text genuinely matches all 5 labels, each would get 20%, and all of them would be dismissed by the threshold.
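As a side note, this behavior is what you would expect from a softmax output, where label probabilities compete and must sum to 1, versus a one-vs-all setup, where each label gets an independent sigmoid probability. The sketch below illustrates the difference with made-up scores for the 5 labels (it is not fastText's internal code, just the math):

```python
import math

# Hypothetical raw scores for the 5 labels (invented for illustration).
scores = {"choice": 2.0, "fast-delivery": 1.8, "good-prices": -1.0,
          "bad-prices": -1.5, "nothing": -2.0}

def softmax(vals):
    """Normalize scores so they compete and sum to 1."""
    exps = [math.exp(v) for v in vals]
    total = sum(exps)
    return [e / total for e in exps]

def sigmoid(v):
    """Independent per-label probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-v))

# Softmax: probabilities always sum to 1, so 5 strong labels
# get ~20% each and a 50% threshold dismisses all of them.
p_softmax = softmax(scores.values())
print(round(sum(p_softmax), 6))  # 1.0

# One-vs-all: each label scored independently, so several labels
# can clear a 0.5 threshold at once.
p_ova = {label: sigmoid(s) for label, s in scores.items()}
matched = [label for label, p in p_ova.items() if p > 0.5]
print(matched)  # ['choice', 'fast-delivery']
```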
N.B.: in the documentation's example they got the output as expected, but following the same docs I don't get that behavior:
The question is: how can I achieve the expected output, either within fastText or with some other tool? Are there parameters to change or add?
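If it helps anyone landing here: the summed-to-1 probabilities suggest the model was trained with the default softmax loss. A sketch of retraining with the one-vs-all loss and predicting all labels above a threshold, using the fastText CLI (file names `train.txt` and `model` are placeholders):

```shell
# Train with one-vs-all loss so each label gets an independent probability.
./fasttext supervised -input train.txt -output model -loss ova

# Predict all labels (k = -1) with probability above 0.5, reading from stdin.
echo "fast delivery and lots of choice" | \
  ./fasttext predict-prob model.bin - -1 0.5
```

With `-loss ova`, the per-label probabilities no longer sum to 1, so a >50% threshold can match several labels at once.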
Thanks in advance!