Open GuokaiLiu opened 3 years ago
Hi, thanks for your interest in our work.
Let me know if you have any other questions.
Thanks for your timely response : )
For example, if p = max(softmax(logits)) and the threshold is v, then when p < v we let a human expert intervene? Is that right?
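If that reading is right, the deferral rule can be sketched as follows (the threshold value and the function names are made up for illustration):

```python
import numpy as np

def softmax(logits):
    # numerically stable softmax
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

def needs_human_review(logits, v=0.9):
    # p = max(softmax(logits)); defer to a human expert when p < v
    p = np.max(softmax(logits))
    return p < v

# one dominant logit -> high confidence, handled by the model
print(needs_human_review(np.array([8.0, 0.5, 0.2]), v=0.9))  # False
# near-uniform logits -> low confidence, defer to a human
print(needs_human_review(np.array([1.0, 0.9, 1.1]), v=0.9))  # True
```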
Thanks for sharing this reference. I copy the contents below for those who may have the same question. I'm not sure whether this metric is debatable: it seems to assume that all in-domain examples achieve higher softmax outputs than out-of-domain examples, but this assumption may not hold in practice.
Detection accuracy. This metric corresponds to the maximum classification probability over all possible thresholds δ: 1 − min_δ { P_in(q(x) ≤ δ) P(x is from P_in) + P_out(q(x) > δ) P(x is from P_out) }, where q(x) is a confidence score such as the maximum value of the softmax. We assume that both positive and negative examples have equal probability of appearing in the test set, i.e., P(x is from P_in) = P(x is from P_out) = 0.5.
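For anyone who wants to compute this metric, here is a minimal sketch of the threshold sweep it describes (the function name and toy scores are my own, not from the repo):

```python
import numpy as np

def detection_accuracy(q_in, q_out):
    """1 - min over delta of 0.5*P_in(q <= delta) + 0.5*P_out(q > delta),
    assuming equal priors on in- and out-of-distribution data."""
    candidates = np.unique(np.concatenate([q_in, q_out]))
    # also try a threshold below every score (classify everything as in-domain)
    candidates = np.concatenate([[candidates[0] - 1.0], candidates])
    errors = [0.5 * np.mean(q_in <= d) + 0.5 * np.mean(q_out > d)
              for d in candidates]
    return 1.0 - min(errors)

# toy confidence scores: in-domain higher than OOD -> perfectly separable
q_in = np.array([0.80, 0.90, 0.95, 0.99])
q_out = np.array([0.40, 0.50, 0.60, 0.70])
print(detection_accuracy(q_in, q_out))  # 1.0
```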
I just figured it out in the code. 👍
Thank you : )
Hi Lingkai,
Thanks for sharing the great code for this fantastic paper. I'd like to ask a few questions if possible.
After training SDE-Net, how can we use it to identify uncertain examples, especially from unseen classes?
For the 'mis' task, the detection error is easy to understand. For the 'OOD' task, a trained SDE-Net outputs softmax values (without any ground-truth labels) for both in-domain and OOD examples, and I'm confused about how detection accuracy is computed there. Is there a mathematical expression or description of OOD detection accuracy?
In Figure 3, both ID data and OOD data run through the f-net and the g-net, merge with each other, and finally produce the predictions. That makes sense for the training process. However, the g-net seems unused in the test process? If we want the model to evaluate its confidence/uncertainty for a specific example, should we use the g-net and set a threshold?
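For what it's worth, the "g-net as a test-time uncertainty score" idea in the question above could be sketched like this (everything here, including the random linear stand-ins for the f- and g-nets and the threshold v, is a hypothetical illustration, not the paper's actual procedure):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for SDE-Net's drift (f) and diffusion (g) networks;
# the real networks are learned, these are random linear maps for illustration.
W_f = rng.normal(size=(4, 3))   # "f-net": features -> class logits
w_g = rng.normal(size=(4,))     # "g-net": features -> diffusion score

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict_with_uncertainty(x, v=0.5):
    # the f-net gives the class prediction; the g-net's sigmoid output is
    # read as an uncertainty score and thresholded at v
    logits = x @ W_f
    uncertainty = sigmoid(x @ w_g)
    pred = logits.argmax(axis=-1)
    reject = uncertainty > v  # flag for human review / treat as OOD
    return pred, uncertainty, reject

x = rng.normal(size=(2, 4))
pred, unc, reject = predict_with_uncertainty(x)
print(pred.shape, unc.shape, reject.shape)  # (2,) (2,) (2,)
```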
Some other questions may come up later. Sincere thanks for your kind help : )