thuiar / OKD-Reading-List

Papers for Open Knowledge Discovery
https://github.com/thuiar/OKD-Reading-List
BSD 3-Clause "New" or "Revised" License

What is the difference between Open Intent Detection and Out-of-domain Detection? #11

Closed wjczf123 closed 2 years ago

wjczf123 commented 2 years ago

Nice work! I have a question: what is the difference between Open Intent Detection and Out-of-domain Detection? It seems that some work [1-4] in Out-of-domain Detection is very related to Open Intent Detection.

There are also two works related to this repository that were published in TASLP [5-6].

  1. GOLD: Improving Out-of-Scope Detection in Dialogues using Data Augmentation
  2. Energy-based Unknown Intent Detection with Data Manipulation
  3. OutFlip: Generating Out-of-Domain Samples for Unknown Intent Detection with Natural Language Attack
  4. Modeling Discriminative Representations for Out-of-Domain Detection with Supervised Contrastive Learning
  5. Learning to Classify Open Intent via Soft Labeling and Manifold Mixup
  6. Towards Textual Out-of-Domain Detection without In-Domain Labels
topDreamer commented 2 years ago

I have the same question and hope to get an answer from the author. Thanks a lot~

topDreamer commented 2 years ago

@HanleiZhang

HanleiZhang commented 2 years ago

Hi, first of all, very sorry for the late reply. That's a very good question. Indeed, out-of-domain detection and open intent detection differ in several aspects:

(1) Out-of-domain (OOD) detection focuses on detecting whether an example is misclassified or out-of-distribution [1], so it is essentially a binary classification task. Open intent detection, however, also needs to distinguish the specific known classes of in-domain (ID) samples.

(2) The evaluation metrics are rather different. OOD detection usually uses AUROC, AUPR, and FPR to evaluate the performance of a binary classifier, while open intent detection needs to evaluate performance on both the known classes and the open class.

(3) Many OOD detection methods need OOD samples during training: some use labeled OOD data [1, 2], while others use pseudo (generated or constructed) OOD data [3, 4, 5]. Open intent detection does not use labeled OOD data, though it may use pseudo OOD data during training.

The paper [6] you mentioned belongs to open intent detection. Thanks for your paper recommendations [5, 6].
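The metric difference in point (2) can be sketched with scikit-learn. This is a toy illustration with made-up labels and scores, not code from this repository: OOD detection scores a binary in/out split with threshold-free metrics, while open intent detection scores a (K+1)-way classification over K known intents plus one open class.

```python
# Toy illustration of the two evaluation protocols (assumes scikit-learn).
import numpy as np
from sklearn.metrics import roc_auc_score, average_precision_score, f1_score

# --- OOD detection: binary evaluation over a per-example OOD score ---
# 1 = out-of-domain, 0 = in-domain; a higher score means "more likely OOD".
ood_labels = np.array([0, 0, 0, 1, 1, 0, 1, 0])
ood_scores = np.array([0.1, 0.2, 0.3, 0.9, 0.8, 0.4, 0.7, 0.2])
auroc = roc_auc_score(ood_labels, ood_scores)           # threshold-free
aupr = average_precision_score(ood_labels, ood_scores)  # area under PR curve

# --- Open intent detection: (K+1)-way evaluation ---
# Known intents are labels 0..2; label 3 is the "open" (unknown) class.
y_true = np.array([0, 1, 2, 3, 3, 0, 1, 3])
y_pred = np.array([0, 1, 2, 3, 0, 0, 1, 3])
macro_f1 = f1_score(y_true, y_pred, average="macro")                 # all classes
f1_known = f1_score(y_true, y_pred, labels=[0, 1, 2], average="macro")  # known only
f1_open = f1_score(y_true, y_pred, labels=[3], average="macro")         # open only
```

Note that the second protocol penalizes confusing one known intent for another, which the binary AUROC/AUPR view cannot capture.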

Finally, we would like to recommend our latest paper, Learning Discriminative Representations and Decision Boundaries for Open Intent Detection, which stresses the difference between OOD detection and open intent detection. Enjoy it!

[1] A Baseline for Detecting Misclassified And Out-of-Distribution Examples in Neural Networks.
[2] GOLD: Improving Out-of-Scope Detection in Dialogues using Data Augmentation.
[3] OutFlip: Generating Out-of-Domain Samples for Unknown Intent Detection with Natural Language Attack.
[4] Modeling Discriminative Representations for Out-of-Domain Detection with Supervised Contrastive Learning.
[5] Towards Textual Out-of-Domain Detection without In-Domain Labels.
[6] Learning to Classify Open Intent via Soft Labeling and Manifold Mixup.

wjczf123 commented 2 years ago

Thanks for your reply. I get it.