wzh9969 / HPT

This repository implements a prompt tuning model for hierarchical text classification. This work has been accepted as the long paper "HPT: Hierarchy-aware Prompt Tuning for Hierarchical Text Classification" in EMNLP 2022.
MIT License

Problem #4

Closed wjczf123 closed 1 year ago

wjczf123 commented 1 year ago

Hi, zihan.

I have a question about this paper. For the verbalizer, you create a learnable virtual label word for each label. In this way, the method is very much like a traditional MLM task. Is my understanding correct?

wzh9969 commented 1 year ago

Yes, that's correct. The key idea of prompt tuning is to cast the downstream task in the same form as the pretraining task, which is MLM in the case of BERT.
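To make the analogy concrete, here is a minimal, dependency-free sketch of this idea (all names and values are illustrative, not taken from the HPT codebase): each label gets a learnable "virtual label word" embedding, and the hidden state at the [MASK] position is scored against those embeddings by dot product, exactly as the MLM head scores it against real vocabulary embeddings, only restricted to the label words.

```python
import math
import random

random.seed(0)
HIDDEN = 8                                   # toy encoder hidden size
LABELS = ["sports", "politics", "science"]   # toy label set

# One learnable embedding per label; in practice these are trained
# jointly with the prompt by gradient descent.
label_embeddings = {
    lab: [random.gauss(0, 0.1) for _ in range(HIDDEN)] for lab in LABELS
}

def verbalizer_logits(mask_hidden):
    """Dot product of the [MASK] hidden state with each virtual label
    word, mirroring the MLM vocab projection restricted to labels."""
    return {
        lab: sum(h * e for h, e in zip(mask_hidden, emb))
        for lab, emb in label_embeddings.items()
    }

def predict(mask_hidden):
    """Softmax over the label words only, not the full vocabulary."""
    logits = verbalizer_logits(mask_hidden)
    m = max(logits.values())
    exps = {lab: math.exp(v - m) for lab, v in logits.items()}
    z = sum(exps.values())
    probs = {lab: e / z for lab, e in exps.items()}
    return max(probs, key=probs.get), probs

# A stand-in for the encoder's [MASK] hidden state.
mask_hidden = [random.gauss(0, 1.0) for _ in range(HIDDEN)]
pred, probs = predict(mask_hidden)
```

The only difference from plain MLM is that the output embeddings are freshly initialized per label rather than tied to existing vocabulary tokens, so the verbalizer can be trained without hand-picking label words.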

wjczf123 commented 1 year ago

Thank you for your reply.