wzh9969 / HPT

This repository implements a prompt tuning model for hierarchical text classification. This work has been accepted as the long paper "HPT: Hierarchy-aware Prompt Tuning for Hierarchical Text Classification" in EMNLP 2022.
MIT License

Request for code source for BERT+Hard Prompt / soft prompt #6

Closed OmkarModi closed 1 year ago

OmkarModi commented 1 year ago

Hello, I am carrying out a few experiments on extreme multi-label classification. It would be really helpful if you could provide the source code, or a detailed idea of how HTC can be carried out as a flat classification task.

Thank you

wzh9969 commented 1 year ago

Both baselines can be implemented easily with our code. Modify how the template is formed and the desired output labels in lines 111-138 of train.py, as well as the loss function at line 210 of models/prompt.py. Then pass --graph "" as an extra training argument to remove the graph encoder. Once everything related to the hierarchy is removed, the HTC problem reduces to flat classification. Applying prompt methods to multi-label classification is similar to simple multi-class classification, except that we make a multi-label prediction at the [MASK] position.
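The core idea described above — scoring every label independently at the [MASK] position and training with a per-label binary loss instead of softmax — could be sketched roughly as follows. This is a minimal illustration, not the repository's actual code; the class and function names (`FlatPromptHead`, `flat_multilabel_loss`) and the learnable label-embedding design are assumptions for the example.

```python
import torch
import torch.nn as nn

class FlatPromptHead(nn.Module):
    """Hypothetical sketch: multi-label prediction at the [MASK] position."""

    def __init__(self, hidden_size: int, num_labels: int):
        super().__init__()
        # One learnable "label word" embedding per label (assumed design).
        self.label_emb = nn.Parameter(torch.randn(num_labels, hidden_size) * 0.02)

    def forward(self, hidden_states: torch.Tensor, mask_positions: torch.Tensor) -> torch.Tensor:
        # hidden_states: (batch, seq_len, hidden) from the encoder
        # mask_positions: (batch,) index of the [MASK] token in each sequence
        batch_idx = torch.arange(hidden_states.size(0))
        mask_hidden = hidden_states[batch_idx, mask_positions]  # (batch, hidden)
        # Score every label independently -> multi-label logits.
        return mask_hidden @ self.label_emb.t()  # (batch, num_labels)

def flat_multilabel_loss(logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
    # Binary cross-entropy per label replaces the single-label softmax,
    # so any subset of labels can be predicted at the [MASK] spot.
    return nn.functional.binary_cross_entropy_with_logits(logits, targets.float())
```

At inference, labels whose sigmoid score exceeds a chosen threshold (e.g. 0.5) would be predicted, rather than taking an argmax as in multi-class prompting.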

OmkarModi commented 1 year ago

Your prompt response and assistance were greatly appreciated, and I'm very thankful for your help!