WenZhihao666 / G2P2


Augmenting Low-Resource Text Classification with Graph-Grounded Pre-training and Prompting

This repository provides the implementation of the G2P2 model, i.e., the source code for the SIGIR 2023 paper "Augmenting Low-Resource Text Classification with Graph-Grounded Pre-training and Prompting".

The repository is used as follows:

For pre-training:

On the Cora dataset:

python main_train.py 

On the Amazon datasets:

python main_train_amazon.py
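
As a rough illustration of what the pre-training stage optimizes, below is a minimal sketch of a graph-text contrastive (InfoNCE-style) objective that aligns each node's graph embedding with the embedding of its associated text. The function names, shapes, and temperature here are illustrative assumptions; main_train.py and main_train_amazon.py remain the reference implementation.

import torch
import torch.nn.functional as F

def graph_text_contrastive_loss(text_emb, node_emb, temperature=0.07):
    # text_emb: [B, D] embeddings of node texts from the text encoder (assumed shape).
    # node_emb: [B, D] embeddings of the same nodes from the graph encoder (assumed shape).
    text_emb = F.normalize(text_emb, dim=-1)
    node_emb = F.normalize(node_emb, dim=-1)
    # Pairwise cosine-similarity matrix; diagonal entries are the matched (positive) pairs.
    logits = text_emb @ node_emb.t() / temperature
    targets = torch.arange(logits.size(0), device=logits.device)
    # Symmetric InfoNCE: classify text-to-node and node-to-text.
    return (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets)) / 2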

For prompt tuning and testing:

On the Cora dataset:

python main_test.py 

On the Amazon datasets:

python main_test_amazon.py
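
For intuition, the sketch below shows the general shape of continuous prompt tuning: a small set of learnable context vectors is prepended to the (frozen) class-name token embeddings, and only those vectors are optimized on the few labeled examples. The module and parameter names are assumptions for illustration; main_test.py and main_test_amazon.py contain the actual prompt-tuning logic.

import torch
import torch.nn as nn

class PromptContext(nn.Module):
    # Hypothetical module: learnable context vectors shared across all classes.
    def __init__(self, n_ctx, class_token_embs):
        super().__init__()
        embed_dim = class_token_embs.size(-1)
        self.ctx = nn.Parameter(torch.empty(n_ctx, embed_dim).normal_(std=0.02))
        # Frozen token embeddings of each class name: [num_classes, T, D].
        self.register_buffer("class_tokens", class_token_embs)

    def forward(self):
        num_classes = self.class_tokens.size(0)
        ctx = self.ctx.unsqueeze(0).expand(num_classes, -1, -1)
        # Prompt for each class = [learned context ; class-name tokens].
        return torch.cat([ctx, self.class_tokens], dim=1)

During tuning, only the context vectors receive gradients; the assembled prompts are passed through the frozen text encoder and matched against node embeddings with the same similarity-based objective used in pre-training.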

For zero-shot testing:

First, change into the zero-shot directory.

On the Cora dataset:

python zero-shot-cora.py 

On the Amazon datasets:

python zero-shot-amazon.py
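
Conceptually, zero-shot classification only needs the pre-trained encoders: each class name is turned into a short natural-language prompt, encoded by the text encoder, and a node is assigned to the class whose prompt embedding is closest to the node's graph embedding. The prompt template and the encode_text function below are illustrative assumptions; zero-shot-cora.py and zero-shot-amazon.py are the reference scripts.

import torch
import torch.nn.functional as F

@torch.no_grad()
def zero_shot_predict(node_emb, class_names, encode_text):
    # node_emb: [N, D] graph embeddings of the nodes to classify (assumed shape).
    # encode_text: frozen text encoder mapping a list of strings to [C, D] (assumed interface).
    prompts = ["a text of " + name for name in class_names]  # assumed prompt template
    class_emb = F.normalize(encode_text(prompts), dim=-1)
    node_emb = F.normalize(node_emb, dim=-1)
    # Cosine similarity between every node and every class prompt.
    sims = node_emb @ class_emb.t()
    return sims.argmax(dim=-1)  # index of the most similar class per node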

Cite

@inproceedings{DBLP:conf/sigir/Wen023,
  author       = {Zhihao Wen and
                  Yuan Fang},
  title        = {Augmenting Low-Resource Text Classification with Graph-Grounded Pre-training
                  and Prompting},
  booktitle    = {Proceedings of the 46th International {ACM} {SIGIR} Conference on
                  Research and Development in Information Retrieval, {SIGIR} 2023, Taipei,
                  Taiwan, July 23-27, 2023},
  pages        = {506--516},
  publisher    = {{ACM}},
  year         = {2023}
}