
English | 简体中文

Att-Induction: Attention-based Induction Networks for Few-Shot Text Classification


Code for the paper [Attention-based Induction Networks for Few-Shot Text Classification]().

Table of Contents

- Introduction
- Datasets
- Usage
  - Requirements
  - Training
  - Test
- Maintainers
- Citation
- License

Introduction

Attention-based Induction Networks is a model for few-shot text classification that extends the work of Induction Networks.

Attention-based Induction Networks can learn different class representations for different queries through multi-head self-attention: the induction module pays more attention to the support instances and feature dimensions that matter most for the current query. In addition, we use a pre-trained model instead of training an encoder from scratch, which captures more semantic information in few-shot learning scenarios. Experimental results on three public datasets and one real-world dataset show that this model significantly outperforms existing state-of-the-art approaches.
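
To illustrate the induction step described above, here is a minimal PyTorch sketch (not the repository's actual implementation): multi-head attention lets each query attend over the K support embeddings of every class, so the induced class vector depends on that query. The class name AttentionInduction, the default sizes, and the tensor layout are assumptions made for the example.

```python
# Minimal sketch of query-conditioned induction with multi-head attention.
# Illustration only, not the repository's code; shapes and names are
# assumptions (H = encoder hidden size, N = classes, K = shots, Q = queries).
import torch
import torch.nn as nn


class AttentionInduction(nn.Module):
    def __init__(self, hidden_size: int = 768, num_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)

    def forward(self, query_emb: torch.Tensor, support_emb: torch.Tensor) -> torch.Tensor:
        # query_emb:   [Q, H]    encoded query sentences
        # support_emb: [N, K, H] encoded support sentences for N classes
        Q, H = query_emb.shape
        N, K, _ = support_emb.shape
        # Each query attends over the K support instances of each class, so the
        # induced class vector is query-specific: one attention call per
        # (query, class) pair, batched as Q * N.
        q = query_emb.unsqueeze(1).expand(Q, N, H).reshape(Q * N, 1, H)
        s = support_emb.unsqueeze(0).expand(Q, N, K, H).reshape(Q * N, K, H)
        class_rep, _ = self.attn(q, s, s)   # [Q * N, 1, H]
        return class_rep.view(Q, N, H)      # query-specific class vectors
```

A relation module (the repository's topic tags suggest a neural tensor network) would then score each query against its N induced class vectors to predict a label.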

Datasets

Usage

Requirements

You can use pip install -r requirements.txt to install the required dependencies.

Training

Training scripts are located in ./scripts/. You only need to modify the training parameters in a shell script and then run it in a terminal. For example:

bash ./scripts/run_train_HuffPost.sh

You can use python3 train.py -h to see all available parameters.
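
For orientation, such a script might look like the sketch below. Apart from --test_data, the flag names and data paths are illustrative assumptions, so check python3 train.py -h for the real parameter list.

```bash
# Hypothetical contents of ./scripts/run_train_HuffPost.sh -- flag names other
# than --test_data and all paths are assumptions; see `python3 train.py -h`.
python3 train.py \
    --train_data ./data/HuffPost/train.json \
    --test_data ./data/HuffPost/test.json
```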

Test

If --test_data is provided during training, the test task is always run after training finishes. You can also run a standalone test by specifying --load_checkpoint and --only_test in the training script.
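
For example, a standalone evaluation might look like the following; only --load_checkpoint, --only_test, and --test_data are flags named in this README, and the checkpoint and data paths are placeholders.

```bash
# Hypothetical standalone test run; checkpoint and data paths are placeholders.
python3 train.py \
    --only_test \
    --load_checkpoint ./checkpoints/att_induction_huffpost.pt \
    --test_data ./data/HuffPost/test.json
```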

Maintainers

@ShaneTian.

Citation

License

Apache License 2.0 © ShaneTian