Code for our paper "A Multi-task Learning Model for Chinese-oriented Aspect Polarity Classification and Aspect Term Extraction".
LCF-ATEPC, a multi-task learning model for Chinese- and multilingual-oriented ATE and APC tasks, based on PyTorch and pytorch-transformers.
For aspect term extraction and sentiment prediction with the LCF-ATEPC model, check the detailed usage in the ATE examples directory.
Set use_bert_spc = True to improve APC performance when only the APC subtask is considered. We use a configuration file to manage experiment settings. To train in batches via the configuration file, refer to experiments.json to manage your experiments.
Then, run:
python train.py --config_path experiments.json
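As a rough sketch only (the schema of experiments.json is defined by this repo; apart from use_bert_spc and use_unique_bert, which are mentioned in this README, the keys and overall layout assumed below are hypothetical), you can inspect the experiment entries before training like this:

import json

# Load the batch experiment settings; here we assume the file maps
# experiment names to dictionaries of options (an assumption, not the real schema).
with open("experiments.json", "r", encoding="utf-8") as f:
    experiments = json.load(f)

for name, params in experiments.items():
    # use_bert_spc / use_unique_bert are the options discussed in this README;
    # any other keys depend on the actual file.
    print(name, {k: params.get(k) for k in ("use_bert_spc", "use_unique_bert")})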
If you want to build your own dataset, please find the description of the dataset format here.
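For orientation only: ATE/APC corpora in this line of work are often stored one token per line with a BIO aspect tag and a polarity label, with blank lines separating sentences. The sketch below parses such a layout; it is a hypothetical illustration, and the linked dataset description is authoritative.

# Hypothetical layout (illustration only), e.g.:
#   the      O      -1
#   battery  B-ASP   1
#   life     I-ASP   1
#   is       O      -1
#   good     O      -1
def read_token_level_file(path):
    """Parse a token / BIO-tag / polarity file into per-sentence lists."""
    sentences, tokens, tags, polarities = [], [], [], []
    with open(path, "r", encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:  # a blank line ends the current sentence
                if tokens:
                    sentences.append((tokens, tags, polarities))
                    tokens, tags, polarities = [], [], []
                continue
            token, tag, polarity = line.split()
            tokens.append(token)
            tags.append(tag)
            polarities.append(int(polarity))
    if tokens:  # flush the last sentence if the file has no trailing blank line
        sentences.append((tokens, tags, polarities))
    return sentences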
BERT-based models require a lot of memory. If you run into out-of-memory problems while training the model, here are ways to mitigate them:
Set use_unique_bert = true to use a single BERT layer to model both the local and global contexts.

We made every effort to make our benchmarks reproducible. However, the performance of the LCF-ATEPC models fluctuates, and any slight change in the model structure can also influence performance. Try different random seeds to achieve optimal results.
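Since results are seed-sensitive, a common way to pin the main sources of randomness in a PyTorch pipeline (independent of any seed option train.py itself may expose) is:

import random
import numpy as np
import torch

def set_seed(seed: int = 42):
    """Fix the Python, NumPy and PyTorch random seeds for more repeatable runs."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)  # no-op when CUDA is unavailable

set_seed(42)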
We cleaned up and refactored the original code for easier understanding and reproduction. However, we did not test all training situations for the refactored code. If you find any issue in this repo, you can raise an issue or submit a pull request, whichever is more convenient for you.
Due to a busy schedule, some modules may not be updated for a long time, such as the module for saving and loading trained models, the inference module, etc. If possible, we sincerely invite contributions to complete this work.
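Until a dedicated saving/loading module lands, the standard PyTorch pattern below is a reasonable starting point; the function names and checkpoint path are placeholders, not names from this repo.

import torch

def save_checkpoint(model, path="lcf_atepc_checkpoint.pt"):
    """Save only the parameters (state_dict), the usual PyTorch checkpoint format."""
    torch.save(model.state_dict(), path)

def load_checkpoint(model, path="lcf_atepc_checkpoint.pt"):
    """Restore weights into an already-constructed model and switch it to eval mode."""
    model.load_state_dict(torch.load(path, map_location="cpu"))
    model.eval()
    return model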
If this repository is helpful to you, please cite our paper:
@misc{yang2019multitask,
    title={A Multi-task Learning Model for Chinese-oriented Aspect Polarity Classification and Aspect Term Extraction},
    author={Heng Yang and Biqing Zeng and JianHao Yang and Youwei Song and Ruyang Xu},
    year={2019},
    eprint={1912.07976},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
MIT License