PyTorch implementation of the paper *Dialogue Act Classification with Context-Aware Self-Attention* [1] for dialogue act classification, with a generic dataset class and a PyTorch-Lightning trainer. This implementation has the following differences compared to the actual paper:
The Switchboard corpus [3] is shipped as a zip inside `data/`. Extract it using:

```sh
cd data/
unzip switchboard.zip
cd ..
```
If you don't want logging, comment out the logger setup (lines 15-20 in `main.py`), don't pass it to the Lightning trainer (line 32 in `main.py`), and then comment out the logging code in `Trainer.py` (lines 70 and 95). By default, Lightning will log to the TensorBoard logger.

Hyperparameters (`batch_size`, `lr`, `epochs`, etc.) can be set in `config.py`.

To train, run:

```sh
python main.py
```
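The exact layout of `config.py` isn't shown in this README; a plain class of constants is one common pattern. The sketch below is an assumption for illustration only — every name and value here is hypothetical, not the repository's actual file:

```python
# Hypothetical sketch of a config.py -- names and values are
# assumptions, not the repository's actual configuration.
class Config:
    # training hyperparameters
    batch_size = 32
    lr = 1e-3      # learning rate
    epochs = 10
    # location of the unzipped Switchboard corpus
    data_dir = "data/"

config = Config()
print(config.batch_size, config.lr, config.epochs)
```

In real code you would import this object (`from config import config`) wherever the trainer or dataset needs a hyperparameter.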
To use a custom dataset, place it in `data/`; your dataset should have the following structure:

Then change line 18 in `config.py` according to your dataset.

Note: Feel free to create an issue if you find any problem. You're also welcome to create a PR if you want to add something. Here is a list of components one could add:
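As a reference point for plugging in your own data, here is a minimal sketch of what a generic dialogue-act dataset class can look like. All names are hypothetical and the repository's own class may differ; in real code you would subclass `torch.utils.data.Dataset`, whose only contract is `__len__` and `__getitem__` (the torch import is omitted to keep the sketch dependency-free):

```python
# Hypothetical sketch of a generic dialogue-act dataset.
# In the real project this would subclass torch.utils.data.Dataset.
class DialogueActDataset:
    def __init__(self, samples):
        # samples: list of (utterance, act_label) pairs,
        # e.g. parsed from the files under data/
        self.samples = samples

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        # return one (utterance, label) pair
        return self.samples[idx]

# toy usage with made-up labels
ds = DialogueActDataset([("Hello there", "greeting"),
                         ("Yes", "agreement")])
print(len(ds), ds[0])
```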
[1]: Raheja, V., & Tetreault, J. (2019). Dialogue Act Classification with Context-Aware Self-Attention. ArXiv, abs/1904.02594.
[2]: Lin, Z., Feng, M., Santos, C.D., Yu, M., Xiang, B., Zhou, B., & Bengio, Y. (2017). A Structured Self-attentive Sentence Embedding. ArXiv, abs/1703.03130.
[3]: Switchboard Dialogue Act corpus: http://compprag.christopherpotts.net/swda.html
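For intuition, the utterance encoder in [1] builds on the structured self-attentive sentence embedding of [2], which computes A = softmax(W_s2 tanh(W_s1 H^T)) and returns M = A H. Purely as an illustration (this is not the repository's code), that computation in plain NumPy:

```python
import numpy as np

def structured_self_attention(H, W_s1, W_s2):
    """Structured self-attentive sentence embedding of Lin et al. [2].

    H:    (n, d)   hidden states for the n tokens of one utterance
    W_s1: (d_a, d) first projection
    W_s2: (r, d_a) r attention hops
    Returns M: (r, d), r attention-weighted views of the sentence.
    """
    # A = softmax(W_s2 tanh(W_s1 H^T)), shape (r, n)
    scores = W_s2 @ np.tanh(W_s1 @ H.T)
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    A = np.exp(scores)
    A /= A.sum(axis=1, keepdims=True)            # row-wise softmax
    return A @ H                                 # M = A H, shape (r, d)

# toy sizes, random inputs
rng = np.random.default_rng(0)
n, d, d_a, r = 7, 16, 8, 4
M = structured_self_attention(rng.normal(size=(n, d)),
                              rng.normal(size=(d_a, d)),
                              rng.normal(size=(r, d_a)))
print(M.shape)  # (4, 16)
```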