
BERT-DST

Contact: Guan-Lin Chao (guanlinchao@cmu.edu)

Source code for our paper BERT-DST: Scalable End-to-End Dialogue State Tracking with Bidirectional Encoder Representations from Transformer (INTERSPEECH 2019).

@inproceedings{chao2019bert,
  title={{BERT-DST}: Scalable End-to-End Dialogue State Tracking with Bidirectional Encoder Representations from Transformer},
  author={Chao, Guan-Lin and Lane, Ian},
  booktitle={INTERSPEECH},
  year={2019}
}

Tested with Python 3.6 and TensorFlow 1.13.0rc0.
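A quick way to verify that your environment matches the tested versions (this check is only an illustration, not part of the repo):

  # Sanity-check the interpreter and TensorFlow versions the code was tested with.
  import sys
  import tensorflow as tf

  assert sys.version_info[:2] == (3, 6), "tested with Python 3.6"
  assert tf.__version__.startswith("1.13"), "tested with TensorFlow 1.13.0rc0"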

Required packages (no installation needed; just provide their paths in the code, as shown in the sketch after this list):

  1. bert: the google-research/bert repository.
  2. uncased_L-12_H-768_A-12: the pretrained BERT-Base, Uncased model checkpoint. The download link is in the bert repository.
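A minimal sketch of what providing the paths might look like; the variable names below are hypothetical (the actual locations to edit are in this repo's scripts), and only modeling.BertConfig.from_json_file is a real function from the bert repository:

  import os
  import sys

  # Hypothetical path constants; replace with your local locations.
  BERT_DIR = "/path/to/bert"  # clone of the google-research/bert repository
  CKPT_DIR = "/path/to/uncased_L-12_H-768_A-12"  # unzipped BERT-Base, Uncased checkpoint

  # Make the bert code (modeling.py, tokenization.py, ...) importable.
  sys.path.append(BERT_DIR)
  import modeling  # from the bert repository

  # Load the pretrained model's configuration from the checkpoint directory.
  bert_config = modeling.BertConfig.from_json_file(
      os.path.join(CKPT_DIR, "bert_config.json"))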

Datasets:

dstc2-clean, woz_2.0, sim-M, and sim-R