Open bugface opened 4 years ago
Adding the LSTM-attention implementation. We need to merge the LSTM and LSTM-attention into a unified model.
We also need to create a common task.py and shared utils for all the models, since we need to run every model on the same datasets.
https://www.ijcai.org/Proceedings/2019/607
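One way the merge could look: a single model class with an optional attention head, so the plain LSTM and the LSTM-attention variant share all other code. This is a minimal PyTorch sketch under assumed names (`UnifiedLSTM`, the `use_attention` flag, and the additive attention head are all hypothetical, not the repo's actual API):

```python
import torch
import torch.nn as nn

class UnifiedLSTM(nn.Module):
    """Sketch of a unified LSTM classifier with an optional attention head.

    With use_attention=False it behaves like a plain LSTM (last hidden
    state feeds the classifier); with use_attention=True a learned
    softmax weighting over all timesteps feeds the classifier instead.
    All names here are illustrative, not the repo's real interface.
    """

    def __init__(self, input_size, hidden_size, num_classes, use_attention=False):
        super().__init__()
        self.use_attention = use_attention
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        if use_attention:
            # One score per timestep, normalized with softmax below.
            self.attn = nn.Linear(hidden_size, 1)
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        # out: (batch, seq_len, hidden); h_n: (num_layers, batch, hidden)
        out, (h_n, _) = self.lstm(x)
        if self.use_attention:
            weights = torch.softmax(self.attn(out), dim=1)  # (batch, seq_len, 1)
            context = (weights * out).sum(dim=1)            # weighted sum over time
        else:
            context = h_n[-1]                               # last hidden state
        return self.fc(context)
```

Both variants then take identical inputs and produce identically shaped outputs, which is what makes a shared task runner possible.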
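For the common task.py, the core idea is a driver that runs every registered model on every dataset with the same evaluation function, so results are directly comparable. A minimal sketch, with all names (`evaluate_all`, `models`, `datasets`, `train_fn`) being hypothetical placeholders:

```python
def evaluate_all(models, datasets, train_fn):
    """Run every model on every dataset with one shared training routine.

    models:   dict mapping model name -> zero-arg factory returning a fresh model
    datasets: dict mapping dataset name -> dataset object
    train_fn: callable (model, dataset) -> metric, shared across all runs

    Returns a dict keyed by (model name, dataset name) so results from
    different models on the same data are directly comparable.
    """
    results = {}
    for m_name, model_factory in models.items():
        for d_name, data in datasets.items():
            # Fresh model instance per run so state never leaks across datasets.
            results[(m_name, d_name)] = train_fn(model_factory(), data)
    return results
```

Keeping dataset loading and metrics inside `train_fn` (rather than per-model scripts) is what guarantees the LSTM and LSTM-attention numbers come from identical pipelines.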