uf-hobi-informatics-lab / seqEHR

develop deep learning-based models for learning structured EHR as a sequence
MIT License
5 stars 0 forks

add LSTM attention implementation #2

Open bugface opened 4 years ago

bugface commented 4 years ago

Adding the LSTM-attention implementation. We need to merge the LSTM and LSTM-attention models into a single unified model.
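A minimal sketch of one way to do the merge, assuming PyTorch; the class name `UnifiedSeqEHRModel` and the `use_attention` flag are hypothetical, not taken from the repo:

```python
import torch
import torch.nn as nn


class UnifiedSeqEHRModel(nn.Module):
    """One module covering both the plain LSTM and the LSTM-attention variants."""

    def __init__(self, input_dim, hidden_dim, num_labels, use_attention=False):
        super().__init__()
        self.use_attention = use_attention
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        if use_attention:
            # simple additive attention scores over the LSTM outputs
            self.attn_score = nn.Linear(hidden_dim, 1)
        self.classifier = nn.Linear(hidden_dim, num_labels)

    def forward(self, x):
        # x: (batch, seq_len, input_dim) sequence of encounter/visit embeddings
        outputs, (h_n, _) = self.lstm(x)  # outputs: (batch, seq_len, hidden_dim)
        if self.use_attention:
            # attention weights over time steps, then a weighted sum of hidden states
            weights = torch.softmax(self.attn_score(outputs), dim=1)  # (batch, seq_len, 1)
            pooled = (weights * outputs).sum(dim=1)                   # (batch, hidden_dim)
        else:
            # fall back to the last hidden state, as in the plain LSTM model
            pooled = h_n[-1]
        return self.classifier(pooled)
```

With a single flag controlling the pooling step, both variants share the same checkpointing, training, and evaluation code.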

We also need to create a common task.py and shared utils for all the models, since we need to run all of the models on the same datasets.
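A rough sketch of what such a shared task interface could look like; the function names `build_dataloader` and `run_task` and their arguments are illustrative assumptions, not the repo's actual API:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset


def build_dataloader(features, labels, batch_size=32, shuffle=True):
    """Wrap pre-encoded EHR sequences and labels in a DataLoader shared by all models."""
    dataset = TensorDataset(torch.as_tensor(features, dtype=torch.float32),
                            torch.as_tensor(labels, dtype=torch.long))
    return DataLoader(dataset, batch_size=batch_size, shuffle=shuffle)


def run_task(model, train_loader, eval_loader, epochs=10, lr=1e-3, device="cpu"):
    """Generic train/eval loop so LSTM, LSTM-attention, and other models share one entry point."""
    model = model.to(device)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = torch.nn.CrossEntropyLoss()
    for _ in range(epochs):
        model.train()
        for x, y in train_loader:
            x, y = x.to(device), y.to(device)
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()
            optimizer.step()
    # simple accuracy on the evaluation split
    model.eval()
    correct, total = 0, 0
    with torch.no_grad():
        for x, y in eval_loader:
            preds = model(x.to(device)).argmax(dim=-1).cpu()
            correct += (preds == y).sum().item()
            total += y.numel()
    return correct / max(total, 1)
```

Any model that exposes a standard `forward(x)` returning logits could then be benchmarked on the same datasets through this one entry point.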

bugface commented 4 years ago

https://www.ijcai.org/Proceedings/2019/607