AlexGidiotis / Document-Classifier-LSTM

A bidirectional LSTM with attention for multiclass/multilabel text classification.
MIT License

Regarding better accuracy with BILSTM Attention #3

Closed spartian closed 5 years ago

spartian commented 5 years ago

You mentioned in your README file that you achieved better accuracy with BiLSTM attention than with HAN. What did you mean by BiLSTM attention? I want to know the architecture. Are you referring to any specific research paper? How is the architecture different from HAN?

AlexGidiotis commented 5 years ago

The architecture is in the classifier.py file. It is a simple BiLSTM layer with attention on top. There might be a number of reasons why HAN does not perform as well here; I suspect the more complex architecture might be overfitting slightly.
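For readers asking the same question, the general shape of such a model is: run a bidirectional LSTM over the token embeddings, then, instead of keeping only the final hidden state, learn a weighted average over all timestep outputs and feed that vector to the output layer. The attention-pooling step can be sketched in NumPy as below; the weight vector `w` and the dimensions are illustrative stand-ins, not taken from classifier.py:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, w):
    """Collapse BiLSTM outputs H (T timesteps x d units) into one vector.

    Each timestep gets a scalar score; softmax turns the scores into
    attention weights that sum to 1; the result is a weighted average
    of the timestep outputs.
    """
    scores = H @ w            # (T,) one relevance score per timestep
    alpha = softmax(scores)   # (T,) attention distribution
    return alpha @ H          # (d,) context vector

rng = np.random.default_rng(0)
T, d = 10, 8                   # sequence length, BiLSTM output size (illustrative)
H = rng.normal(size=(T, d))    # stand-in for the BiLSTM hidden states
w = rng.normal(size=d)         # learned attention parameter (hypothetical)

context = attention_pool(H, w)  # would be passed to a dense softmax/sigmoid layer
```

In a real Keras model this pooling would be a custom layer placed after a `Bidirectional(LSTM(...))` layer, with `w` trained jointly with the rest of the network.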

spartian commented 5 years ago

Thank you for the answer. I will look into the classifier.py file.