Closed — paul-tqh-nguyen closed this issue 4 years ago
While over-eagerly accomplishing https://github.com/paul-tqh-nguyen/reuters_topic_labelling/issues/3, we implemented attention layers.
That's probably overkill; I don't think attention is necessary for this dataset. We should avoid unnecessary complexity, so let's remove it.
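For context, the kind of simplification involved can be sketched as follows. This is a hypothetical, framework-free illustration (not the repo's actual code, which isn't shown here): an attention layer pools encoder hidden states via a learned softmax-weighted sum, whereas the simpler alternative is parameter-free mean pooling over the same hidden states.

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of attention scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(hidden_states, scores):
    # Attention pooling: softmax-weighted sum of hidden-state vectors.
    # In a real model, `scores` would come from a learned scoring layer.
    weights = softmax(scores)
    dim = len(hidden_states[0])
    return [sum(w * h[d] for w, h in zip(weights, hidden_states))
            for d in range(dim)]

def mean_pool(hidden_states):
    # Mean pooling: no extra parameters, no attention scores to learn.
    dim = len(hidden_states[0])
    n = len(hidden_states)
    return [sum(h[d] for h in hidden_states) / n for d in range(dim)]

# Toy "hidden states" for a 3-token sequence with 2-dimensional vectors.
hidden = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
print(mean_pool(hidden))
```

With uniform attention scores, `attention_pool` reduces to `mean_pool`, which is one way to see why dropping the attention layer costs little when the learned weights aren't adding much signal.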
https://github.com/paul-tqh-nguyen/reuters_topic_labelling/commit/23c06791eb51d424cc561ad486fa33a44e5d5caf completes the removal of the attention mechanism, so I'll close this task for now.