bst-mug / n2c2

Support code for participation in the 2018 n2c2 Shared-Task Track 1
https://n2c2.dbmi.hms.harvard.edu
Apache License 2.0

Investigate freezing embeddings layer #112

Closed michelole closed 5 years ago

michelole commented 5 years ago

Investigate this both for fastText and for dl4j.

michelole commented 5 years ago

Our dl4j-based code effectively "freezes" them already: we featurize the input ourselves with pre-trained vectors, so the embedding weights are never part of the trainable parameters.
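A minimal sketch of why that setup amounts to freezing (numpy, with illustrative names and shapes — not the repo's actual code): when tokens are mapped to fixed pre-trained vectors *before* training, the embedding table sits outside the computation graph, so gradient updates can only ever touch the downstream weights.

```python
import numpy as np

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(10, 4))   # pre-trained lookup table (vocab=10, dim=4)
frozen_copy = embeddings.copy()

token_ids = np.array([1, 3, 7])
X = embeddings[token_ids]               # featurize up front: fancy indexing copies the rows
y = np.array([1.0, 0.0, 1.0])

w = np.zeros(4)                         # trainable linear classifier on top
for _ in range(100):
    pred = 1.0 / (1.0 + np.exp(-X @ w))  # sigmoid
    grad = X.T @ (pred - y) / len(y)     # gradient reaches w only
    w -= 0.5 * grad                      # SGD step; the embedding table is untouched

assert np.array_equal(embeddings, frozen_copy)  # embeddings stayed frozen
```

Unfreezing would instead require backpropagating into the lookup table itself, i.e. making the embedding layer part of the network rather than a preprocessing step.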

For fastText, follow https://github.com/facebookresearch/fastText/issues/811.

michelole commented 5 years ago

In #101 we saw that pre-trained embeddings already make a difference with fewer epochs, so there's no need to freeze them for now.