zseder / hunvec

Sequential Tagging in NLP using neural networks

dropout not working when model is reloaded and training is continued #61

Closed zseder closed 9 years ago

zseder commented 9 years ago

in sequence_tagger.py, dropout_fprop() uses self.hdims, but that parameter isn't saved with the model (see __get_state__), so it fails when a reloaded model continues training. Instead of checking hdims, a check on self.layers would be better if possible. If not, hdims has to go into __get_state__, but hopefully that won't be needed, since it is redundant.
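A minimal sketch of the failure mode, with the internals simplified (the `Tagger` class body here is illustrative, not the actual hunvec implementation; only the attribute names `hdims`, `layers`, and the save/restore hook follow the issue):

```python
import pickle

class Tagger:
    def __init__(self, hdims):
        # hdims is redundant: one entry per hidden layer
        self.hdims = hdims
        self.layers = [object() for _ in hdims]

    def __getstate__(self):
        # hdims is deliberately not saved with the model
        return {'layers': self.layers}

    def __setstate__(self, state):
        self.__dict__.update(state)

    def dropout_fprop_broken(self):
        # fails after reload: self.hdims was never restored
        return len(self.hdims) > 0

    def dropout_fprop_fixed(self):
        # proposed fix: check the saved layers attribute instead
        return len(self.layers) > 0

# simulate save + reload, then continue using the model
reloaded = pickle.loads(pickle.dumps(Tagger([100, 100])))
print(reloaded.dropout_fprop_fixed())  # works: layers were pickled
# reloaded.dropout_fprop_broken() would raise AttributeError
```

Checking `self.layers` keeps the pickled state minimal, which is why dropping `hdims` from `__get_state__` is preferable to saving it twice.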

@pajkossy

zseder commented 9 years ago

check self.tagger.layers, that will be easy

pajkossy commented 9 years ago

solved by #62