ai-forever / ner-bert

BERT-NER (nert-bert) with google bert https://github.com/google-research.
MIT License
407 stars 97 forks

This model uses a lot of memory #5

Closed EricAugust closed 5 years ago

EricAugust commented 5 years ago

Besides the BERT model itself: after I train my model and then load BERT, the trained model, and the data, it needs 4.5 GB. That is very large, and in this situation it is very hard to deploy online. Is there any way to reduce memory use?

king-menin commented 5 years ago

Reduce the number of your parameters ))

king-menin commented 5 years ago

The BERT embedder alone needs about 2 GB of GPU memory (in eval mode, without any additional layers on top).
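For context on why the footprint is dominated by BERT rather than the task head, here is a rough back-of-envelope estimate, assuming BERT-base with ~110M parameters (the exact count and the 2 GB figure above also include activations, CUDA context, and framework overhead, so real usage is higher than the weights alone):

```python
# Rough memory estimate for BERT-base weights alone.
# BERT_BASE_PARAMS is an assumption (~110M for BERT-base); actual GPU usage
# also includes activations, CUDA context, and framework overhead.

BERT_BASE_PARAMS = 110_000_000  # approximate parameter count (assumption)

def weight_memory_gib(n_params: int, bytes_per_param: int) -> float:
    """Memory needed just to store the weights, in GiB."""
    return n_params * bytes_per_param / 2**30

fp32 = weight_memory_gib(BERT_BASE_PARAMS, 4)  # float32: 4 bytes per param
fp16 = weight_memory_gib(BERT_BASE_PARAMS, 2)  # float16: 2 bytes per param

# Training with Adam additionally keeps gradients plus two optimizer
# states per parameter, i.e. roughly 4x the weight memory, before
# counting activations -- which is how 4.5 GB totals arise.
adam_train = 4 * fp32

print(f"weights fp32: {fp32:.2f} GiB")
print(f"weights fp16: {fp16:.2f} GiB")
print(f"fp32 weights + grads + Adam states: {adam_train:.2f} GiB")
```

Under these assumptions, casting the model to half precision for inference roughly halves the weight footprint, which is one of the few levers short of using a smaller encoder.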

EricAugust commented 5 years ago

I only added one LSTM layer with 256 hidden units and a CRF layer.

PaulZhangIsing commented 5 years ago

That is the reality. Otherwise you have to reduce the number of parameters.