wietsedv / bertje

BERTje is a Dutch pre-trained BERT model developed at the University of Groningen. (EMNLP Findings 2020) "What’s so special about BERT’s layers? A closer look at the NLP pipeline in monolingual and multilingual models"
https://aclanthology.org/2020.findings-emnlp.389/
Apache License 2.0

deploy on serverless architecture #8

Closed flieks closed 4 years ago

flieks commented 4 years ago

Does anyone have experience with deploying this on a serverless architecture? AWS Lambda restricts temp storage to a maximum of 512 MB, which is a problem. Are there other options? BERT can also be GPU-resource-heavy if it runs full-time.
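For context, a minimal sketch of why the 512 MB limit is tight (assumptions: the Hugging Face transformers library, the wietsedv/bert-base-dutch-cased checkpoint id for BERTje, and /tmp as Lambda's only writable path; BERT-base weights alone are roughly 400 MB, so they nearly fill the quota before anything else is cached):

```python
# Sketch only: loading BERTje inside an AWS Lambda handler.
# Assumptions: transformers is bundled with the deployment package,
# and /tmp (capped at 512 MB) is the only writable location, so the
# ~400 MB of BERT-base weights leave little room for anything else.
from transformers import BertTokenizer, BertModel

CACHE_DIR = "/tmp/bertje"  # hypothetical cache path inside Lambda

def load_model():
    # Downloads weights into /tmp on a cold start; a warm container
    # reuses the cached files instead of re-downloading them.
    tokenizer = BertTokenizer.from_pretrained(
        "wietsedv/bert-base-dutch-cased", cache_dir=CACHE_DIR
    )
    model = BertModel.from_pretrained(
        "wietsedv/bert-base-dutch-cased", cache_dir=CACHE_DIR
    )
    model.eval()  # inference only, so no gradient bookkeeping
    return tokenizer, model
```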

wietsedv commented 4 years ago

I cannot really help you with this. High resource requirements are unfortunately a problem for all large pre-trained language models and I do not have experience with cost-effective deployment.

This seems like a generic AWS Lambda / BERT problem, so I think you will have a better chance of getting a useful response if you open an issue in the original BERT repo: https://github.com/google-research/bert/issues

flieks commented 4 years ago

Thanks a lot. OK, I will ask there.