dmmiller612 / bert-extractive-summarizer

Easy to use extractive text summarization with BERT
MIT License

server configuration? #12

Closed predestination closed 5 years ago

predestination commented 5 years ago

What should the server configuration be for running the model on AWS? It is using local storage, and the process is getting killed, returning a 500 Internal Server Error. We have 12 GB of RAM on AWS and a 12 GB SSD.

dmmiller612 commented 5 years ago

I had to use a larger instance to run this on a server in AWS. For demo purposes, I think I used an m5.xlarge. Predictions run slower on these instances, though; obviously a c5 instance would be better, but they are a bit more costly.

dmmiller612 commented 5 years ago

I recently moved, and I am slowly getting more time back. I'll get something up with a Dockerfile to run a simple service.

dmmiller612 commented 5 years ago

Oh, and one last thing: make sure it isn't related to the spaCy version. Right now, the neuralcoref library doesn't support spacy > 2.1.3.
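As a quick sanity check before debugging further, you could compare the installed spaCy version against that ceiling. The sketch below assumes plain dotted version strings; `is_supported` is a hypothetical helper written for this check, not part of spaCy or neuralcoref.

```python
# Hypothetical compatibility check: per this thread, neuralcoref
# does not support spacy versions greater than 2.1.3.

def version_tuple(version: str) -> tuple:
    """Parse a simple dotted version string like '2.1.3' into ints."""
    return tuple(int(part) for part in version.split("."))

MAX_SUPPORTED = version_tuple("2.1.3")

def is_supported(spacy_version: str) -> bool:
    """Return True if the given spaCy version is <= 2.1.3."""
    return version_tuple(spacy_version) <= MAX_SUPPORTED
```

In practice you would pass `spacy.__version__` to `is_supported`, or simply pin `spacy<=2.1.3` in your requirements.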

dmmiller612 commented 5 years ago

I've provided an example Docker Flask service. The README explains how to run it.
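For orientation only, such a containerized Flask service typically takes roughly the following shape; this is an illustrative sketch with hypothetical file names (`server.py`, `requirements.txt`), not the repository's actual Dockerfile, which the README covers.

```dockerfile
# Illustrative sketch only -- see the repository's README and Dockerfile
# for the real setup.
FROM python:3.7-slim

WORKDIR /app

# Hypothetical requirements file; would pin spacy<=2.1.3 for neuralcoref
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Hypothetical Flask entry point serving the summarizer
COPY server.py .

EXPOSE 5000
CMD ["python", "server.py"]
```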