robertmaxwilliams / talking-statues

for soft eng class at school. Beware the jungabunga.

Handle Requests Asynchronously #15

Open bionboy opened 4 years ago

bionboy commented 4 years ago

Right now Flask handles each request sequentially, so everyone but the first person in the queue experiences longer wait times. In practice that means only one Google Assistant user at a time...

Either we need to make our Flask server handle requests concurrently (maybe with Celery) or switch to another stack (Node.js, ASP.NET, Ruby @rowantone).

Celery: https://stackoverflow.com/questions/31866796/making-an-asynchronous-task-in-flask
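A lighter-weight middle ground before pulling in Celery might be running the dev server with `app.run(threaded=True)`, or offloading the slow model call to a thread pool so requests overlap. A minimal stdlib-only sketch of the thread-pool idea (`slow_predict` is a hypothetical stand-in for our model call, not real project code):

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for a slow gpt2 prediction call.
def slow_predict(prompt):
    time.sleep(0.2)
    return f"reply to {prompt}"

executor = ThreadPoolExecutor(max_workers=4)

# Submit several "requests" at once; total wall time is close to one
# call's latency instead of four sequential calls.
start = time.time()
futures = [executor.submit(slow_predict, p) for p in ["a", "b", "c", "d"]]
results = [f.result() for f in futures]
elapsed = time.time() - start
print(results)
print(elapsed < 0.5)
```

Inside Flask you'd call `executor.submit(...)` from the route handler, which is roughly what the Celery answer in that SO link does, just with a proper broker and worker processes instead of threads.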

Thoughts?

bionboy commented 4 years ago

Shit. Can gpt2_simple even run more than one prediction at a time? Do we need to run multiple models in that case?
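If the model/session turns out not to be thread-safe, one cheap workaround (assumption on my part, not tested against gpt2_simple) is keeping a single loaded model and serializing predictions with a lock, so concurrent requests queue only at the generate step instead of the whole request. Sketch with a fake model standing in for gpt2_simple:

```python
import threading
import time

# Hypothetical stand-in for one loaded gpt2_simple model that is
# assumed NOT safe to call from multiple threads at once.
class FakeModel:
    def generate(self, prompt):
        time.sleep(0.05)
        return f"gen:{prompt}"

model = FakeModel()
model_lock = threading.Lock()  # serialize access to the single model

def safe_generate(prompt):
    # Only one thread runs a prediction at a time; others block here,
    # but the rest of the request (parsing, response) stays concurrent.
    with model_lock:
        return model.generate(prompt)

outputs = []
def worker(p):
    outputs.append(safe_generate(p))

threads = [threading.Thread(target=worker, args=(str(i),)) for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(outputs))
```

Running multiple model copies would only buy throughput if we have the RAM/VRAM for them; a lock at least keeps us correct with one copy.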