ShannonAI / service-streamer

Boosting your Web Services of Deep Learning Applications.
Apache License 2.0

Question about system architecture and design #70

Closed VJAYSLN closed 4 years ago

VJAYSLN commented 4 years ago

Thanks for the great work :100: :) I have a few doubts about the system design.

i) Since service-streamer uses a multiprocessing queue, the queue size is not fixed. What happens when many concurrent users send requests at the same time (say 2000 users at once)?

ii) When an application is CPU-bound, model prediction takes longer to produce output. How does service-streamer handle concurrent requests for that kind of application?

iii) When the prediction model crashes, service-streamer keeps sending requests to the input queue without getting any response. Is there a way to stop the requests?

In all the cases above, the queue size grows continuously, which might lead to memory problems. Is there a workaround for this?

Thanks in advance :)

Meteorix commented 4 years ago


Thanks for your interest.

i) You are on your own to do rate limiting.

ii) For CPU-bound applications, you can use multi-process mode, or even multiple machines, with Redis as the message queue.

iii) If one consumer (process) crashes, the other consumers can still do the job.
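Since rate limiting is left to the caller (point i), one workaround for the unbounded-queue concern is to shed excess load before requests ever reach the streamer's input queue. Below is a minimal stdlib sketch, assuming the hypothetical names `AdmissionGate` and `try_submit`; in a real app you would pass `streamer.predict` from service-streamer as `predict_fn`:

```python
import threading

class AdmissionGate:
    """Reject requests beyond a fixed concurrency cap so the
    streamer's input queue cannot grow without bound."""

    def __init__(self, max_inflight):
        self._slots = threading.BoundedSemaphore(max_inflight)

    def try_submit(self, predict_fn, item):
        # Non-blocking acquire: if all slots are taken, shed load
        # immediately instead of enqueueing yet another request.
        if not self._slots.acquire(blocking=False):
            return None  # caller should respond with HTTP 429 here
        try:
            return predict_fn(item)
        finally:
            self._slots.release()

# Stand-in predictor for demonstration; a real app would use
# streamer.predict from service-streamer instead.
gate = AdmissionGate(max_inflight=2)
result = gate.try_submit(lambda x: x * 2, 21)
print(result)  # 42
```

A web handler would return HTTP 429 whenever `try_submit` yields `None`, so at most `max_inflight` requests are ever outstanding in the streamer's queue regardless of how many users hit the service at once.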

VJAYSLN commented 4 years ago

Thanks for your helpful suggestions.