AmineDiro / daskqueue

Distributed Task Queue based on Dask
MIT License

Suggestion #16

Closed NonlinearNimesh closed 1 year ago

NonlinearNimesh commented 1 year ago

Hi, I have a React-based webapp with Django as the backend. The current challenge we are facing is file uploads: we are expecting 3–5 lakh (300,000–500,000) file uploads with an average size of about 5 MB each. We currently use Celery and Redis, but this setup cannot handle that many requests efficiently. I wanted to know if daskqueue can handle this efficiently. Can anyone please help me with this?

AmineDiro commented 1 year ago

Hi @NonlinearNimesh, as I understand it, you have a file-upload endpoint receiving a large number of requests with files of ~5 MB each. Are those files processed in a backend service by Celery workers, or do you want to scale the upload itself?

Dask can be used as a computational engine similar to Celery, while still maintaining a high degree of concurrency and not blocking needlessly. You can start a Dask cluster with a QueuePool and a ConsumerPool and connect to it using an async Dask client in your routes. Take a look at this: https://distributed.dask.org/en/stable/asynchronous.html
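A minimal sketch of the async-client pattern from the linked docs, assuming `dask.distributed` is installed. The `process_file` task and `handle_upload` coroutine are hypothetical stand-ins for a web route; the daskqueue QueuePool/ConsumerPool setup is omitted, and a local in-process cluster is used instead of a real one:

```python
import asyncio

from dask.distributed import Client  # pip install "dask[distributed]"


def process_file(payload: bytes) -> int:
    # Hypothetical backend task (DB write, conversion, ...);
    # here it just returns the payload size.
    return len(payload)


async def handle_upload(payload: bytes) -> int:
    # In a real app the client would be created once at startup and reused
    # across routes. asynchronous=True keeps submit/await from blocking
    # the web framework's event loop.
    client = await Client(asynchronous=True, processes=False)
    try:
        future = client.submit(process_file, payload)
        return await future
    finally:
        await client.close()


result = asyncio.run(handle_upload(b"x" * 1024))
print(result)
```

The key point is that `client.submit` returns immediately and the route only awaits the future, so one slow task does not pin a worker thread in the web server.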

NonlinearNimesh commented 1 year ago

Thanks for the response @AmineDiro

  1. Are those files processed in a backend service using celery workers ==> Yes. We take the file, return "file uploaded successfully" to the front end, and then 8–9 tasks (DB operations, conversion, etc.) run in the backend. We are using 4 workers, but I have seen the queue get blocked the moment it receives more files than some threshold (I don't currently know the exact number). So I am guessing this is also a scaling problem. To answer your 2nd question: yes, we want to scale the upload as well.

Also, one more problem: say I submit a request to upload 500 files and you submit 5 after me. You then have to wait until my 500 uploads are finished. This can't happen. Do you think scaling can solve this problem?

AmineDiro commented 1 year ago

OK, I see. Queues are used to offload processing-heavy workloads to the background and return a pending result to the user. In my opinion, uploading files doesn't fit this pattern: you should block the user until the whole file is uploaded. Instead, horizontally scale your upload API and use multi-part upload with async capability. Hope this is useful 😸!
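The multi-part idea can be sketched with stdlib asyncio alone. This is a client-side illustration, not daskqueue code: `upload_part` is a stub standing in for a real PUT to object storage (e.g. an S3 `UploadPart` call), and the 5 MiB part size is an assumption:

```python
import asyncio

CHUNK_SIZE = 5 * 1024 * 1024  # assumed part size; S3's minimum part is 5 MiB


async def upload_part(part_number: int, data: bytes) -> int:
    # Stub for an HTTP PUT of one part to object storage.
    # A real implementation would retry and collect the returned ETag.
    await asyncio.sleep(0)
    return len(data)


async def multipart_upload(payload: bytes) -> int:
    # Split the file into fixed-size parts and upload them concurrently,
    # so one large file does not serialize behind a single connection.
    parts = [payload[i : i + CHUNK_SIZE] for i in range(0, len(payload), CHUNK_SIZE)]
    sizes = await asyncio.gather(
        *(upload_part(n, part) for n, part in enumerate(parts, start=1))
    )
    return sum(sizes)


total = asyncio.run(multipart_upload(b"x" * (12 * 1024 * 1024)))
print(total)  # equals the original payload size
```

Because parts are independent, the same pattern lets many users' uploads interleave instead of queuing behind one user's 500-file batch.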