Hi, I have a React-based web app with a Django backend. The current challenge we are facing is with file uploads: we are expecting 300,000–500,000 (3–5 lakh) file uploads, with an average size of about 5 MB per file. It is currently implemented using Celery and Redis, but that is not efficient enough to handle this many requests. I wanted to know if a Dask queue can handle this efficiently. Can anyone please help me with this?
Hi @NonlinearNimesh, as I understand it, you have a file upload endpoint receiving a large number of requests with files of ~5 MB each. Are those files processed in a backend service using Celery workers, or do you want to scale the upload itself?
Dask can be used as a computational engine similar to Celery, while still maintaining a high degree of concurrency and not blocking needlessly. You can start a Dask cluster with a queuepool and consumerpool and connect to it using the Dask async client in your routes. Take a look at this: https://distributed.dask.org/en/stable/asynchronous.html
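A minimal sketch of what that could look like, assuming a FastAPI-style async route purely for illustration; the scheduler address and the `process_file` function are placeholders, not something from this thread or your codebase:

```python
# Rough sketch: an async route handing per-file work to a Dask cluster via the
# async client described in the linked docs. DASK_SCHEDULER and process_file
# are assumptions for illustration only.
from dask.distributed import Client
from fastapi import FastAPI, File, UploadFile

DASK_SCHEDULER = "tcp://dask-scheduler:8786"  # assumed scheduler address
app = FastAPI()
client = None  # set on startup


def process_file(data: bytes) -> int:
    # Hypothetical per-file processing that runs on a Dask worker.
    return len(data)


@app.on_event("startup")
async def connect_to_dask() -> None:
    global client
    # asynchronous=True lets the client be awaited inside async routes
    # without blocking the event loop.
    client = await Client(DASK_SCHEDULER, asynchronous=True)


@app.post("/upload")
async def upload(file: UploadFile = File(...)) -> dict:
    data = await file.read()
    future = client.submit(process_file, data)  # returns immediately
    result = await future                       # awaited on the event loop
    return {"filename": file.filename, "bytes_processed": result}
```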
Thanks for the response, @AmineDiro.
Also, one more problem: let's say I submit a request to upload 500 files and you submit 5 files right after me. You would then have to wait until my 500 uploads are finished, and that can't happen. Do you think scaling can solve this problem?
OK, I see. Queues are used to offload processing-heavy workloads to the background and return a pending result to the user. In my opinion, uploading files doesn't fit this pattern: you should block the user until the whole file is uploaded. Instead, horizontally scale your upload API and use multi-part upload with async capability. Hope this is useful 😸 !
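A small sketch of the multi-part upload idea, assuming the files end up in S3-compatible object storage; the bucket name, key, and size/concurrency numbers below are placeholders, not part of the original suggestion:

```python
# Illustrative only: multi-part upload via boto3's transfer manager, assuming
# S3-compatible storage. Bucket/key names and sizes are placeholders.
import asyncio

import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

# Files above multipart_threshold are split into parts uploaded concurrently.
config = TransferConfig(
    multipart_threshold=5 * 1024 * 1024,  # switch to multipart at ~5 MB
    multipart_chunksize=5 * 1024 * 1024,  # 5 MB parts (S3's minimum part size)
    max_concurrency=8,                    # parallel part uploads per file
)


async def upload_file(path: str, bucket: str, key: str) -> None:
    # Run the blocking transfer in a worker thread so an async request
    # handler stays responsive while the parts are being uploaded.
    await asyncio.to_thread(s3.upload_file, path, bucket, key, Config=config)


if __name__ == "__main__":
    asyncio.run(upload_file("example.pdf", "my-upload-bucket", "uploads/example.pdf"))
```

Horizontal scaling then just means running more instances of this upload API behind a load balancer.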