mtt-artis opened this issue 1 year ago
Thank you for bringing this up and describing the problem in great detail. While I understand the problem, I think such a concurrency limit should rather be implemented by the user. As you mentioned, one tus upload has a maximum of 1 concurrent request (unless you are using the `parallelUploads` option). Therefore, one can control the number of concurrent requests by managing how many uploads are started in parallel. Such a limit can easily be implemented using a queue or semaphore.

The advantage of implementing a request limit outside of tus-js-client is that the user receives more feedback about the upload's state. For example, when an upload is not started because the concurrency limit has been reached, the UI can be updated to reflect this fact. If we implemented a limit inside tus-js-client, the user would just see an upload that is not progressing.
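Such a queue could be sketched roughly as follows. Note that `UploadQueue` and its `run` method are illustrative names of my own, not part of tus-js-client:

```javascript
// Minimal upload queue: at most `limit` tasks run concurrently.
// UploadQueue and run are illustrative names, not tus-js-client API.
class UploadQueue {
  constructor(limit) {
    this.limit = limit;
    this.active = 0;
    this.pending = []; // functions that start a waiting task
  }

  // `task` is a function returning a Promise (e.g. one tus upload).
  // The returned Promise settles with the task's result once it ran.
  run(task) {
    return new Promise((resolve, reject) => {
      const start = () => {
        this.active++;
        task().then(resolve, reject).finally(() => {
          this.active--;
          const next = this.pending.shift();
          if (next) next();
        });
      };
      if (this.active < this.limit) start();
      else this.pending.push(start); // queued; the UI can reflect this state
    });
  }
}
```

One would then wrap each tus upload in a function that resolves from `onSuccess` and rejects from `onError`, and pass that function to `queue.run(...)`.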
Furthermore, if you actually want to limit the number of requests (and not the number of uploads), you can implement this right now using the `onBeforeRequest` option. It can return a Promise, and by resolving that Promise only when the concurrency limit allows another request, you can implement the desired functionality.
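One possible sketch uses a small promise-based gate. `onBeforeRequest` and `onAfterResponse` are real tus-js-client options, while `acquire`, `release`, and the limit of 4 are assumptions of mine:

```javascript
// Promise-based request gate: at most MAX_REQUESTS HTTP requests in flight.
// acquire()/release() and MAX_REQUESTS are illustrative, not tus-js-client API.
const MAX_REQUESTS = 4;
let inFlight = 0;
const waiters = []; // resolve callbacks of requests waiting for a slot

function acquire() {
  if (inFlight < MAX_REQUESTS) {
    inFlight++;
    return Promise.resolve();
  }
  // No free slot: the returned Promise resolves once release() hands one over.
  return new Promise((resolve) => waiters.push(resolve));
}

function release() {
  const next = waiters.shift();
  if (next) next(); // pass the slot directly to the next waiting request
  else inFlight--;
}
```

Each `tus.Upload` would then be configured with `onBeforeRequest: () => acquire()` and `onAfterResponse: () => release()`, so a request only starts once the gate allows it.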
All in all, I think that there are better alternatives than having this functionality in tus-js-client.
Hi @Acconut, thanks for the reply.
> As you mentioned, one tus upload has a maximum of 1 concurrent request (unless you are using the parallelUploads option).
I think I was fooled by the dev tools where I can see 4 pending requests at once for one upload.
```javascript
const upload = new Upload(file, {
  endpoint: "/api/files",
  chunkSize: 1_000_000,
  retryDelays: [0, 1000, 3000, 5000],
  metadata: {
    filename: file.name,
    filetype: file.type,
    lastModified: file.lastModified.toString(),
  },
});

upload.start();
```
> It can also return a Promise and by resolving that Promise whenever the concurrency limit allows another request.
Do you have a repository with such functionality to show me?
> I think I was fooled by the dev tools where I can see 4 pending requests at once for one upload.
If you call `upload.start()` once, then there should only ever be one PATCH request. Any other behavior would indicate a bug. Can you reproduce the multiple PATCH requests? Do you call `start` multiple times?
> do you have a repository with such functionality to show me?
No, unfortunately I do not have an example showcasing this.
**Is your feature request related to a problem? Please describe.**
In various use cases, I have encountered a challenge with multiple concurrent uploads initiated by the tus-js-client library. This has resulted in suboptimal network resource management, leading to inefficient bandwidth allocation and adversely affecting the performance of other critical network operations.

**Describe the solution you'd like**
I would like to propose a global maximum-concurrent-request feature within the tus-js-client library. It would allow setting an upper limit on the number of concurrent uploads permitted across all instances of `tus.Upload` within a single application. This would ensure that the library operates within my desired bandwidth constraints, preventing network congestion and enhancing overall network performance.

**Describe alternatives you've considered**
An alternative approach is to upload files one at a time, initiating the next upload from the onError and onSuccess hooks. This ensures controlled, sequential uploads, preventing network congestion and providing predictable resource management.
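This sequential alternative could be sketched as below. Here `startUpload` is an assumed caller-supplied wrapper that starts one tus upload and settles its promise from the onSuccess/onError hooks:

```javascript
// Uploads files strictly one at a time: the next upload starts only after
// the previous one succeeds or fails. `startUpload` is an assumed wrapper
// around tus.Upload (resolve in onSuccess, reject in onError).
async function uploadSequentially(files, startUpload) {
  const results = [];
  for (const file of files) {
    try {
      results.push({ file, ok: true, value: await startUpload(file) });
    } catch (error) {
      results.push({ file, ok: false, error }); // keep going after a failure
    }
  }
  return results;
}
```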
**Additional context**
Applications must guarantee consistent, shared network quality of service. By preventing network congestion, this feature would ensure an equitable distribution of resources, safeguarding the performance of both uploads and other vital network operations. I would greatly appreciate its inclusion in the tus-js-client library.