Samsagax opened this issue 3 years ago
I was able to reproduce this but I'm confused why this is happening.
At first I thought it might reset the limit between chunks but I don't see anything in the code that would reset it there.
So I tried setting CURLOPT_MAX_RECV_SPEED_LARGE just before calling curl_easy_perform in Downloader::processGalaxyDownloadQueue, but the limit is still ignored.
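For reference, this is roughly what I tried, reduced to a standalone sketch (the URL and the 500 kB/s value are placeholders, not lgogdownloader's actual code):

```cpp
#include <curl/curl.h>

// Re-apply the rate limit right before the perform, in case something
// cleared it between chunks.
int main() {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *handle = curl_easy_init();
    curl_easy_setopt(handle, CURLOPT_URL, "https://example.com/chunk");
    curl_off_t max_speed = 500 * 1024; // bytes per second
    curl_easy_setopt(handle, CURLOPT_MAX_RECV_SPEED_LARGE, max_speed);
    CURLcode res = curl_easy_perform(handle);
    curl_easy_cleanup(handle);
    curl_global_cleanup();
    return res == CURLE_OK ? 0 : 1;
}
```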
Seems like curl had a similar problem here when reusing a handle. Maybe you should look into it, or maybe it's the way you are using it. The use case there reuses the handle in series, not in parallel. I imagine you are using the same handle for all threads; maybe you should use one handle per thread instead, and reuse it for each chunk inside each thread. I've just built the latest version of libcurl, which has the proposed fix, but the same issue is still present in lgogdownloader.
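This is the pattern I mean, as a minimal sketch (function names, URLs, and the rate value are my own, not lgogdownloader's structure): each worker thread owns its easy handle, sets the limit once, and reuses the handle in series for its chunks.

```cpp
#include <curl/curl.h>
#include <string>
#include <thread>
#include <vector>

// One easy handle per worker: the handle is private to the thread and
// reused in series for every chunk that thread downloads.
static void download_worker(const std::vector<std::string> &chunk_urls,
                            curl_off_t max_speed) {
    CURL *handle = curl_easy_init();
    curl_easy_setopt(handle, CURLOPT_MAX_RECV_SPEED_LARGE, max_speed);
    for (const std::string &url : chunk_urls) {
        curl_easy_setopt(handle, CURLOPT_URL, url.c_str());
        curl_easy_perform(handle); // reused, never shared across threads
    }
    curl_easy_cleanup(handle);
}

int main() {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    std::vector<std::string> a{"https://example.com/chunk1"};
    std::vector<std::string> b{"https://example.com/chunk2"};
    std::thread t1(download_worker, a, curl_off_t(500 * 1024));
    std::thread t2(download_worker, b, curl_off_t(500 * 1024));
    t1.join();
    t2.join();
    curl_global_cleanup();
    return 0;
}
```

Note that with a per-handle limit the total bandwidth is the limit times the number of threads, so the per-thread value would need to be divided accordingly.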
It's weird. After a few more tests and looking at the code, nothing seems wrong. Each file chunk is handled as if it were an individual file (I verified this by printing the URL right after it is read from the JSON file), so the handle is recreated for each thread and reused inside it each time a chunk is downloaded.
@Sude- maybe you should check this use case with the curl developers to see if it is an upstream bug. I will try to investigate further.
I'm using a command like this one to download a large game:
And I noticed the download respects that limit for the first set of threads:
Then, when the first thread is done (a file chunk is finished), the next one won't respect the limit, and every time a new thread is spawned the limit is unset and the download goes to maximum speed, using all available bandwidth.
Seems that the curl handle is not set again to use the CURLOPT_MAX_RECV_SPEED_LARGE parameter, so the default (unlimited) value is used.
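To illustrate the suspected failure mode (this is an assumption, not something I've confirmed in the code): curl_easy_reset() puts every option back to its default, and the default for CURLOPT_MAX_RECV_SPEED_LARGE is 0, i.e. unlimited, so a reused handle silently loses the limit unless it is set again after the reset. The function name and rate value below are placeholders.

```cpp
#include <curl/curl.h>

// If the handle is reset between chunks, the rate limit must be
// re-applied afterwards; otherwise it falls back to 0 (unlimited).
static void prepare_next_chunk(CURL *handle, const char *url,
                               curl_off_t max_speed) {
    curl_easy_reset(handle);  // wipes the previous limit
    curl_easy_setopt(handle, CURLOPT_URL, url);
    curl_easy_setopt(handle, CURLOPT_MAX_RECV_SPEED_LARGE, max_speed);
}
```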