Sude- / lgogdownloader

LGOGDownloader is an unofficial GOG.com downloader for Linux users. It uses the same API as the official GOG Galaxy.
https://sites.google.com/site/gogdownloader/
Do What The F*ck You Want To Public License

`--limit-rate` only works for the first set of threads #195

Open Samsagax opened 3 years ago

Samsagax commented 3 years ago

I'm using a command like this one to download a heavy game:

$ lgogdownloader --galaxy-install cyberpunk_2077 --threads 6 --limit-rate 200

I noticed that the download respects that limit only for the first set of threads:

#0 ./1423049311/archive/pc/content/audio_1_general.archive (chunk 1/18)
 12% ▕███▏                      ▏ 1.16/9.38MB @ 200.07kB/s ETA: 42s
#1 ./1423049311/archive/pc/content/audio_2_soundbanks.archive (chunk 1/495)
 13% ▕███▎                      ▏ 1.27/9.90MB @ 200.03kB/s ETA: 44s
#2 ./1423049311/archive/pc/content/basegame_1_engine.archive (chunk 1/19)
 12% ▕███▏                      ▏ 1.17/9.67MB @ 200.00kB/s ETA: 43s
#3 ./1423049311/archive/pc/content/basegame_3_nightcity_terrain.archive (chunk 1/309)
  6% ▕█▍                        ▏ 0.56/9.94MB @ 200.14kB/s ETA: 47s
#4 ./1423049311/archive/pc/content/basegame_3_nightcity.archive (chunk 1/771)
 13% ▕███▎                      ▏ 1.28/9.97MB @ 200.00kB/s ETA: 44s
#5 ./1423049311/archive/pc/content/basegame_3_nightcity_gi.archive (chunk 1/1076)
 12% ▕███                       ▏ 1.17/10.00MB @ 200.07kB/s ETA: 45s
Total: 1.17MB/s | Remaining: 109 (31.89GB) ETA: 7h 48m 39s

Then, when the first thread is done (a file chunk is finished), the next one doesn't respect the limit; every time a new thread is spawned the limit is unset and the download goes to maximum speed, using all available bandwidth.

#0 ./1423049311/archive/pc/content/audio_1_general.archive (chunk 3/18)
  2% ▕▍                         ▏ 0.17/9.44MB @ 176.18kB/s ETA: 53s
#1 ./1423049311/archive/pc/content/audio_2_soundbanks.archive (chunk 2/495)
 61% ▕███████████████▉          ▏ 6.09/9.98MB @ 2.47MB/s ETA: 1s
#2 ./1423049311/archive/pc/content/basegame_1_engine.archive (chunk 3/19)
  0% ▕                          ▏ 0.00/0.00MB @ 0.00kB/s ETA: 0s
#3 ./1423049311/archive/pc/content/basegame_3_nightcity_terrain.archive (chunk 1/309)
 98% ▕█████████████████████████▍▏ 9.72/9.94MB @ 198.00kB/s ETA: 1s
#4 ./1423049311/archive/pc/content/basegame_3_nightcity.archive (chunk 2/771)
 16% ▕████▎                     ▏ 1.62/9.89MB @ 1001.81kB/s ETA: 8s
#5 ./1423049311/archive/pc/content/basegame_3_nightcity_gi.archive (chunk 2/1076)
  3% ▕▋                         ▏ 0.27/10.00MB @ 272.00kB/s ETA: 36s
Total: 4.08MB/s | Remaining: 109 (31.89GB) ETA: 2h 15m 10s

It seems the curl handle is not set again with CURLOPT_MAX_RECV_SPEED_LARGE, so the default (unlimited) value is used.
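For reference, this is roughly how a per-handle receive limit is applied with libcurl (a minimal sketch, not lgogdownloader's actual code; the helper name and the kB/s-to-bytes conversion are my assumptions):

```cpp
#include <curl/curl.h>

// Hypothetical helper: apply a --limit-rate value (assumed to be kB/s)
// to one easy handle. CURLOPT_MAX_RECV_SPEED_LARGE expects a curl_off_t
// in bytes per second; a handle that is reset or recreated without this
// call falls back to the default, which is unlimited.
void apply_rate_limit(CURL *handle, long limit_kbps)
{
    curl_off_t bytes_per_second = static_cast<curl_off_t>(limit_kbps) * 1024;
    curl_easy_setopt(handle, CURLOPT_MAX_RECV_SPEED_LARGE, bytes_per_second);
}
```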

Sude- commented 3 years ago

I was able to reproduce this, but I'm confused about why it is happening. At first I thought the limit might be reset between chunks, but I don't see anything in the code that would reset it there. So I tried setting CURLOPT_MAX_RECV_SPEED_LARGE just before calling curl_easy_perform in Downloader::processGalaxyDownloadQueue, but the limit is still ignored.
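Roughly what I tried looks like this (simplified sketch; the names are illustrative, not the actual code in Downloader::processGalaxyDownloadQueue):

```cpp
#include <curl/curl.h>

// Re-apply the limit on the same handle immediately before every
// perform, i.e. once per chunk. The limit still seems to be ignored.
CURLcode perform_with_limit(CURL *curlhandle, curl_off_t bytes_per_second)
{
    curl_easy_setopt(curlhandle, CURLOPT_MAX_RECV_SPEED_LARGE, bytes_per_second);
    return curl_easy_perform(curlhandle);
}
```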

Samsagax commented 3 years ago

It seems curl had a similar problem here when reusing a handle. Maybe you should look into it, or maybe it is the way you are using it. The use case there reuses the handle in series, though, not in parallel. I imagine you are using the same handle for all threads; maybe you should use one handle per thread instead, and reuse it for each chunk inside that thread. I've just built the latest version of libcurl, which has the proposed fix, but the same issue is still present in lgogdownloader.
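A rough sketch of the handle-per-thread idea (not lgogdownloader code; the chunk handling and names are just stand-ins):

```cpp
#include <curl/curl.h>
#include <string>
#include <vector>

// Illustrative worker: one easy handle per thread, the limit set once,
// and the same handle reused for every chunk that thread downloads.
// A real implementation would also set CURLOPT_WRITEFUNCTION and check
// the return codes.
void download_worker(const std::vector<std::string> &chunk_urls,
                     curl_off_t bytes_per_second)
{
    CURL *handle = curl_easy_init();
    if (!handle)
        return;
    curl_easy_setopt(handle, CURLOPT_MAX_RECV_SPEED_LARGE, bytes_per_second);
    for (const std::string &url : chunk_urls)
    {
        curl_easy_setopt(handle, CURLOPT_URL, url.c_str());
        curl_easy_perform(handle); // handle and its options reused per chunk
    }
    curl_easy_cleanup(handle);
}
```

Each download thread would then run something like download_worker with its own share of the chunk URLs.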

Samsagax commented 3 years ago

It's weird. After a few more tests and a look at the code, nothing seems wrong. Each file chunk is handled as if it were an individual file (verified by printing the URL right after it is read from the JSON file), so the handle is recreated for each thread and reused inside it every time a chunk is downloaded.

@Sude- maybe you should check this use case with the curl developers to see whether it is an upstream bug. I will try to investigate further.
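If it helps with an upstream report, a minimal standalone reproducer along these lines (URL and limit are placeholders) could show whether the limit survives reusing one easy handle across several transfers; a parallel variant with one handle per thread could be built the same way:

```cpp
#include <curl/curl.h>
#include <cstdio>

// Placeholder test URL; substitute any large file on a fast server.
static const char *kUrl = "https://example.com/largefile.bin";

// Discard the downloaded data; only the transfer speed matters here.
static size_t discard(char *, size_t size, size_t nmemb, void *)
{
    return size * nmemb;
}

int main()
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *handle = curl_easy_init();
    if (!handle)
        return 1;
    curl_easy_setopt(handle, CURLOPT_URL, kUrl);
    curl_easy_setopt(handle, CURLOPT_WRITEFUNCTION, discard);
    curl_easy_setopt(handle, CURLOPT_MAX_RECV_SPEED_LARGE, (curl_off_t)200 * 1024);

    // Perform several transfers on the same handle; if the limit only
    // applies to the first one, the reported speed should jump afterwards.
    for (int i = 0; i < 3; ++i)
    {
        curl_easy_perform(handle);
        curl_off_t speed = 0;
        curl_easy_getinfo(handle, CURLINFO_SPEED_DOWNLOAD_T, &speed);
        std::printf("transfer %d: %ld bytes/s\n", i, (long)speed);
    }
    curl_easy_cleanup(handle);
    curl_global_cleanup();
    return 0;
}
```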