Closed ssj4maiko closed 1 year ago
That's the annoying part: for some reason (at least in my downloads) it prioritizes big files first, which also happen to be the most prone to fail. That would actually be a great upgrade.
Each download comes from a different server, and there is only ever one download open per server, so limiting the simultaneous download count would only make things worse. Large files are downloaded first because server download speeds vary a lot and this makes overall completion times more consistent (it's still bad, but no one else has bothered to improve it and I'm busy with other things).
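The per-server, largest-first behavior described above can be sketched roughly like this. The download records and server names are made up for illustration; the actual project's data structures are not shown in this thread:

```python
from collections import defaultdict

# Hypothetical download records: (server, filename, size_in_bytes).
downloads = [
    ("server-a", "big.bin", 900_000_000),
    ("server-a", "tiny.txt", 1_000),
    ("server-b", "medium.bin", 50_000_000),
]

def schedule(downloads):
    """Group downloads by server (one active download per server) and
    order each server's queue largest-first, so slow servers start
    their big files early and completion times stay more consistent."""
    per_server = defaultdict(list)
    for server, name, size in downloads:
        per_server[server].append((name, size))
    for queue in per_server.values():
        queue.sort(key=lambda item: item[1], reverse=True)
    return dict(per_server)

print(schedule(downloads))
# server-a's queue starts with big.bin; server-b only has medium.bin.
```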
I feel a drastic solution would require mutual uploading between clients using torrent-like technology instead of FTP, but it seems no one can keep building a Sagrada Familia like this any longer.
Basically, when we talk about faster download speeds, we should learn something from IDM and BitTorrent. The idea comes down to two approaches:
1. Divide the task into smaller parts and transfer the data with multiple threads at once, making the best use of the transfer channel.
2. Use P2P sharing, i.e. turn every individual user's downloader into a peer-to-peer sharing server (even Microsoft Windows Update uses this method).
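The first approach (splitting a file into parts for parallel transfer) hinges on computing byte ranges that separate threads can request, e.g. via HTTP `Range: bytes=start-end` headers. A minimal sketch of the range computation, independent of any particular download library:

```python
def byte_ranges(total_size, chunk_count):
    """Split total_size bytes into chunk_count contiguous (start, end)
    ranges, suitable for 'Range: bytes=start-end' requests that
    several threads can fetch in parallel and reassemble."""
    base, extra = divmod(total_size, chunk_count)
    ranges, start = [], 0
    for i in range(chunk_count):
        # Spread any remainder over the first `extra` chunks.
        length = base + (1 if i < extra else 0)
        ranges.append((start, start + length - 1))
        start += length
    return ranges

print(byte_ranges(1000, 4))  # -> [(0, 249), (250, 499), (500, 749), (750, 999)]
```

Note this only helps if the server honors range requests; otherwise the whole file comes back for every request.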
Solved in 0.21.0
I'm aware that the current problems of downloads breaking and so on are likely related to the server.
So I thought about these ideas:
An option to change the number of simultaneous downloads, at least from a config file. Although overall download speed would likely drop, we would get something more consistent, and the server should be less loaded, since right now each person has at least 4 items loading there... A free server could be running quite hot like this.
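A configurable concurrency cap like this is commonly done with a bounded worker pool. A minimal sketch, assuming a hypothetical config value named `max_simultaneous_downloads` (the key name and the stand-in tasks are made up for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

def run_downloads(tasks, max_workers):
    """Run download callables with at most max_workers in flight,
    trading peak speed for a steadier, lighter load on the server."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(task) for task in tasks]
        return [f.result() for f in futures]

# Stand-in tasks in place of real downloads; max_workers would come
# from a config entry such as "max_simultaneous_downloads".
results = run_downloads([lambda i=i: f"file-{i}" for i in range(4)],
                        max_workers=2)
print(results)  # -> ['file-0', 'file-1', 'file-2', 'file-3']
```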
And finally, sorting the downloads by size, downloading the smaller files first and the larger files later. This way we can reduce errors and retries overall.
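The smallest-first ordering proposed here is a one-line sort over the pending queue. A sketch with a made-up record shape (the real project's download objects are not shown in this thread):

```python
def order_small_first(downloads):
    """Sort pending downloads smallest-first so quick files finish
    early and a failure mid-queue wastes less transferred data."""
    return sorted(downloads, key=lambda d: d["size"])

queue = [{"name": "big.iso", "size": 700_000_000},
         {"name": "readme.txt", "size": 1_000}]
print(order_small_first(queue))  # readme.txt comes first
```

Note this is the opposite of the large-first ordering the maintainer describes; which one wins depends on whether consistent completion time or fewer wasted retries matters more.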