error39 opened this issue 8 years ago (status: Open)
On behalf of the current youtube-dl maintainers, thanks for your kind words. As for the feature request, please wait for their reply.
Best regards.
I would like to look into this myself, since I guess the maintainers only have a limited amount of time to spend on feature requests like this. Can anyone give me a head start by pointing me to the files/functions that are responsible for handling the actual download of the video or audio file? One more question: is there a guide or documentation on how to run the Python code from source on a Linux system?
Thanks!
edit: Found the developer notes myself: https://github.com/rg3/youtube-dl/#developer-instructions
A good suggestion. For most cases (including SoundCloud), downloading happens in downloader/http.py.
Note that until this gets implemented in the HTTP downloader, you may want to try aria2 with --external-downloader aria2c; it uses multiple connections, which could help in your case.
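For reference, an invocation might look something like this (the aria2c options shown are only an example, not recommended values):

```bash
# hand the download off to aria2c and let it open several connections
youtube-dl --external-downloader aria2c \
           --external-downloader-args '-x 8 -s 8 -k 1M' \
           "$URL"   # $URL is the page URL you want to download
```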
@yan12125 Thanks for your help. I will try to see if/how it's possible to implement this kind of functionality in http.py (I'm a Python n00b, but it's always fun to learn :) ).
@jaimeMF It speeds up my download, but it opens multiple connections at the same time, so it's not exactly what I'm after (it increases the server load, which is one reason download accelerators are not so popular among server admins, and chances are they'll try to block those methods if possible). But nevertheless, thanks a lot for the valuable tip! It might help others looking for a way to make downloads faster.
Something like this would be great! Perhaps an option so that if the speed drops below a certain rate, the download is restarted from the current point. This happens for me on YouTube: I download on a 400 Mbps connection and at times get 3 MB/s downloads for some videos after the first couple hundred megabytes.
Hi. It seems to me that YouTube just started doing this! I noticed it today; it wasn't there two days ago.
The first burst is at full speed (i.e. 9.25 MiB/s, not Mbps), then there are 2 bursts of 1.04 Mbit/s, each lasting 1 sec, occurring every 2.5 sec (equivalent to an average of roughly 400 Kbps). EDIT: according to nload.
EDIT2: the first burst gets 0.5 MiB of the file (or 1.1% of a 45.78 MiB file). I'm assuming this is true for any file, although this particular one was only the audio part (a .webm) of a video.
Before today, it used to download at full speed constantly (i.e. 100 Mbps).
EDIT3: it looks like if I restart it 5 times (i.e. start youtube-dl, press C-c after about 7 sec, just long enough for the first half-megabyte to download at full speed, repeat 5 times), it unlocks full speed on the 6th try. (The same file above went from 75% to 100% in 1 sec after this happened.)
EDIT4: I've just retested EDIT3 on 3 new videos (only the audio part, because that's what I needed) and only one restart was needed: the first try stalled after the first burst, I pressed C-c, reran youtube-dl, and it downloaded the remainder at full speed.
EDIT5: I just figured out that --socket-timeout 1 works flawlessly, especially since I've encountered a video* which wouldn't unlock full speed even after some 20 restarts! Thank you for this option, btw! Amazing!
* I say video, but I'm only downloading the audio part.
EDIT6: You also need --retries infinite, or it will give up after the default of 10 retries.
However, there is a jump from 60.8% to 100%, even though the output seems to be the correct size:
-rw-r--r-- 1 xftroxgpx xftroxgpx 28768063 08.01.2018 11:18 'VERY STABLE GENIUS-u-yLGFuu2dc.opus'
I'll take that to mean that after 34 retries it unlocked full speed.
Note that downloading the video itself is not as bad: it seems to go at full speed for the first few seconds, then gets limited to 20 Mbit/s. It's the audio which gets stalled badly.
Partial workaround for automatically bypassing stalls while downloading the audio part from YouTube: just add the arguments --socket-timeout 1 --retries infinite to youtube-dl (as per my EDIT5 and EDIT6 above).
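For example, a full invocation might look like this (the format selection and the $URL variable are just placeholders):

```bash
# EDIT5 + EDIT6 combined into one call; -f bestaudio because only the audio
# part is wanted here (adjust the format and URL to taste)
youtube-dl --socket-timeout 1 --retries infinite -f bestaudio "$URL"
```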
EDIT2: a bash workaround for edit1, which keeps re-running the download until it exits successfully: $ (exit 1) || while test "$?" -ne "0"; do downaudio https://www.youtube.com/watch?v=rPEU2KJwY6Q; done
EDIT4: alright, I've decided to stop using youtube altogether, replacing it with reading books.
This is not a workaround, just my own findings.
I downloaded NewPipe from F-Droid and the standalone Snaptube APK to test this slow-download issue on m4a and 1080p DASH video.
Snaptube's speed is like youtube-dl's: slow.
But NewPipe's download speed is superb. Not sure why; I can't really read the code, being a newbie in programming.
It seems they are systematically throttling all DASH streams; I doubt there can be a client-side solution.
Just something worth mentioning: if you load a video or audio stream directly into an HTML5 player, the download is instant. I am considering quickly creating a PHP page / JS form to handle the download and the conversion to MP3. I believe it would be faster than some of the other options I was left with.
Even for DASH 1080p+?
First I thought my server provider was throttling my bandwidth, but they told me: "No, we are not." Then I ran multiple speed tests from the command line and saw that my server's download speed is excellent. So I started googling about it. As far as I can tell, youtube.com may be throttling our requests and slowing them down. I download webm/m4a formats and the download speed is 80 to 250 Kbps right now, whereas it should be 20 to 40 Mbps. Using aria2c is not giving much of a boost.
Now the question is: how can we fix it? Please suggest any ideas!
As far as I understand it (don't take my word for it), this is a server-side problem. YouTube seems to be throttling everything but the "best" stream. As far as I know, there is not much we can do on the client side, and I hope I'm wrong.
Downloading from YouTube is now very annoying, but I have an idea for how to handle this. How about youtube-dl supporting a list of VPN servers, defined by yourself in a config file, to use for the download?
A VPN won't solve a thing; this is a server-side issue.
Sure, it is a server problem. The idea is that youtube-dl would fool the server: the file would appear to be downloaded by different clients, depending on the number of VPN servers you defined in the config list.
I'm seeing the same throttling with audio-only downloads (-f worstaudio) in the last week. Downloads that previously would take seconds now take upwards of several minutes, regardless of which Internet connection I use. As an earlier commenter suggested, using --external-downloader aria2c appears to work around this.
Hello Ricardo,
I wondered if there's a way of implementing an option to circumvent a specific kind of bandwidth throttling.
Let me explain: I've noticed that quite a few providers, for example SoundCloud and Mixcloud, will serve the first few megabytes of a stream at maximum speed, but after that the speed drops to just the rate needed to stream the media at its bitrate. This means it takes a long time to download certain media with youtube-dl. I used to circumvent this by stopping and resuming the download, for example 20 times for a 100 MB file, in case they serve the first 5 MB unthrottled.
Technically this works very well, but it's not very practical. So I created a little helper script in which I use youtube-dl to generate the download URL and let curl download the actual file, fetching the first XX KB block, then the second XX KB block, and so on, using the Range header's start and end values in a while loop.
This works out great for me, but I wonder how difficult it would be to implement this in the downloader classes of youtube-dl. An optional switch to activate this 'chunk downloading', or even the ability for contributors to enable it in the extractors when a specific service is known for using this 'first part fast' throttling, would really benefit the download speed for a lot of users, in my humble opinion :)
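To give an idea, here is a minimal sketch of that approach (not my actual script; the 5 MiB chunk size, the output file name and the exact curl options are only illustrative):

```bash
#!/bin/bash
# Fetch the direct media URL with youtube-dl, then pull the file in fixed-size
# ranges with curl. Each ranged request opens a fresh connection, so the
# unthrottled "first few megabytes" behaviour applies to every chunk.
page_url="$1"
out="output.media"
chunk=$((5 * 1024 * 1024))          # 5 MiB per request (arbitrary)

# -g prints the direct download URL; pick one format so only one URL is printed
url="$(youtube-dl -g -f best "$page_url")"

# total size, taken from the Content-Length response header
size="$(curl -sI "$url" | awk 'tolower($1) == "content-length:" { print $2 }' | tr -d '\r')"

: > "$out"                          # start with an empty output file
start=0
while [ "$start" -lt "$size" ]; do
    end=$((start + chunk - 1))
    if [ "$end" -ge "$size" ]; then end=$((size - 1)); fi
    curl -s -r "${start}-${end}" "$url" >> "$out"
    start=$((end + 1))
done
```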
So, my questions are: do you think it's a good idea to implement this kind of functionality, and how difficult would it be to do so? If you consider implementing this in the future, I'd like to help in any way I can to make it possible.
Thanks for creating this great 'little' tool; I use it regularly and am very happy it exists!
Regards,
Erik