ytdl-org / youtube-dl

Command-line program to download videos from YouTube.com and other video sites
http://ytdl-org.github.io/youtube-dl/
The Unlicense

[YouTube] Randomly slow youtube download speed #29326

Closed triplesixman closed 2 years ago

triplesixman commented 3 years ago

Checklist

Verbose log

root@server:~# youtube-dl https://youtu.be/8PecfdkEM2Y --source-address 64.31.22.34 --verbose
[debug] System config: []
[debug] User config: []
[debug] Custom config: []
[debug] Command-line args: [u'https://youtu.be/8PecfdkEM2Y', u'--source-address', u'64.31.22.34', u'--verbose']
WARNING: Assuming --restrict-filenames since file system encoding cannot encode all characters. Set the LC_ALL environment variable to fix this.
[debug] Encodings: locale ANSI_X3.4-1968, fs ANSI_X3.4-1968, out ANSI_X3.4-1968, pref ANSI_X3.4-1968
[debug] youtube-dl version 2021.06.06
[debug] Python version 2.7.13 (CPython) - Linux-4.9.0-15-amd64-x86_64-with-debian-9.13
[debug] exe versions: ffmpeg 4.1.2, ffprobe 4.1.2, phantomjs 2.1.1
[debug] Proxy map: {}
[youtube] 8PecfdkEM2Y: Downloading webpage
[debug] Default format spec: bestvideo+bestaudio/best
[debug] Invoking downloader on u'https://r2---sn-ab5l6n67.googlevideo.com/videoplayback?expire=1623953531&ei=GzzLYMyiKojn8wSQ-42YBA&ip=64.31.22.34&id=o-AMDSXCc14P0ndQaRCkihVxb1SwdClDAhUbq8xP8nD2ss&itag=313&aitags=133%2C134%2C135%2C136%2C137%2C160%2C242%2C243%2C244%2C247%2C248%2C271%2C278%2C313%2C394%2C395%2C396%2C397%2C398%2C399%2C400%2C401&source=youtube&requiressl=yes&mh=e_&mm=31%2C26&mn=sn-ab5l6n67%2Csn-vgqsrnee&ms=au%2Conr&mv=m&mvi=2&pl=24&initcwndbps=14958750&vprv=1&mime=video%2Fwebm&ns=gA0RPx_S5d5eK1Qi3zH6xdcF&gir=yes&clen=202533160&dur=299.160&lmt=1617981974189982&mt=1623931501&fvip=2&keepalive=yes&fexp=24001373%2C24007246&c=WEB&txp=5532432&n=0B2xjorVelV0Xu6Hq&sparams=expire%2Cei%2Cip%2Cid%2Caitags%2Csource%2Crequiressl%2Cvprv%2Cmime%2Cns%2Cgir%2Cclen%2Cdur%2Clmt&lsparams=mh%2Cmm%2Cmn%2Cms%2Cmv%2Cmvi%2Cpl%2Cinitcwndbps&lsig=AG3C_xAwRQIhAI7W7dT0pYaOQgxn1mHYX3js6NByrqiykD9fsPJs3kAXAiApHiHXVdDMO1k6OKyg2sAb1PMyMO1jfgtZV5R-7frcpw%3D%3D&sig=AOq0QJ8wRgIhAPdD-seXFXT-yOEqoIqCQfPnRqMLASvU8SbymG5TPpNhAiEA1pVFpS3hDYqTVe1ia5sDOi9RaPf3BCuT94XB-vICq_E='
[download] Destination: Hatik_-_daron_des_demain_session_acoustique-8PecfdkEM2Y.f313.webm
[download]  18.4% of 193.15MiB at 77.46KiB/s ETA 34:44Terminated

While the download was in progress, I ran the exact same command in another terminal, in another folder, and the download completed in a few seconds:

root@server:~/test# youtube-dl https://youtu.be/8PecfdkEM2Y --source-address 64.31.22.34 --verbose
[debug] System config: []
[debug] User config: []
[debug] Custom config: []
[debug] Command-line args: [u'https://youtu.be/8PecfdkEM2Y', u'--source-address', u'64.31.22.34', u'--verbose']
WARNING: Assuming --restrict-filenames since file system encoding cannot encode all characters. Set the LC_ALL environment variable to fix this.
[debug] Encodings: locale ANSI_X3.4-1968, fs ANSI_X3.4-1968, out ANSI_X3.4-1968, pref ANSI_X3.4-1968
[debug] youtube-dl version 2021.06.06
[debug] Python version 2.7.13 (CPython) - Linux-4.9.0-15-amd64-x86_64-with-debian-9.13
[debug] exe versions: ffmpeg 4.1.2, ffprobe 4.1.2, phantomjs 2.1.1
[debug] Proxy map: {}
[youtube] 8PecfdkEM2Y: Downloading webpage
[debug] Default format spec: bestvideo+bestaudio/best
[debug] Invoking downloader on u'https://r2---sn-ab5l6n67.googlevideo.com/videoplayback?expire=1623953574&ei=RjzLYLyWBIHGhwak2IzwBg&ip=64.31.22.34&id=o-AFgzn9Zdn1KSGbuO09ZkpRma9GzWqWUApkavXXN93_F6&itag=313&aitags=133%2C134%2C135%2C136%2C137%2C160%2C242%2C243%2C244%2C247%2C248%2C271%2C278%2C313%2C394%2C395%2C396%2C397%2C398%2C399%2C400%2C401&source=youtube&requiressl=yes&mh=e_&mm=31%2C29&mn=sn-ab5l6n67%2Csn-ab5szne7&ms=au%2Crdu&mv=m&mvi=2&pl=24&initcwndbps=14958750&vprv=1&mime=video%2Fwebm&ns=hoSqVT_3ust7ILej5iYoT40F&gir=yes&clen=202533160&dur=299.160&lmt=1617981974189982&mt=1623931501&fvip=2&keepalive=yes&fexp=24001373%2C24007246&c=WEB&txp=5532432&n=l6PFqwM4uREk1JKwP&sparams=expire%2Cei%2Cip%2Cid%2Caitags%2Csource%2Crequiressl%2Cvprv%2Cmime%2Cns%2Cgir%2Cclen%2Cdur%2Clmt&lsparams=mh%2Cmm%2Cmn%2Cms%2Cmv%2Cmvi%2Cpl%2Cinitcwndbps&lsig=AG3C_xAwRQIhAJU5426qtqf6BwLiB48OKkcK_ATe_S9jDPYAVbttM7T1AiBoVGwb1ZBagaiUyKeVGLv562cloZeh5xBT2lFZx61gyQ%3D%3D&sig=AOq0QJ8wRAIgWSAyj1JyqoTHFWMdJ04gjcIDJ8tFryw5sNsf5soVaPQCIE81_29FoDoAOeRv2_hdcnBi2-4XxvlGvbX9YNajLi6y'
[download] Destination: Hatik_-_daron_des_demain_session_acoustique-8PecfdkEM2Y.f313.webm
[download] 100% of 193.15MiB in 00:04
[debug] Invoking downloader on u'https://r2---sn-ab5l6n67.googlevideo.com/videoplayback?expire=1623953574&ei=RjzLYLyWBIHGhwak2IzwBg&ip=64.31.22.34&id=o-AFgzn9Zdn1KSGbuO09ZkpRma9GzWqWUApkavXXN93_F6&itag=251&source=youtube&requiressl=yes&mh=e_&mm=31%2C29&mn=sn-ab5l6n67%2Csn-ab5szne7&ms=au%2Crdu&mv=m&mvi=2&pl=24&initcwndbps=14958750&vprv=1&mime=audio%2Fwebm&ns=hoSqVT_3ust7ILej5iYoT40F&gir=yes&clen=5324207&dur=299.201&lmt=1617980855722369&mt=1623931501&fvip=2&keepalive=yes&fexp=24001373%2C24007246&c=WEB&txp=5531432&n=l6PFqwM4uREk1JKwP&sparams=expire%2Cei%2Cip%2Cid%2Citag%2Csource%2Crequiressl%2Cvprv%2Cmime%2Cns%2Cgir%2Cclen%2Cdur%2Clmt&lsparams=mh%2Cmm%2Cmn%2Cms%2Cmv%2Cmvi%2Cpl%2Cinitcwndbps&lsig=AG3C_xAwRgIhAIqCBWf7PmHH8y1wnG8QvB-0vxKzRG26qCAIWdgOAT1PAiEAwTSl4J0e9L7emiYUDKV_YjfApo2gchge3iVfrYH76lo%3D&sig=AOq0QJ8wRQIgODhcEL0uT0u1nXP41IARrB63CfmpDmUzl6HhPwrXOTwCIQD-OUE152N7yzYXcgU_tAPCP0YdRdfVyFlHE4kIYyyoew=='
[download] Destination: Hatik_-_daron_des_demain_session_acoustique-8PecfdkEM2Y.f251.webm
[download] 100% of 5.08MiB in 00:00
[ffmpeg] Merging formats into "Hatik_-_daron_des_demain_session_acoustique-8PecfdkEM2Y.webm"
[debug] ffmpeg command line: ffmpeg -y -loglevel 'repeat+info' -i 'file:Hatik_-_daron_des_demain_session_acoustique-8PecfdkEM2Y.f313.webm' -i 'file:Hatik_-_daron_des_demain_session_acoustique-8PecfdkEM2Y.f251.webm' -c copy -map '0:v:0' -map '1:a:0' 'file:Hatik_-_daron_des_demain_session_acoustique-8PecfdkEM2Y.temp.webm'
Deleting original file Hatik_-_daron_des_demain_session_acoustique-8PecfdkEM2Y.f313.webm (pass -k to keep)
Deleting original file Hatik_-_daron_des_demain_session_acoustique-8PecfdkEM2Y.f251.webm (pass -k to keep)

Description

Hello,

Important: The problem is random, with maybe a 1 in 8 chance of reproducing it. You have to download several videos in a row (about 10) to notice it.

For a few weeks now, a YouTube download can randomly be slowed down to 48 KiB/s, so a short 5-minute video takes 5-10 minutes to download instead of 4-5 seconds; often the download does not succeed at all and stops after a few minutes.

This happens on several servers and with several internet providers, as well as on my private connection.

I even managed to launch a second download in parallel while the first one was taking a long time: the second one finished in 1 second, while the first one took 5 minutes and was interrupted before the end. Same video, same connection, same command. (Tested only with IPv4, because I don't have IPv6 on my servers or on my internet connection.)

Attached is an extract of the results obtained with --dump-pages: ko.txt ok.txt

inkuxuan commented 3 years ago

I'm having exactly the same issue.

shimiaoyerin commented 3 years ago

I've noticed this too. I've always used it together with aria2, but I have the same problem as you. I hope more people can provide helpful information.

joro1 commented 3 years ago

Same issue, i.e. also noticing downloads are much slower than "normal", even short videos taking extremely long times.

JJ840 commented 3 years ago

Same issue; restarting the download keeps it working fine for a while, but then it happens again. I would think throttling, perhaps, but I have Starlink and they sure as shit don't throttle. Perhaps it's something on YouTube's end?

edit: Curiously, I don't think it happens after switching on a VPN to test it...

neural-nut commented 3 years ago

same here, this is driving me out of my mind

srett commented 3 years ago

Same issue here. When this happens, the speed is around 77 KB/s, as in the OP's case.

From the 2018 ticket #15271 I found that --http-chunk-size 10M could mitigate the issue, but it doesn't help in this case; either no chunking happens, or the chunking doesn't fix it. No idea how to verify that it actually does what it's supposed to do.
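
For reference, that mitigation is just an extra flag on the usual invocation (the URL below is the example video from this report):

youtube-dl --http-chunk-size 10M https://youtu.be/8PecfdkEM2Y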

Lesmiscore commented 3 years ago

I sometimes face the same issue.

I think either YouTube or your internet provider is throttling the connection between YouTube and youtube-dl. In addition, YouTube may transcode videos while serving them; I guess this is the likely cause if no one is doing any throttling. (See https://blog.youtube/inside-youtube/new-era-video-infrastructure )

In either case, there is nothing youtube-dl can do to fix it.

triplesixman commented 3 years ago

I sometimes face the same issue.

I think ~either~ YouTube ~or your internet provider~ is throttling the connection between YouTube and youtube-dl. In addition, YouTube may transcode videos while serving them; I guess this is the likely cause if no one is doing any throttling. (See https://blog.youtube/inside-youtube/new-era-video-infrastructure )

In either case, there is nothing youtube-dl can do to fix it.

Are you saying that youtube-dl will become obsolete for youtube.com once this update spreads to all videos?

Honestly, I have my doubts. I think it's more of a bug, or maybe related to this update, but I imagine (and hope) that the youtube-dl contributors will be able to find a solution, as they have over the project's 15 years of existence.

srett commented 3 years ago

@nao20010128nao What I find suspicious is that the download speed, for me, is always around 77 KB/s when this happens. That is way too slow even for playback, and it happens independently of the format; f137 takes forever like this. If this were purely a problem on YouTube's side, I'd expect to hit it in the browser too at times, but I never experience buffering. My hunch would be some change on YouTube's side, either accidental or nefarious, that leads to this problem.

demget commented 3 years ago

A temporary, imprecise workaround that can help when dealing with automated downloads:

from youtube_dl.utils import DownloadError  # ydl below is an existing YoutubeDL instance

def speed_check(s):
    speed = s.get('speed')
    ready = s.get('downloaded_bytes', 0)
    total = s.get('total_bytes', 0)

    if speed and speed <= 77 * 1024 and ready >= total * 0.1:
        # if the speed is less than 77 KB/s and at least
        # one tenth of the video has been downloaded
        raise DownloadError('Abnormal downloading speed drop.')

ydl.add_progress_hook(speed_check)

nikooo777 commented 3 years ago

A temporary, imprecise workaround that can help when dealing with automated downloads:

def speed_check(s):
    speed = s.get('speed')
    ready = s.get('downloaded_bytes', 0)
    total = s.get('total_bytes', 0)

    if speed and speed <= 77 * 1024 and ready >= total * 0.1:
        # if the speed is less than 77 KB/s and at least
        # one tenth of the video has been downloaded
        raise DownloadError('Abnormal downloading speed drop.')

ydl.add_progress_hook(speed_check)

would you then retry the video until it's fast?

demget commented 3 years ago

@nikooo777 I have a kind of proxy in front of the Python downloader, and it decides whether to retry, so there's no need to do it here, though it could certainly make sense for someone else.

danny-wu commented 3 years ago

I can confirm I am also experiencing this, since July 14th. I am noticing either ~77 KB/s or ~50 KB/s.

The throttling is consistent across all of my servers and seems to occur only if I download a lot of videos, which makes me think it is an intentional form of throttling on YouTube's side.

I have increased my timeouts and reduced my download frequency, and I am noticing far fewer throttled downloads. If you are hitting this, I recommend increasing your timeouts.
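
youtube-dl itself exposes options for both of those knobs; something along these lines (the values are arbitrary examples, not tested recommendations):

youtube-dl --socket-timeout 30 --retries 10 --sleep-interval 5 https://youtu.be/8PecfdkEM2Y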

triplesixman commented 3 years ago

@danny-wu In my case, this happens even during the first download on my private internet connection. I've also noticed it on a Linux server.

Tinase-nau commented 3 years ago

I think YT may be throttling for profit reasons, and/or they may be having technical difficulties, and/or they are upgrading, and/or it's for censorship reasons. A few days ago some videos on YT started buffering: playback only worked at qualities between 144p and 360p, but not all videos have the issue. Then I thought to try downloading the videos with YT-DL: "Blocked Drain 506" downloaded at 40.91 KiB/s, with several "Got server HTTP error: [WinError 10054] An existing connection was forcibly closed by the remote host." messages. Of course the download script downloads the highest possible quality. The throttling starts at 720p; at 480p I could actually watch the video without buffering.

JJ840 commented 3 years ago

From what I understand, if YouTube is throttling downloads, aria2c might help (https://wiki.archlinux.org/title/Youtube-dl#Faster_downloads)? I have absolutely no experience with it (if someone does and could link something, that would be greatly appreciated), but I think it might be worth a try for anyone here who has it set up.
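
For anyone who wants to try it, the setup is youtube-dl's external-downloader options pointed at aria2c; a rough example (the aria2c arguments are illustrative, not a tuned recommendation):

youtube-dl --external-downloader aria2c --external-downloader-args '-x 4 -k 1M' https://youtu.be/8PecfdkEM2Y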

petaire92 commented 3 years ago

Yep, same here! It works flawlessly on macOS, but the build on Ubuntu randomly gets slow, also around 76/77 KB/s...

danny-wu commented 3 years ago

Hmm, is anyone experiencing this issue NOT on Ubuntu?

Tinase-nau commented 3 years ago

Win10 x64 here. But I would like to know: is macOS really unaffected?

alexmerkel commented 3 years ago

But I would like to know: is macOS really unaffected?

Nope, macOS 11 here and I have the same issue!

petaire92 commented 3 years ago

Win10 x64 here. But I would like to know: is macOS really unaffected?

Well, I have a Big Sur build that handles multiple downloads per day and works flawlessly. But maybe it's because those are newly added videos?

Tinase-nau commented 3 years ago

Win10 x64 here. But I would like to know: is macOS really unaffected?

Well, I have a Big Sur build that handles multiple downloads per day and works flawlessly. But maybe it's because those are newly added videos?

Seems so. I don't notice buffering or quality loss on new videos, only on videos that are several months old. Maybe they introduced some new codec and are re-encoding old videos, while new videos are already fine? Maybe so.

oneandonlyjason commented 3 years ago

Seems like the same problem here on a Debian 10 machine with youtube-dl. I can download a few videos normally and then it gets so slow that there is no noticeable network activity.

RubikSW commented 3 years ago

Same issue here on Linux.

on1razor commented 3 years ago

I have a similar problem. I thought YouTube was blocking me by IP, so I started downloading via a VPN, but the problem keeps happening.

danny-wu commented 3 years ago

There were helpful comments on this ticket from a user with initials A.W.; unfortunately it looks like they got deleted. I saw them because I have email notifications.

Has anyone made further progress on the query parameter investigation?

danny-wu commented 3 years ago

This is incredibly helpful. wow. Thanks.

Deliberate sabotage of YouTube-dl by Google. Sucks to see this continue.

Please, Googlers, you are reading this. We are just trying to download a small fraction of YouTube videos for offline playback, personal use. Please don't sabotage us.

coletdjnz commented 3 years ago

I have the solution for this issue. I do not have the bandwidth to actually implement it in the source, but this should be more than enough information to do so.

The issue is that YouTube is modifying the n query parameter on the video playback URLs in a very similar fashion as the signature cipher. There's a pure function in the player JavaScript which takes the n parameter as input and outputs an n parameter which is not subject to throttling.

As an example, let's look at https://www.youtube.com/s/player/52dacbe2/player_ias.vflset/et_EE/base.js. The code in question which modifies n is as follows:

a.C&&(b=a.get("n"))&&(b=Dea(b),a.set("n",b))}};

In this case, Dea is the function we are looking for:

function(a){var b=a.split(""),c=[-704589781,1347684200,618483978,1439350859,null,63715372,function(d){d.reverse()},
159924259,-312652635,function(d,e){for(e=(e%d.length+d.length)%d.length;e--;)d.unshift(d.pop())},
-1208266546,function(d,e){d.push(e)},
-2143203774,-103233324,b,function(d,e){e=(e%d.length+d.length)%d.length;d.splice(0,1,d.splice(e,1,d[0])[0])},
837025862,1654738381,1184416163,1983454500,b,-200631744,1130073900,null,2047141935,-337180565,1654738381,1913297860,-399114812,b,714887321,function(d,e){for(var f=64,h=[];++f-h.length-32;){switch(f){case 58:f-=14;case 91:case 92:case 93:continue;case 123:f=47;case 94:case 95:case 96:continue;case 46:f=95}h.push(String.fromCharCode(f))}d.forEach(function(l,m,n){this.push(n[m]=h[(h.indexOf(l)-h.indexOf(this[m])+m-32+f--)%h.length])},e.split(""))},
626129880,"pop",1331847507,-103233324,2092257394,function(d,e){for(e=(e%d.length+d.length)%d.length;e--;)d.unshift(d.pop())},
669147323,1184416163,-216051470,193134360,null,2045900346,1675782975,-1997658115,function(d,e){e=(e%d.length+d.length)%d.length;var f=d[0];d[0]=d[e];d[e]=f},
1675782975,161770346,function(d,e){e=(e%d.length+d.length)%d.length;d.splice(-e).reverse().forEach(function(f){d.unshift(f)})},
function(d){for(var e=d.length;e;)d.push(d.splice(--e,1)[0])},
1454215184,-2123929123];c[4]=c;c[23]=c;c[42]=c;try{c[6](c[4]),c[6](c[23],c[22]),c[6](c[38],c[12]),c[43](c[10],c[5]),c[30](c[16]),c[43](c[22],c[36]),c[25](c[47],c[51]),c[52](c[7],c[26]),c[43](c[32],c[8]),c[40](c[7],c[11]),c[21](c[22],c[29]),c[0](c[22],c[17]),c[0](c[47],c[52]),c[21](c[32],c[19]),c[37](c[23],c[31]),c[16](c[9],c[34]),c[51](c[44],c[43]),c[10](c[34],c[15]),c[7](c[43],c[5]),c[7](c[53],c[41]),c[6](c[39],c[40]),c[2](c[24]),c[38](c[39],c[14]),c[3](c[24],c[52]),c[3](c[39],c[0]),c[3](c[49],c[7]),
c[49](c[41],c[37]),c[17](c[44],c[52]),c[28](c[41],c[32]),c[5](c[48],c[50]),c[46](c[23]),c[37](c[38],c[26]),c[37](c[38],c[18]),c[41](c[9],c[48]),c[41](c[48],c[6]),c[16](c[48],c[3]),c[12](c[8],c[38]),c[14](c[18],c[53]),c[52](c[1],c[13]),c[39](c[10],c[11]),c[22](c[33],c[48])}catch(d){return"enhanced_except_AAAAAAAAAAE_"+a}return b.join("")};

This does change with different player versions, so youtube-dl will need to extract this for every video that it fetches and then modify the n parameter as such.

Hope this is helpful.

Great find.

FYI for devs, it seems like the built-in jsinterp can't interpret this function?

I quickly wrote this up in youtube.YouTubeIE:

def test_n_js(self, player_url):
    player_id = self._extract_player_info(player_url)
    if player_id not in self._code_cache:
        self._code_cache[player_id] = self._download_webpage(
            player_url, None,
            note='Downloading player ' + player_id,
            errnote='Download of %s failed' % player_url)

    jscode = self._code_cache[player_id]
    funcname = self._search_regex(
        (r'\.get\("n"\)\)&&\(b=(?P<nfunc>[a-zA-Z0-9$]{3})\([a-zA-Z0-9]\)',),
        jscode, 'Initial JS player n function name', group='nfunc'
    )
    jsi = JSInterpreter(jscode)
    initial_function = jsi.extract_function(funcname)
    return lambda s: initial_function([s])

It extracts the function name correctly, however when interpreting, you get youtube_dl.utils.ExtractorError: Unsupported JS expression 'a.split(""),c=[';

I don't really know JavaScript, so that's about all I can say.
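
To illustrate where such a transform would plug in once it works: the throttled n value in the googlevideo URL has to be swapped for the transformed one before the download starts. A rough stdlib-only sketch, not the extractor's actual code (Python 3 assumed; transform would be the callable returned by test_n_js above):

from urllib.parse import urlparse, parse_qs, urlencode, urlunparse

def unthrottle_url(url, transform):
    parts = urlparse(url)
    query = parse_qs(parts.query)
    if 'n' in query:
        # replace the throttled n value with the transformed one
        query['n'] = [transform(query['n'][0])]
    return urlunparse(parts._replace(query=urlencode(query, doseq=True)))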

VADemon commented 3 years ago

(I'm basing my comment on this player: https://www.youtube.com/s/player/2fa3f946/player_ias.vflset/en_GB/base.js, function name eha, around line 1160 of that file.)

The error you get comes from multiple variables being declared on one line:

eha = function (a)
{
    var b = a.split(""),
        c = [... long array begins

Given that the local variable names are deterministic after minification (a.split("") appears to be constant across players as well), this can be fixed with a simple find-and-replace on the function code. Crude: replace var b=a.split(""),c= with var b=a.split("");var c=
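
As a sketch of that crude pre-processing (assuming jscode holds the downloaded player JS, as in the snippet earlier in the thread):

# split the combined declaration so a simple interpreter sees
# one assignment per statement
patched_jscode = jscode.replace(
    'var b=a.split(""),c=',
    'var b=a.split("");var c=')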

Further, the regex you wrote assumes a bit too much, e.g. that the function name will be 3 characters long: [a-zA-Z0-9$]{3}; theoretically this can vary depending on the minifier. Your regex could either be simplified:

\.get\("n"\)\)&&\(b=(?P<nfunc>[a-zA-Z0-9]+)\(b\) because you already rely on b

Or made more strict, since b is assigned from a.get("n"): &&\(b=a\.get\("n"\)\)&&\(b=(?P<nfunc>[a-zA-Z0-9$]{3})\(b\),a\.set (starting with && because these characters are rarer)


This is currently the only call site in the code (line 1750 of that file): a.C&&(b=a.get("n"))&&(b=eha(b),a.set("n",b)). Alternatively, this transform function has a unique signature to look for: {return"enhanced_except_AAAAAAAAAAE_"+a}.

They really want you to use a JS interpreter: they use a handful of basic primitives in a randomized order in this function (i.e. it differs from player to player, but the overall structure stays the same). Automatic transpilation to another interpreted scripting language would still be possible, but not trivial.

PS: Thanks, @awojnowski. Contrary to others' observations, this has been going on for many months; I can't tell exactly when it started. I'd guess they replaced the 429 rate-limiting with this. I haven't seen a 429 in a long time thanks to cookies, but now the downloads continue fine regardless of cookies, albeit sometimes at a throttled speed.

UPD: No, the 429 rate-limiting is still in place. I think it has a higher tolerance now, and due to the slow download speeds it was less noticeable too.

pukkandan commented 3 years ago

Building on top of @colethedj's work, I was able to make commas work as well as improve JSInterpreter.extract_function to correctly capture the entire function. However, the jsinterp doesn't seem to support for, switch and nested functions :/

JJ840 commented 3 years ago

I haven't been following this much, but did someone in another thread find a solution, or was there an update or something?

pukkandan commented 3 years ago

@JJ840 No. How to solve this has been figured out in https://github.com/ytdl-org/youtube-dl/issues/29326#issuecomment-865985377, but there is no actual implementation yet

JJ840 commented 3 years ago

Gotcha, thanks! Hopefully it gets fixed soon!

triplesixman commented 3 years ago

A temporary, imprecise workaround that can help when dealing with automated downloads:

def speed_check(s):
    speed = s.get('speed')
    ready = s.get('downloaded_bytes', 0)
    total = s.get('total_bytes', 0)

    if speed and speed <= 77 * 1024 and ready >= total * 0.1:
        # if the speed is less than 77 KB/s and at least
        # one tenth of the video has been downloaded
        raise DownloadError('Abnormal downloading speed drop.')

ydl.add_progress_hook(speed_check)

If we could have details on how to implement this method (or another one) until the problem is fixed, that would be nice, because currently the application is almost unusable...
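
For what it's worth, a minimal sketch of wiring that hook into youtube-dl's embedding API (the progress_hooks option is documented in the README; the threshold and URL are just the examples from this thread):

import youtube_dl
from youtube_dl.utils import DownloadError

def speed_check(s):
    # only look at active downloads
    if s.get('status') != 'downloading':
        return
    speed = s.get('speed')
    ready = s.get('downloaded_bytes', 0)
    total = s.get('total_bytes') or s.get('total_bytes_estimate') or 0
    if speed and total and speed <= 77 * 1024 and ready >= total * 0.1:
        # below ~77 KB/s with at least a tenth of the file downloaded
        raise DownloadError('Abnormal downloading speed drop.')

with youtube_dl.YoutubeDL({'progress_hooks': [speed_check]}) as ydl:
    ydl.download(['https://youtu.be/8PecfdkEM2Y'])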

nikooo777 commented 3 years ago

If we could have details on how to implement this method (or another one) until the problem is fixed, that would be nice, because currently the application is almost unusable...

I don't think that really helps much. I tried it, and you just end up in a loop where the video both downloads slowly and keeps getting killed. I'm not sure what the rules are here, but there is a known fork of youtube-dl that implements this feature as --throttled-rate if you want to try it for yourself.

mechalincoln commented 3 years ago

I've worked around this issue by watching the status of a playlist download, pressing Ctrl-C to terminate it when it drops to around 80 KB/s, and then running it again.

Since this seems to vary from video to video, this isn't the best option, but it makes youtube-dl usable.

This app is one of my favorites; I hope it can be patched soon.

moom0o commented 3 years ago

Same issue on multiple Debian servers.

rautamiekka commented 3 years ago

Speaking of servers, are you all using one from a hosting company? Because that's likely the reason: my dedicated server was banned altogether, though that was ages ago and I haven't tried since. Since the IP belongs to a hosting company, they can easily run a lookup to figure that out => IP ban.

oneandonlyjason commented 3 years ago

Speaking of servers, are you all using one from a hosting company? Because that's likely the reason: my dedicated server was banned altogether, though that was ages ago and I haven't tried since. Since the IP belongs to a hosting company, they can easily run a lookup to figure that out => IP ban.

@danny-wu In my case, this happens even during the first download on my private internet connection. I've also noticed it on a Linux server.

This is a case where it is a private connection, and the macOS and Windows reports probably are as well.

scegg commented 3 years ago

I have multiple internet connections with fixed IP addresses. I mainly run youtube-dl on one of those IPs. Here are my test results.

On the main IP, which has a long history of youtube-dl downloads, the speed is mostly limited to 1 MB/s. Sometimes it is limited to 50 KB/s. When that happens, I break it with Ctrl-C, open a browser to watch the video on YouTube directly, then restart the download. It resumes at 1 MB/s.

At the same time, if I run youtube-dl from another IP, the speed is good, without any limit.

All my IP addresses are from the same ISP, over the same fiber link, so it should not be a problem on my ISP's side. I GUESS YouTube has a blacklist of IP addresses that download video other than through the webpage.

arrowgent commented 3 years ago

IMHO I think it's YouTube attempting to throttle, not your ISP. This has been happening for months; it's not new.

Thanks for the research and the attempted fixes/alternatives.

hashimaziz1 commented 3 years ago

Been experiencing this for a while. Glad there's finally a solution to YouTube's aggressive rate limiting; I hope it's implemented soon.

shoxie007 commented 3 years ago

Dear God! This has become a REALLY FRUSTRATING problem. And it now has to do with JavaScript, which I know NOTHING about. Someone PLEASE, PLEASE come up with a solution.

@awojnowski You seem to understand the problem. Would you PLEASE examine youtube-dl's Python code and suggest what edits and additions to make? Please give us a few more breadcrumbs to work with.

@pukkandan I REALLY appreciate your efforts on the yt-dlp project. But for this particular problem, is there a better solution than --throttled-rate, which just doesn't do it for me? It just keeps re-extracting the webpage and then keeps getting throttled.

@tfdahlin If you succeed in getting pytube to bypass this issue, would you please help us out and make suggestions for edits to Youtube-DL's extractor?

liamengland1 commented 3 years ago

@shoxie007 I appreciate your enthusiasm but please refrain from commenting solely to nag people to speed up their work. For other people reading this issue, it provides no useful information at all, and especially in the context of an open-source project, it comes off as entitled.

Additionally, @\awojnowski does not need to provide any more breadcrumbs; asking for that shows that you are unfamiliar with the current state of the issue. The JavaScript function YouTube uses to unscramble the parameter is complex, and it seems that it cannot be directly implemented in Python (i.e. without manually rewriting it) at this time.

shoxie007 commented 3 years ago

@shoxie007 I appreciate your enthusiasm but please refrain from commenting to nag people to speed up their work. Especially for an open-source project, it comes off as rude and entitled.

Additionally, @\awojnowski does not need to provide any more breadcrumbs; asking for that shows that you are unfamiliar with the current state of the issue. The JavaScript function YouTube uses to unscramble the parameter is complex, and it seems that it cannot be directly implemented in Python (i.e. without manually rewriting it) at this time.

Sorry, I didn't mean to come off as entitled. It's the frustration of having to deal with one issue after another, and they're all coming thick and fast nowadays. And all this is happening while YouTube is going on a video and channel deletion spree; it's hard to keep up with the digital book-burning.

Like I said, I know nothing about JavaScript, so I wouldn't understand the complexity of this issue. But that does not mean I don't appreciate all the efforts of the people who make this possible. I do.

shoxie007 commented 3 years ago

Here is one reprieve for anyone who only wants to download a few videos, and to their home computer:

@putara seems to have come up with a decryption solution that resolves the JavaScript interpretation: #2222 in the Invidious project. I hope this can somehow be transposed and implemented in Python for youtube-dl.

UPDATE: I'm having success using variation in IP addresses, in conjunction with the --rm-cache-dir option in youtube-dl. I have a subscription for 10 instantproxies.com proxies. Before downloading each video, I run youtube-dl --rm-cache-dir to remove any data previously cached by youtube-dl. Then I route the connection for the next video download through a different proxy than the last: youtube-dl --proxy XXX.XXX.XXX.XXX:PORT ...... This way, though the download does not consume my full available bandwidth, it's at least not the sinfully slow 70 KB/s. I think that each time a youtube-dl request presents itself to YouTube as a fresh connection, it is not throttled, at least not as much. I also use the --force-ipv4 option to add further variation to the IP that YouTube sees.
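
A rough sketch of that rotation, driving the youtube-dl CLI from Python with the same flags mentioned above (the proxy addresses and video list are placeholders):

import itertools
import subprocess

proxies = ['PROXY1:PORT', 'PROXY2:PORT']   # placeholders
videos = ['https://youtu.be/8PecfdkEM2Y']  # placeholder list

proxy_cycle = itertools.cycle(proxies)
for url in videos:
    # clear youtube-dl's cache, then download via the next proxy
    subprocess.call(['youtube-dl', '--rm-cache-dir'])
    subprocess.call(['youtube-dl', '--force-ipv4',
                     '--proxy', next(proxy_cycle), url])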

tfdahlin commented 3 years ago

After getting close to finishing my code to extract + emulate the function for ciphering the n parameter, I noticed something unusual.

In the video I've been using to test my code, I saw that exactly one of the streams available for the video had &ratebypass=yes in its URL. This stream also had an n value different from all of the other streams (i.e. the stream URL with the ratebypass parameter had an n value of _kezA9j2kOOAqbu-q, while the remaining streams had a value of rbQeh2OaABvsWEtCZ).

I haven't tested this yet, and can't tonight, but it's possible that one of the stream URLs holds the n value the other streams need to bypass the rate limiting. If somebody else wants to test this theory on additional videos while I'm unable to, it may be a stopgap measure for the problem.
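
One way to test that theory without writing a full downloader would be to dump each format's n value and whether ratebypass=yes is present, e.g. via youtube-dl's embedding API (Python 3 assumed; the URL is just the example video from this issue):

import youtube_dl
from urllib.parse import urlparse, parse_qs

with youtube_dl.YoutubeDL({'quiet': True}) as ydl:
    info = ydl.extract_info('https://youtu.be/8PecfdkEM2Y', download=False)

for f in info['formats']:
    q = parse_qs(urlparse(f['url']).query)
    # format id, its n value, and whether ratebypass is set
    print(f['format_id'], q.get('n', ['<none>'])[0], 'ratebypass' in q)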

akaltar commented 3 years ago

I know this may be an over-engineered proposal, but I suppose this could be used: https://typescripttolua.github.io to convert TypeScript (a superset of JS) to Lua, and https://pypi.org/project/lupa/ to interpret it safely in a Lua environment.

rautamiekka commented 3 years ago

If you're gonna use external deps you might as well use a pure C or pure Python solution.

nadermx commented 3 years ago

Could it be quicker to just take the JS and execute it as a sub-process, like youtube-dl does with ffmpeg?
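
As a sketch of that idea: if the n-transform function source can be extracted from base.js (for example with the regexes discussed above), it could be handed to an external JS runtime instead of the built-in jsinterp. This assumes a node binary on PATH and is only an illustration, not how youtube-dl currently works:

import json
import subprocess

def run_n_transform(func_source, n_value):
    # wrap the extracted function source and call it with the given n value
    script = 'var f = %s; console.log(f(%s));' % (func_source, json.dumps(n_value))
    return subprocess.check_output(['node', '-e', script]).decode().strip()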