GrandizerGo opened 1 year ago
I can't reproduce this. Can you attach your log file?
I can grab the file, but I noticed there are a lot of extra failed downloads from redgifs since yesterday. Did they update their structure or something? @MalloyDelacroix
Same here, redgifs aren't downloading anymore.
It looks like redgifs is now requiring authentication for any type of API interaction. Before they only required it for post or put actions.
They currently do not offer user tokens, and the client token scheme they offer does not appear to be acceptable for this type of application. I will have to investigate this further.
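For reference, yt-dlp later worked around this by requesting a short-lived guest token from redgifs' v2 API before making any other call. A minimal stdlib sketch of that flow; the endpoint URL and response shape (`{"token": "..."}`) are assumptions based on yt-dlp's extractor, not something this project documents:

```python
import json
import urllib.request

# Assumed endpoint, as used by yt-dlp's redgifs extractor
API_AUTH_URL = 'https://api.redgifs.com/v2/auth/temporary'

def build_auth_headers(token_response):
    """Turn the parsed token JSON into headers for subsequent API calls."""
    # Response is assumed to look like {"token": "..."}
    return {'Authorization': 'Bearer ' + token_response['token']}

def fetch_guest_token():
    """Request a temporary guest token (network call)."""
    with urllib.request.urlopen(API_AUTH_URL) as resp:
        return json.loads(resp.read().decode('utf-8'))

# Usage (requires network):
#   headers = build_auth_headers(fetch_guest_token())
#   ...then send `headers` with every API request until the token expires.
```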
Is this something that, once it is working again, could be coded to seek out only the files that were not downloaded from the redgifs server, from a known date forward?
As a side note, whenever I add a user to the users list, it gives me a message telling me that the user already exists, which it does not. I am also pretty sure I have seen this a few times on the other side as well; it seems that it checks whether the user exists AFTER it is added.
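The symptom described above is what you get when the duplicate check runs after the insert, so the new entry matches itself. A hypothetical sketch of the intended order (the function and list names are made up for illustration, not the app's actual code):

```python
def add_user(user_list, name):
    """Add `name` to the list only if it is not already present.

    Returns True if the user was added, False if it already existed.
    The membership check must happen BEFORE the insert; checking
    afterwards makes every addition look like a duplicate of itself.
    """
    if name in user_list:
        return False
    user_list.append(name)
    return True

users = []
add_user(users, 'GrandizerGo')   # True: newly added
add_user(users, 'GrandizerGo')   # False: already exists, list unchanged
```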
@MalloyDelacroix Does Downloader for Reddit use youtube-dl to download videos from redgifs? FYI, yt-dlp (youtube-dl actively maintained fork) works fine in downloading videos from redgifs.
@MalloyDelacroix I think I know the solution to this problem, thanks for mentioning it. You have to send a fake user-agent and referer, as simple as that. Here is the Python:
    import random

    def random_user_agent():
        _USER_AGENT_TPL = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/%s Safari/537.36'
        _CHROME_VERSIONS = (
            '90.0.4430.212',
            '90.0.4430.24',
            '90.0.4430.70',
            '90.0.4430.72',
            '90.0.4430.85',
            '90.0.4430.93',
            '91.0.4472.101',
            '91.0.4472.106',
            '91.0.4472.114',
            '91.0.4472.124',
            '91.0.4472.164',
            '91.0.4472.19',
            '91.0.4472.77',
            '92.0.4515.107',
            '92.0.4515.115',
            '92.0.4515.131',
            '92.0.4515.159',
            '92.0.4515.43',
            '93.0.4556.0',
            '93.0.4577.15',
            '93.0.4577.63',
            '93.0.4577.82',
            '94.0.4606.41',
            '94.0.4606.54',
            '94.0.4606.61',
            '94.0.4606.71',
            '94.0.4606.81',
            '94.0.4606.85',
            '95.0.4638.17',
            '95.0.4638.50',
            '95.0.4638.54',
            '95.0.4638.69',
            '95.0.4638.74',
            '96.0.4664.18',
            '96.0.4664.45',
            '96.0.4664.55',
            '96.0.4664.93',
            '97.0.4692.20',
        )
        return _USER_AGENT_TPL % random.choice(_CHROME_VERSIONS)
Example:

    std_headers = {
        'User-Agent': random_user_agent(),
        'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
        'Accept-Language': 'en-us,en;q=0.5',
        'Sec-Fetch-Mode': 'navigate',
    }
You also need to add a referer:
    _API_HEADERS = {
        'referer': 'https://www.redgifs.com/',
        'origin': 'https://www.redgifs.com',
        'content-type': 'application/json',
    }

    headers = dict(self._API_HEADERS)
    headers['x-customheader'] = f'https://www.redgifs.com/watch/{video_id}'
    data = self._download_json(f'https://api.redgifs.com/v2/{ep}', video_id, headers=headers, *args, **kwargs)
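Putting the two snippets above together, the browser-like headers and the API headers would be merged per request. A small self-contained sketch of that merge (the `x-customheader` name is copied from the snippet above and appears to be specific to yt-dlp's extractor; the function name is hypothetical):

```python
def build_request_headers(std_headers, api_headers, video_id):
    """Merge browser-style headers with the redgifs API headers
    for a single request about `video_id`."""
    headers = dict(std_headers)       # copy so the template dicts stay untouched
    headers.update(api_headers)       # referer/origin/content-type win on conflict
    headers['x-customheader'] = f'https://www.redgifs.com/watch/{video_id}'
    return headers
```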
@AmperAndSand We didn't use youtube-dl for redgifs; it had its own extractor. Thanks for the information about yt-dlp. I was actually able to use it to handle the redgifs extraction now.
@ericsia Thanks for this information. Unfortunately, I was unable to get this to work. I was able to roll yt-dlp's extraction process into a solution here, which I believe uses some variation of what you posted.
@MalloyDelacroix FYI, approximately 8 hours ago, yt-dlp cannot download from Redgifs. It gives out an "Unable to extract oauth token" error message. There is a fix for yt-dlp mentioned in this issue report: https://github.com/yt-dlp/yt-dlp/issues/5311.
Describe the bug
I am getting failed downloads in all subreddits; it has gone over 1200 files and only grabbed 1 file. I am seeing hundreds of "Failed Download: Unsuccessful response from server" messages.
Environment Information
To Reproduce (optional)
I am just using the Get all Subreddits option.
For all but the most trivial of issues, please attach the latest log file.
Note: All issues marked with the Information Needed label will be closed after 30 days if no information is provided.