4pr0n / gonewilder

GNU General Public License v2.0

gfycat support is not working #37

Open · wentwild opened this issue 9 years ago

wentwild commented 9 years ago

I am getting 403 errors with gfycat links.

Gonewild: CurvyAnonymous: process_url: http://gfycat.com/SpitefulPepperyKillerwhale#
Gonewild: CurvyAnonymous: process_url: unable to get URLs for http://gfycat.com/SpitefulPepperyKillerwhale#: HTTP Error 403: Forbidden
Gonewild: buckshaned: process_url: http://www.gfycat.com/EllipticalScalyFieldmouse
Gonewild: buckshaned: process_url: unable to get URLs for http://www.gfycat.com/EllipticalScalyFieldmouse: HTTP Error 403: Forbidden
Gonewild: buckshaned: process_url: http://gfycat.com/VagueDangerousChipmunk
Gonewild: buckshaned: process_url: unable to get URLs for http://gfycat.com/VagueDangerousChipmunk: HTTP Error 403: Forbidden
Gonewild: buckshaned: process_url: http://www.gfycat.com/AnnualCoolAvians
Gonewild: buckshaned: process_url: unable to get URLs for http://www.gfycat.com/AnnualCoolAvians: HTTP Error 403: Forbidden
Gonewild: buckshaned: process_url: http://www.gfycat.com/AnnualCoolAvians
Gonewild: buckshaned: process_url: unable to get URLs for http://www.gfycat.com/AnnualCoolAvians: HTTP Error 403: Forbidden
4pr0n commented 9 years ago

@wentwild Regarding the 403 errors, it looks like gfycat is blocking your computer specifically.

I found this discussion in which 403 errors were caused by the user agent: https://www.reddit.com/r/gfycat/comments/2j03jq/getting_403_error_when_trying_to_query_data_with/

Gonewilder uses the same default user agent for all requests, so it seems odd that this stopped working for you. Maybe your IP was blocked for making too many requests to Gfycat? Have you tried updating the user agent?
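For anyone who wants to test that theory locally, here is a minimal sketch of sending a request with an explicit User-Agent header. It uses Python 3's standard urllib rather than Gonewilder's own HTTP wrapper, and the UA string and URL are just example values:

```python
# Illustrative only: fetch a gfycat page with an explicit User-Agent header.
# The UA string below is a made-up browser-like value; substitute whatever
# string you want gfycat to see.
from urllib.request import Request, urlopen

url = 'http://gfycat.com/SpitefulPepperyKillerwhale'
req = Request(url, headers={
    'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64; rv:34.0) Gecko/20100101 Firefox/34.0',
})
with urlopen(req) as response:
    print(response.status, len(response.read()), 'bytes')
```

If this succeeds while Gonewilder still gets 403s, the default user agent is the likely culprit; if it also returns 403, the block is probably on your IP.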

4pr0n commented 9 years ago

@ohhdemgirls Looks like that's a new-line character between gfycat.com and / -- it's the only explanation for why the ImageUtils class wouldn't consider gfycat supported.

How often are new-line characters in a URL? Should the script support them?
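If it does turn out to be worth handling, a minimal sketch of stripping embedded whitespace before the support check could look like the following. Note that is_gfycat() here is a hypothetical stand-in for the real ImageUtils check, not the project's code:

```python
# Sketch: normalize a URL that may contain stray newline characters
# before checking whether the host is supported.
import re

def sanitize_url(url):
    # Drop all embedded whitespace (newlines, tabs, spaces);
    # none of it is legal inside a URL anyway.
    return re.sub(r'\s+', '', url)

def is_gfycat(url):
    # Hypothetical stand-in for the real support check.
    return re.match(r'https?://(www\.)?gfycat\.com/\w+', url) is not None

raw = 'http://gfycat.com\n/SpitefulPepperyKillerwhale'
print(is_gfycat(raw))                # False: the newline breaks the match
print(is_gfycat(sanitize_url(raw)))  # True after sanitizing
```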

ohhdemgirls commented 9 years ago

How often are new-line characters in a URL? Should the script support them?

Never, or at least there shouldn't be; I'm not sure how that came about. I only noticed this by chance, so I'm not sure how often it's actually happening.

(deleted original comment by mistake)

ghost commented 9 years ago

I'm also just noticing that I'm having problems with gfycat links. I can access and download gfycat links while browsing reddit normally, but they're not downloading through gonewilder.

[2014-12-04T00:57:25Z] Gonewild: natural_red: poll_user: since "cmjd98l"
[2014-12-04T00:57:33Z] Gonewild: natural_red: poll_user: 3 new posts and comments found
[2014-12-04T00:57:33Z] Gonewild: natural_red: poll_user: found 1 url(s) in child http://reddit.com/r/leggingsgonewild/comments/2o7lex
[2014-12-04T00:57:33Z] Gonewild: natural_red: process_url: http://gfycat.com/ExcitableTemptingAmericanriverotter
[2014-12-04T00:57:34Z] Gonewild: natural_red: process_url: unable to get URLs for http://gfycat.com/ExcitableTemptingAmericanriverotter: HTTP Error 403: Forbidden
[2014-12-04T00:57:34Z] Gonewild: natural_red: poll_user: found 1 url(s) in child http://reddit.com/r/gonewild/comments/2o7l8i
[2014-12-04T00:57:34Z] Gonewild: natural_red: process_url: http://gfycat.com/ExcitableTemptingAmericanriverotter
[2014-12-04T00:57:34Z] Gonewild: natural_red: process_url: unable to get URLs for http://gfycat.com/ExcitableTemptingAmericanriverotter: HTTP Error 403: Forbidden
[2014-12-04T00:57:34Z] Gonewild: natural_red: poll_user: done
[2014-12-04T00:57:34Z] Gonewild: natural_red: poll_user: setting most-recent since_id to "cmkh6k9"

ghost commented 9 years ago

Also, not sure if it's relevant, but gfycat also seems not to be working in RipMe for me. So it's possibly my computer, as suggested above? I have no idea why that would be, though.

ghost commented 9 years ago

So I changed which server I connect to on my VPN and now it's working... so I guess gfycat was blocking my previous IP address. Any ideas why that might be?

Also, is it possible to go back to previous gfycat submissions that couldn't be downloaded before and download them via gonewilder now that I've gotten it working again?

Edit: I just tested another link with gfycat submissions after it started working, and I'm getting 403 errors again in both Gonewilder and RipMe. It only worked once after changing my IP address.