gojiplus / tuber

:sweet_potato: Access YouTube from R
http://gojiplus.github.io/tuber

Error: HTTP failure: 403 #15

Open abeburnett opened 7 years ago

abeburnett commented 7 years ago

Hi there,

So I've been using and loving tuber, until just now when, all of a sudden, my search attempts started coming back with an "Error: HTTP failure: 403" message. I haven't changed a thing. Same API key, same secret. But now yt_search with any term in it simply fails.

I don't see any troubleshooting mechanisms in the package, and the documentation you link to on the YouTube Developer site doesn't yield any additional insights either. I logged into my YouTube account to see if it had been closed, and it hadn't. I checked the Google APIs dashboard to see if my access had been restricted or something, and it doesn't appear to have been.

Any ideas what could be going on? And more importantly, how to fix it?
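
For reference, the failing call looks roughly like this (a minimal sketch with placeholder credentials and search term, using yt_oauth() and yt_search() as documented in the package):

library(tuber)

# Authenticate with the same OAuth app credentials as before
# (placeholders; substitute your own client ID and secret).
yt_oauth(app_id = "MY_CLIENT_ID", app_secret = "MY_CLIENT_SECRET")

# Any search term now fails with "Error: HTTP failure: 403".
res <- yt_search(term = "r programming")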

Thanks!

slfan2013 commented 1 year ago

Possible solution:

GET("https://www.googleapis.com", path = paste0("youtube/v3/", "search"),config(token = getOption("google_token")))

soodoku commented 1 year ago

thanks @slfan2013 but I don't think that's the answer.

For reference, tuber_GET is:

tuber_GET <- function(path, query, ...) {

  # Make sure an OAuth token is available before hitting the API.
  yt_check_token()

  # Send the request to the given YouTube Data API v3 endpoint.
  req <- GET("https://www.googleapis.com", path = paste0("youtube/v3/", path),
             query = query, config(token = getOption("google_token")), ...)

  # Stop on HTTP errors, then return the parsed response body.
  tuber_check(req)
  res <- content(req)

  res
}
slfan2013 commented 1 year ago

I mean that GET("https://www.googleapis.com", path = paste0("youtube/v3/", "search"), config(token = getOption("google_token"))) returns the reason for the 401 or 403. The following code in tuber_GET

  tuber_check(req)
  res <- content(req)

hides that detail. At least the content of req gives me a clear explanation of why the error happens. For me, the reason was that I had not enabled the YouTube Data API v3.
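
A minimal sketch of that kind of inspection (assuming httr is loaded and a token has already been stored via yt_oauth(); part and q below are placeholder query parameters):

library(httr)

# Re-issue the request without tuber's error handling so the raw body is available.
req <- GET("https://www.googleapis.com",
           path = "youtube/v3/search",
           query = list(part = "snippet", q = "test"),
           config(token = getOption("google_token")))

status_code(req)         # e.g. 403
str(content(req)$error)  # the API's own message and reason, e.g. "accessNotConfigured"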

soodoku commented 1 year ago

I will see if I can propagate the error to the console.
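
Something along these lines might work (just an illustrative sketch, not the package's current code): have the check function pull the API's message out of the response body before stopping.

# Hypothetical variant of tuber_check() that surfaces the API's own error
# message instead of only the status code.
tuber_check <- function(req) {
  if (!httr::http_error(req)) {
    return(invisible(req))
  }
  # Pull the API's explanation out of the response body, if there is one.
  msg <- tryCatch(httr::content(req)$error$message, error = function(e) NULL)
  if (is.null(msg)) msg <- ""
  stop("HTTP failure: ", httr::status_code(req), " ", msg, call. = FALSE)
}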

VictorSiq commented 1 year ago

Already tried all the solutions above, but the "HTTP failure: 403" persists. What I'm trying to do is simply create a list of the video ids I need and build a data frame with all of their comments. Both the authentication step and the broadcast column were updated per the previous instructions.

list_teste <- list(videos$id)

teste <- map_dfr(list_teste[[1]], ~ get_all_comments(video_id = .x))

VictorSiq commented 1 year ago

Tweaked the code a bit and @BaruqueRodrigues found a viable solution.

First of all, there was a specific element of my list returning the error: that video is a scheduled upload, not a released video, so it has no comments section yet. When I tried to get the comments for that video alone, the same 403 error was returned. In other words, this error can also show up when you request comments for unreleased videos.

Second, @BaruqueRodrigues's solution was a workaround that bypasses this error using the purrr::possibly() function. With it, I could simply treat the unreleased video as NULL and continue with my analysis. The following code solved my problem:

teste <- map_dfr(list_teste[[1]], possibly(~ get_all_comments(video_id = .x), otherwise = NULL))

With this, I could get all the comments from all the videos I needed and skip the unreleased ones.
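
If you also want to know which videos were skipped, a small follow-up sketch (hypothetical object names, assuming purrr and tuber are loaded):

safe_comments <- possibly(~ get_all_comments(video_id = .x), otherwise = NULL)
results <- map(list_teste[[1]], safe_comments)

# Ids whose requests came back empty, e.g. unreleased/scheduled videos.
skipped <- list_teste[[1]][map_lgl(results, is.null)]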

apoorv74 commented 3 months ago

People who tried all the solutions above and could not get past the 403 error may need to enable the YouTube Data API v3 for their project in the Google Developers Console. Credit for this solution goes to @ggyamfi4u.

this worked. thanks 👍