ForrestKnight / yt-dislikes

Add the dislike count back to my YouTube videos via a comment containing that information.
https://youtu.be/1uAH93tzfQY
MIT License
155 stars 13 forks

Cache VidId to skip next page token every time. #10

Closed OTonGitHub closed 2 years ago

OTonGitHub commented 2 years ago

First off, it's my first time writing anything on GitHub, so sorry if I don't make sense. You can just give this a dislike and I'll remove the issue; I'm just hoping to help.

I think a small txt or json file could be generated on the first run to grab all the video IDs and cache them, in order to skip the next-page-token loop that happens when you request the video IDs. To check whether a new video has been uploaded, you can just compare the top ID in the cached file against the first ID returned by the API (assuming video IDs are static).
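The caching idea above could be sketched roughly like this. This is a minimal sketch, not code from this repo: `CACHE_FILE` and the two `fetch_*` callables are hypothetical stand-ins for the real YouTube API calls.

```python
import json
import os

CACHE_FILE = "video_ids.json"  # hypothetical cache path, newest ID first


def load_cached_ids():
    """Return the cached list of video IDs, or [] if no cache exists yet."""
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE) as f:
            return json.load(f)
    return []


def refresh_cache(fetch_all_ids, fetch_latest_id):
    """fetch_all_ids / fetch_latest_id are stand-ins for the real API calls.

    If the newest ID returned by the API matches the top of the cache,
    skip the full paginated crawl entirely and reuse the cached list.
    """
    cached = load_cached_ids()
    if cached and cached[0] == fetch_latest_id():
        return cached  # nothing new uploaded; no paging needed
    ids = fetch_all_ids()  # full paginated crawl only when the cache is stale
    with open(CACHE_FILE, "w") as f:
        json.dump(ids, f)
    return ids
```

As th7mo notes below, the weak spot of this scheme is that it only detects changes at the top of the upload-ordered list.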

I'm really sorry if I got the whole process wrong. If it's a real issue, I'll go read all the YouTube API docs and excitedly contribute a solution; I hadn't gone through them as of posting this issue because I am studying for my exams. Thanks,

OT.

th7mo commented 2 years ago

I think this is a good solution for most scenarios, but it probably does not update the cached file when an older video that was previously private is made public. This 'new' video might not show up in the first x results, because the results are ordered by upload date.

An alternative is to cache the eTag field you get from a response, e.g. by storing it in a .txt or .json file. The next time you send a request (to retrieve all the IDs of a specific channel), you send the cached eTag along in the request header, and you will get a 304 Not Modified response when the resource (the list of videos) is the same as the last time you retrieved it.
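A conditional request along those lines might look like the sketch below. This is not the repo's code: `get` is an injected stand-in for whatever HTTP client is used (it sends the request and returns a status code plus parsed JSON), and the `items[].id.videoId` shape is assumed to follow the YouTube Data API `search.list` response.

```python
def fetch_ids_with_etag(get, url, cached_etag, cached_ids):
    """get(url, headers) -> (status_code, json_payload) is an injected HTTP client.

    Sends If-None-Match with the cached eTag; on 304 the cached IDs are
    still valid, otherwise the fresh ID list and its new eTag are returned.
    """
    headers = {"If-None-Match": cached_etag} if cached_etag else {}
    status, payload = get(url, headers)
    if status == 304:
        return cached_etag, cached_ids  # resource unchanged since last fetch
    ids = [item["id"]["videoId"] for item in payload["items"]]
    return payload["etag"], ids
```

Injecting `get` keeps the caching logic testable without hitting the network; in the real script it would wrap the actual API request.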

I don't know if only the IDs are requested; if more than just the video IDs are requested, there is a chance the eTag is affected by more than the IDs alone.

ForrestKnight commented 2 years ago

The problem is that the max results per page is 50, so it'll only run through those first 50 videos, but the total for me is 243 videos. The solution is to use the nextPageToken and run it on the next page (50 videos) until there are no more videos. I'm running into an issue where it stops at 109 videos though, instead of going through all 243. I have some theories as to why, but I'll have to test them out at a future date. I've just implemented this in case you're curious to check it out. Line 119.
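The nextPageToken loop described here can be sketched like so. This is a sketch under assumptions, not the repo's implementation: `fetch_page` is a hypothetical stand-in for one API page request (e.g. `playlistItems.list` with `maxResults=50`) that returns the page's IDs plus the `nextPageToken`, or `None` on the last page.

```python
def all_video_ids(fetch_page):
    """Collect every video ID by following nextPageToken until it runs out.

    fetch_page(page_token) -> (ids_on_page, next_page_token) is a stand-in
    for one paginated API call; page_token is None for the first page.
    """
    ids, token = [], None
    while True:
        page_ids, token = fetch_page(token)
        ids.extend(page_ids)
        if not token:  # the last page carries no nextPageToken
            return ids
```

If this loop stops early (like the 109-of-243 behavior above), the usual suspects are the loop exiting on a condition other than a missing token, or a page request silently failing mid-crawl.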

OTonGitHub commented 2 years ago

Interesting hiccup at 109; perhaps a token auth or expiry issue. I'll make a testing channel and give it a try.