jdiff closed this issue 5 years ago.
If we decide it would be more efficient to have a system edit the WordPress page directly, rather than have the WordPress page hit an API to retrieve the code, the workflow above would still apply for determining whether to embed one or two streams in the page.
I support having WordPress pull the embed codes from a JSON API. (We might want/need to FastPass that endpoint so it doesn't get slammed.)
If we want to FastPass this, what cache invalidation interval would work best in this case? Also, remember that creating an endpoint and updating the page aren't necessarily mutually exclusive if the WP-JSON plugin won't be sufficient.
> If we want to FastPass this, what cache invalidation interval would work best in this case?
Every 15 minutes makes sense to me. We usually start shows on the 15-minute mark, so I imagine that would work well.
> Also, remember that creating an endpoint and updating the page aren't necessarily mutually exclusive if the WP-JSON plugin won't be sufficient.
I can't see a scenario where we'd need to do it this way, but you're correct. That said, I think it would be bad practice to have two things editing the same page from different directions; it could easily get confusing when trying to debug or fix something.
I don't see why the JSON plugin wouldn't work. We're only grabbing a few lines of code from the API.
I've also been thinking about the workflow above and I think I've found a way to simplify it. I'm gonna sketch it out and then post it here.
Here's the general concept: we'll query the YouTube API every 5 minutes to retrieve all scheduled streams, including future, current, and past ones. The information will be stored in a database, keyed on the start time of each stream. The WP-JSON plugin on the NTunes page will hit our API whenever a user loads the page. Our API will return the following data (in this order):
So we could have up to 3 embedded videos on the page at any one time (but in all cases we'd have at least 1). I think this workflow simplifies the logic that we have to incorporate into the API.
Based on discussion with Ben, I've tweaked the workflow to better accommodate WIGS postshows and the tots marathon show. Below is the tweaked workflow.
Our API will return the following data (in this order):
The API will have two separate endpoints - one for the public NTunes page, and one for the WIGS-Exclusive page. For the public page, the API will return only public streams. For the WIGS page, it will return all public PLUS unlisted streams.
This way WIGS members don't have to flip back and forth between pages.
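A hedged sketch of that endpoint split: both endpoints can share one query, with only the visibility filter differing. The page names, `visibility` field, and `embeds_for` helper are all assumptions for illustration:

```python
# Sketch only: one stream list, two views of it. The public endpoint
# filters to public streams; the WIGS endpoint returns public PLUS
# unlisted, so members don't have to flip between pages.
STREAMS = [
    {"video_id": "pub1", "visibility": "public"},
    {"video_id": "wigs1", "visibility": "unlisted"},
]

def embeds_for(page):
    allowed = {"public"} if page == "public" else {"public", "unlisted"}
    return [s["video_id"] for s in STREAMS if s["visibility"] in allowed]
```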
How does the automation know which videos are WIGS only?
They're unlisted.
If they're unlisted, then we need to make sure this account has the appropriate channel admin privileges, which also means we can't use the same API key as the app.
That's not a problem. Turp and I have already got that part working.
@nturpchinoff Where's that code(/authentication)? I checked Rob's DisneyParksAPI repo and don't see it there or in the organization's repos.
I can get you the auth key. Turp figured out what YouTube API to call and what query to send.
Yeah, we're going to need that auth key. The call and query would be nice to help us save time and effort.
Alternatively, @nturpchinoff should assign this to himself, and commit what he's done once he's happy with it :)
For populating the NTunes pages, the concept is to create an API endpoint that returns JSON code for the show(s) to be embedded into the WordPress pages.
Then the WP-JSON plugin will query this API whenever a user loads the page, in order to embed the correct videos on the page.
There will be two endpoints: one for the public NTunes page, and one for the WIGS private page. They will work the same way; the only difference is that one will return only public streams, while the other will also return unlisted streams.
The API endpoints will work like this:
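As a rough illustration of the kind of JSON the endpoint might hand back to the WP-JSON plugin, here's a hedged sketch. The `embeds`/`html` field names and the iframe template are assumptions, not a settled response format:

```python
# Sketch only: build a JSON payload of embed snippets for the page.
import json

def embed_payload(video_ids):
    return json.dumps({
        "embeds": [
            {"video_id": vid,
             "html": f'<iframe src="https://www.youtube.com/embed/{vid}"></iframe>'}
            for vid in video_ids
        ]
    })
```

The WordPress side would then only need to loop over `embeds` and print each `html` snippet, keeping all of the which-streams-to-show logic on the API side.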