Closed philwareham closed 4 years ago
I'm presuming 1 and 4 are on-demand, in which case network latency/load might be an issue. 4 may also be subject to usage limits. 2 and 3 I expect will result in a delay between publish and appearance on our site, so the frequency will need managing.
I'd go for a cron job but with the ability for the author to "force" if we possibly can, without risk of bots doing it. But on demand does have merit if we can rate limit it somehow.
Yes. 1 and 4 rely on a live link to GitHub. Usage limits on the GitHub API are very generous so I doubt we’d ever reach that. But there is a reliance on a connection and probably some (marginal) performance drawbacks. Pros of these methods are in instant updates when repo changes occur. Cons are aforementioned and also method 1 is a little dirty I feel.
Methods 2 and 3 keep all the data on our server so the reliance on GitHub is reduced. If @petecooper can give some feedback I think method 3 is our best candidate. I don’t particularly want to install Node on the server just for this (unless of course it’s already there).
Something that would minimise network traffic would be some kind of incremental/random build number or reference at the GitHub end, which could be compared with our server local copy, and if they differ then the downloading and rebuilding could kick off. That way we don't trip the limiter and we're suitably well-behaved tenants.
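As a hedged sketch of that comparison, assuming a shell cron job and the GitHub commits endpoint (the function names, cache path and `sed` parsing here are illustrative only, not the eventual build script):

```shell
#!/bin/sh
# Compare the latest upstream commit sha with a locally cached one and only
# trigger the download/rebuild when they differ.
REPO="textpattern/textpattern-curated-plugins-list"
CACHE="/tmp/curated-plugins.sha"   # assumed cache location

remote_sha() {
  # first "sha" field of the latest commit on the default branch
  curl -s "https://api.github.com/repos/$REPO/commits/master" |
    sed -n 's/.*"sha": "\([0-9a-f]\{40\}\)".*/\1/p' | head -n1
}

needs_rebuild() {
  # $1 = cached sha, $2 = remote sha; rebuild when cache is missing or stale
  [ -z "$1" ] || [ "$1" != "$2" ]
}
```

An empty cache (`needs_rebuild "" "$new"`) forces a first-time build; otherwise the cron job stays quiet until the repo actually changes, which keeps us well inside the limiter.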
I'm hoping to be able to split our sites across two servers when I have some time: one with mostly static files (self-hosted docs, .com, design patterns, etc) with appropriate caching, and the other for more dynamic stuff (forum, chat, conferencing stuff).
Quick test:

```
$ curl -i https://api.github.com/users/petecooper
[…]
X-Ratelimit-Limit: 60
X-Ratelimit-Remaining: 58
X-Ratelimit-Reset: 1592675401
```
So, back of napkin: one request per minute, every minute, and we'd be redlining. Docs rebuild every 15 minutes (perhaps even less), and plugins JSON every 10 minutes (more or less?) would be 11 requests an hour.
Whether or not the GitHub stars thingy on .com counts towards the limit, I dunno, but we can factor that in if we need to.
Ten minutes sounds more than reasonable. By the time you've published the code on GitHub and written the forum post, the repo will likely know about it.
Are we standardising on GitHub, btw, or could there be scope for fetching stuff from a private site as long as it returned a well-formed JSON blob in response to a request our server makes via cron?
EDIT: Or is it the idea that if you want your plugins to be discoverable, you need to register them/your private repo endpoint with the curated-plugins-list repo?
@Bloke - see: https://github.com/textpattern/textpattern-curated-plugins-list/blob/master/template.json#L5
As I understand it, that's GitLab/Gitea/etc-friendly.
To be clear, the library cards (this repo) is on GitHub. Plugins themselves can be from anywhere, in our design.
The GitHub API has a rate limit of 12,500 an hour so I think we will be safe (even adding in the other stuff we use the API for - such as GitHub star count).
Right, cool. So if I write a new plugin, I have to register it with the library card repo on GitHub to have it included in the repo website? The endpoint of that plugin can be anywhere, doesn't have to be GitHub, as long as I tell the library card system where the file(s) are located inside the JSON construct.
> The GitHub API has a rate limit of 12,500 an hour
That's GitHub Apps, isn't it? Or have I got my wires crossed? Unauthenticated is 60/hour, OAuth is apparently 5,000/hour. If I can pinch the auth details then we're flying along!
> Right, cool. So if I write a new plugin, I have to register it with the library card repo on GitHub to have it included in the repo website? The endpoint of that plugin can be anywhere, doesn't have to be GitHub, as long as I tell the library card system where the file(s) are located inside the JSON construct.
Yes, that is correct.
> > The GitHub API has a rate limit of 12,500 an hour
>
> That's GitHub Apps, isn't it? Or have I got my wires crossed? Unauthenticated is 60/hour, OAuth is apparently 5,000/hour. If I can pinch the auth details then we're flying along!
Oh yes, you are correct: 5,000 per hour. I am planning on having etc_cache around any GitHub API calls (which I used to have on the GitHub stars thingy on Textpattern.com) which I think will reduce the calls. That doesn't include this repo syncing to our server though.
Aside: we can also `curl` with checks for Last-Modified, which should throw a 304 if it's the same, and those don't count towards the total.
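For illustration only (the header parsing and cache path below are assumptions, not the eventual build script): GitHub's API also supports `If-None-Match`, and a 304 reply carries no body and does not count against the rate limit.

```shell
# Sketch: cache GitHub's ETag and replay it on the next poll.
ETAG_FILE="/tmp/curated-plugins.etag"  # assumed cache location

etag_of() {
  # pure helper: pull the ETag value out of a `curl -D` header dump
  printf '%s\n' "$1" | grep -i '^etag:' | tr -d '\r' | cut -d' ' -f2-
}

# Intended use (not run here):
#   curl -s -D headers.txt -H "If-None-Match: $(cat "$ETAG_FILE")" "$URL"
#   etag_of "$(cat headers.txt)" > "$ETAG_FILE"
```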
Hi @petecooper - hope you are well. Can we please try a `curl` of this repo to somewhere in the root of plugins.textpattern.com? I'm thinking of `/curated-plugins-list` or a similar name. I can then amend my plugin site testing to use that directory instead of linking directly to GitHub every time a page is accessed.
I think the overarching future plan is also to use this local copy of the `/curated-plugins-list` within Textpattern core to check for/notify when a plugin is updated too.
I don't know what everybody's feeling is for how often we `curl` the GitHub repo, but I don't feel it needs to be too often (once an hour, possibly). I have an access token for the API which allows for a certain level of requests per hour, which would need to cover a few different tasks, so I'd prefer not to eat into that for this task.
If an intuitive way can be found to force an immediate `curl` if needed (say a plugin is totally broken and we want to push a new version quickly) then that would be a nice to have, but not essential. Not sure if GitHub Actions can help with this, or a URL webhook containing a private key on our server maybe. Not something I know about, sorry.
Yep, I'll take a look at that now. Stand by.
Ace! Once an hour is ample, perhaps with an authenticated way to pull in an emergency.
Once we have a working endpoint I'd love to test out the update from the Plugins panel. Let me know when the dust has settled enough and there are a few plugins represented so I can start to build in an admin-side check. Exciting.
OK, first pass done - twice an hour (quarter past, quarter to) schedule; `build.sh` can be run from a shell to force get, splat and repave.
Note there's no index page in the repo, so please advise if you want a specific thing to appear if someone hits the URL stub directly (right now it 403s).
Cheers Pete, just looking at it, we probably only need this structure from root (sorry, my bad!):

```
/library-of-plugins
    abc_example1
    abc_example2
```

Basically, none of the other info in that repo is of use here - so the cleaner the better.
I have no opinion either way on the index file; I think 403 is fine personally, but holler if you have a different opinion.
So just the flat JSON files from inside the `/library-of-plugins` dir?
Edit: assuming yes, check again after quarter past the hour and ~all of your dreams will come true~ that should do it.
Yes please. Will check in a few minutes, as you suggest. I'll then add one more plugin and check again 30 mins after that. If all working I think we can close this particular issue.
Sorry, can we change the target directory name to just `library-of-plugins` too? I think that works better for a future endpoint.
A quick question on implementation: how do we know the version of Txp that 'stable' refers to? Let's say:
How does it know whether to offer it for Upgrade or not? In short, how do we know what version of Txp it was written for? Can we assume that if there's a new version available that it works for your installed version of Txp? What we don't want to do is let people click it, verify it (without reading the code), install it and then find it kills your site/admin side because it's written for Txp 4.7.3 and has not been updated for 4.8.
Any thoughts on this? Just because something has a higher value than before doesn't mean we should offer it for upgrade if it's not compatible.
> Any thoughts on this?
This has been playing on my mind for the last few days. On the actual Textpattern site part of this, you can set (via custom fields) the minimum/maximum Textpattern version a plugin can work with. That is mainly for search and discovery on the website, though. If the min/max fields are left blank then an 'unverified' pill displays next to the Textpattern version(s).
However, that doesn't help here directly. I'm open to suggestions on how we do this in a simple and easily maintained fashion (I'm not entirely convinced my solution on the website is the best way either, but I've not thought of anything better as yet).
I think we just hit the GitHub limit, please stand by.
I think actually the name from your GitHub repo is wrong now; the repo is still called `textpattern-curated-plugins-list` - the folder we want from it (and the name on our server root) is `library-of-plugins`. Sorry for the confusion.
Yep - mea culpa. Sorry. Fixed.
Cheers! Not sure if you've built this fail-safe in too, but we need to ensure that if the GitHub repo is unavailable to clone at that point in time, the live version on our server is not wiped away. Otherwise the plugins site will fall over, as it needs actual JSON files available to work. Sorry again if this is more work!
Step ahead of you - already working on that. This is just a stopgap for your testing.
Min/max is an excellent way of doing that, and if we/someone can keep this up-to-date, fabulous. And if it's not there, the Unverified pill (a.k.a. "There be dragons") is the best we can do, and is good enough.
Let's say we build in the ability for an author to decree a plugin works for "4.7+" directly in their JSON file. There's still no guarantee it'll work in future versions. This was the rationale behind my "fuzzy feeling" system when we first looked at revamping the plugins site years ago. I kinda had vague thoughts of some ability for us/the community/someone to stick their neck out and say "I installed it on Txp x.y.z and..."
a) It works fine. b) It works if you alter this, this and this. c) It doesn't work.
At least then, we can give people an indication of whether it might work and the steps they need to go through to get it to work if they desperately need it.
Going further still, it might be that the plugin is still available but its functionality can be achieved by core. Or there's a newer plugin forked by someone else that does it. So if that's the case, having some way of letting people know that yes there's a new version available (or no there isn't) but they don't really need the plugin any more because if they visit such-and-such a page (in docs, or wherever) they can follow that advice and get a new plugin or do some tag magic instead.
I'm thinking things like rah_beacon and smd_macro which people may have installed and are now obsolete due to shortcodes. Having some way of marking these plugins as such would be a fabulous addition to this system so we're not only advising people that an upgrade to a plugin is available (or not) but that, under some circumstances, they don't even need that plugin any more and we can direct them to the shortcodes doc page. How cool would that be?!
I mean, if the reason someone's not upgrading Txp is because they have a plugin that isn't going to be compatible, it's pretty vital information to tell them that the plugin they're clinging to isn't even needed in the version of Txp they could download.
How about this then: When something changes with the plugin, could we have an optional node in the JSON file that contains a link to an explanatory doc - release notes, for example. Then we can either add a '?' alongside the upgrade link and bring that doc into a pophelp, or perhaps make the update link actually fetch that and display it as the first page of the "upgrade your plugin" wizard.
That doc would be an ideal place to state any caveats, limitations, version conflicts, plus new features people get from the plugin if they proceed with the upgrade. Is that something we create in one of the article fields? Or a link in a custom field to 'latest explanatory notes' (or "notes for version a.b.c") that we can feed into the JSON and can then offer the contents of that link to people in some way from the admin side (as long as it's not a dead link!).
Dunno. Don't want to make extra work for us or authors. But it would be nice somehow if we could have a way to collect info on the compatibility or otherwise of plugins and offer some supplementary information, even if it's something that member(s) of the community supply and can be updated in the repo somehow.
@Bloke - but min/max version is currently only available on the website itself, not directly in these JSON files. The only way we could expose that to a system that checks against versions and compatibility would be (as far as I can see):

1. Read the JSON-LD data from the head of a plugin's Textpattern-powered plugin site web page, instead of the plain JSON files from this repo. And/or,
2. State the compatibility in the JSON file instead. Which (maybe) leads to...
3. Somehow inject the JSON file compatibility entry into the database (otherwise we are doubling up on our work to state the compatibility). Not sure that's even possible or desirable.
Bearing in mind this is a curated list of plugins - I think we can follow this kind of process:
- The JSON file has an additional entry in the 'stable' and 'beta' nodes where you state `max-txp-version-compatibility`.
- The website has custom fields for `min-txp-version` and `max-txp-version` (which it does now).
- If both `min-txp-version` and `max-txp-version` are left blank in custom fields, the system assumes the plugin is usable on all Textpattern versions from 4.5 (the lowest we will support) up to the latest version - but with 'unverified' pills on them, i.e. uncurated/unverified yet apart from whatever is stated in the JSON file.
- If one or both of `min-txp-version` and `max-txp-version` are filled in via custom fields, that plugin is verified as working in that range of versions, i.e. curated/verified.
- The `max-txp-version-compatibility` may be set higher than `max-txp-version`, in which case the onus is on the website maintainers to curate/verify that plugin and fill out the custom field accordingly.
I realise that is more work than I'd like there to be, but I am having trouble thinking of a way of letting the JSON file alone handle the versioning while also allowing people to search the website database for plugins compatible with their version of Textpattern.
Thoughts, anyone?
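As a sketch only - the real `template.json` lives in the repo, and every field name here apart from `max-txp-version-compatibility` is an assumption for illustration - a card's 'stable' node under that plan might look something like:

```json
{
  "stable": {
    "version": "1.2.3",
    "file": "https://example.com/abc_plugin/abc_plugin.txt",
    "max-txp-version-compatibility": "4.8"
  }
}
```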
> Going further still, it might be that the plugin is still available but its functionality can be achieved by core.
I was planning to have a switch - or a shortcode on the website in the `<txp:body />` - where we would state that the plugin is superseded by core Textpattern features. But, again, this wouldn't be exposed to a plugin update system unless we were using the JSON-LD method. Maybe we have another entry in the JSON file for this instead?
Also the body area would be where we store any further notes needed about any caveats, limitations, version conflicts, etc. This is because we cannot rely on authors' sites being live and reachable for the rest of all time. This also puts more work on us though.
Since all updates to the JSON cards would be either by our team of maintainers, or via pull-requests, this limits the risk of updates falling through the net (hopefully).
That sounds perfectly reasonable. And not outside the realms of possibility to fashion a tiny page that gives us (admins) an overview of any plugins that have their author-certified max version string higher than the one in the custom field for that plugin. A kind of mini status report, if you will. We could review this from time-to-time and either verify it ourselves or throw it out to tender in the forum for people's reactions to "does abc_plugin x.y.z work in Txp version e.f.g?" and we can update the custom fields accordingly.
So, this throws up a couple of questions:

1. Do we add `max-txp-version-compatibility` to the JSON template now?

> I was planning to have a switch - or a shortcode on the website in the `<txp:body />` - where we would state that the plugin is superseded by core Textpattern features.

Ace!
> this wouldn't be exposed to a plugin update system... Since all updates to the JSON cards would be either by our team of maintainers, or via pull-requests...
Right, but herein lies some magic. Because it is a curated list and because we do control the environment, we can pretty much guarantee that the JSON files we expose WILL HAVE a corresponding article in the repo under our control.
Our curated JSON files are built by us, from a lump of PHP, right? If not today, they could be. That gives us the ability to inject additional select data from fields in our plugin repo into the JSON file for other people to then consume from their Plugins panels. In essence, we:
Or something along those lines.
EDIT: great minds :)
Looks like this is a plan! We do a simple version of a Textpattern-served JSON file on Textpattern.com to state the latest release, which is then used by the core version checking feature - so I know it's possible.
Phew! I think I'm finally getting somewhere. This has been a bit more complex than I'd originally planned for the site - but think it will lead to some exciting possibilities.
Another question 🤪:
When we state `max-txp-version-compatibility` - do we just state major versions (such as `4.8`) or actual semver (such as `4.8.1`)? What I mean is, do plugins break often in minor point release updates?
If we go for semver then that potentially gives us yet more work to maintain.
I think we're safe to just go for `major.minor`. If we discover a bug, we fix it in a point release, so we can pretty much state that '4.8 compatibility' is "any version in that series", as most people will reach the highest version in that series at some point - especially if they want the functionality in the plugin.
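A minimal sketch of that `major.minor` comparison (the function name is made up, and it leans on GNU `sort -V` for version ordering):

```shell
# Sketch: compare only the major.minor series, so a stated '4.8' covers
# any 4.8.x point release.
within_max() {
  # $1 = installed Txp version, $2 = card's stated max series (e.g. 4.8)
  inst=$(printf '%s\n' "$1" | cut -d. -f1-2)
  [ "$(printf '%s\n%s\n' "$inst" "$2" | sort -V | head -n1)" = "$inst" ]
}
```

Under this rule 4.8.1 and 4.7.3 both pass a stated max of 4.8, while 4.9.0 does not, which matches the "any version in that series" reading above.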
@petecooper Hi! Sorry, I'm not seeing the `library-of-plugins` folder in the root of the plugins site populating at the moment. Seems to be an empty directory?
Will check - stand by.
OK, GitHub went down between 3pm and 6pm, now fixed. Script should fire at quarter past, updated to include a check to make sure `README.md` exists from the cloned repo. If no `README.md`, no repave.
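That fail-safe might look something like this sketch (the clone-and-swap flow in the comment is an assumption, not Pete's actual `build.sh`):

```shell
# Sketch of the "no README.md, no repave" fail-safe. Intended flow:
#   git clone --depth 1 <repo-url> "$TMP" \
#     && sane_clone "$TMP" \
#     && rsync -a --delete "$TMP/library-of-plugins/" "$LIVE/"
# so a failed or empty clone never wipes the live copy.

sane_clone() {
  # refuse to repave from a checkout that is missing README.md
  [ -d "$1" ] && [ -f "$1/README.md" ]
}
```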
Great! This all seems to be working as planned now. Closing this issue as discussions can continue in other issues now. Thanks all!
There are a number of ways we can deploy the data from this repo, just need to decide which is best:
Each has its own benefits and drawbacks.