Closed: konklone closed this issue 10 years ago
This is at work in the dat branch, forked from @keith5000's and @errolgrannum's original work at keith5000/USGlossary.
Whoa, the dat branch? Tell me more!
It's an homage, rather than a literal reference. :) The hook server fills the role of a blind data transformation pipe (even though the implementation is actually a bit intricate): it will make it so that every contribution of prose automatically gets published as JSON.
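As a rough illustration of that pipe, here's a minimal sketch of the prose-to-JSON step. The field names (`term`, `definition`, `source`) and the assumed file layout (a markdown heading followed by the definition) are my own guesses, not the project's actual schema:

```javascript
// Hypothetical sketch of the "blind pipe": turn one glossary prose file
// (a markdown heading on the first line, the definition below it) into a
// JSON record. Field names here are illustrative assumptions.
function proseToJson(filename, contents) {
  const lines = contents.trim().split("\n");
  return {
    term: lines[0].replace(/^#\s*/, ""), // strip a leading markdown heading
    definition: lines.slice(1).join("\n").trim(),
    source: filename,
  };
}
```

The hook server would run something like this over each changed file and commit the resulting JSON back.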
So far we have a tiny web server, and a tiny work-in-progress Node wrapper around the Github CRUD API.
I'll almost certainly spin that github.js off as its own little library, because none of the Github API wrappers for Node I could find actually supported the new CRUD API! So here I am doing it myself.
Ah, OK, that makes a lot more sense. :) Clever setup!
This is also kind of a fun exercise in development: I'm testing Github post-receive hooks against my laptop, which means I'm using @shtylman's localtunnel (coincidentally also written in Node) to make my laptop available on the public Internet for POSTing to:
```
$ lt --port 5000
your url is: http://XXXX.localtunnel.me
```
where `5000` is the port I'm running the Node app on in development.
Then I add the URL it gives me as a web hook in Github's settings (using a fork at konklone/glossary for testing), and I can easily muck around.
Whoa—that's pretty interesting. I can see how that'd be a pretty great way to test.
Oh, so I did this weeks ago. Closing.
Have a script that listens on an arbitrary port and can receive a Github POST from a hook. The hook should sweep through each valid glossary file and generate a CSV with the fields:
And then push this to the `gh-pages` branch, following the same directory structure as the `master` branch uses for prose. In the future, there could be fields for related definitions as expressed through a special "See also" paragraph, or a tags paragraph, but this is fine for now.
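The sweep-and-generate step could look something like this sketch. Since the issue's actual field list isn't shown above, the columns here (`term`, `definition`, `path`) are placeholders of my own:

```javascript
// Quote one CSV field and escape embedded quotes, per RFC 4180.
function toCsvRow(values) {
  return values.map((v) => `"${String(v).replace(/"/g, '""')}"`).join(",");
}

// Turn swept glossary records into one CSV string. The column set is a
// placeholder assumption, not the fields the issue specifies.
function glossaryToCsv(records) {
  const header = toCsvRow(["term", "definition", "path"]);
  const rows = records.map((r) => toCsvRow([r.term, r.definition, r.path]));
  return [header].concat(rows).join("\n");
}
```

The hook would run this over every valid glossary file it finds, then commit the result to `gh-pages` via the contents API.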