simonw closed this issue 4 years ago
When I do this I should move the client secret to a GitHub secret after resetting it on https://github.com/settings/applications/1263558 (so the key in the git history is no longer active).
An interesting challenge will be deciding whether the deploy should run or not.
Another interesting challenge: I need to download the existing database from somewhere (so I can see what the last ETags were), but the instance is authenticated so I can't just download it directly.
I may need a custom authentication plugin that allows a special Authorization header just for this download.
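A minimal sketch of the token check such a plugin could perform - in a real Datasette plugin this logic would live inside an `actor_from_request` plugin hook; the function name and the `deploy-bot` actor are illustrative, not the actual plugin:

```python
import secrets


def actor_from_authorization(header, expected_token):
    """Return an actor dict if the Bearer token matches, else None.

    header is the raw Authorization header value;
    expected_token is the configured secret (e.g. from DATASETTE_TOKEN).
    """
    if not expected_token or not header:
        return None
    prefix = "Bearer "
    if not header.startswith(prefix):
        return None
    # Constant-time comparison to avoid leaking the token via timing
    if secrets.compare_digest(header[len(prefix):], expected_token):
        return {"id": "deploy-bot", "token_auth": True}
    return None
```

Returning an actor dict from `actor_from_request` is how Datasette plugins mark a request as authenticated; returning `None` falls through to the other auth plugins.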
Maybe hash all of the `.db` files (`md5 *.db | md5`) and compare that to the version_note?
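The hashing idea sketched in Python - hash each file, then hash the sorted list of per-file digests, roughly mirroring `md5 *.db | md5` (the function name is made up):

```python
import glob
import hashlib


def combined_db_hash(pattern="*.db"):
    """One fingerprint covering every file matching pattern."""
    digests = []
    for path in sorted(glob.glob(pattern)):
        with open(path, "rb") as f:
            digests.append(hashlib.md5(f.read()).hexdigest())
    # Hash the concatenated per-file digests into a single value
    return hashlib.md5("".join(digests).encode()).hexdigest()
```

The resulting fingerprint could then be compared against the deployed instance's version_note to decide whether the deploy needs to run at all.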
I'm going to build a one-off plugin for the token authentication, and maybe productize it later.
Make sure to use `datasette-auth-github~=0.12` to get this fix: https://github.com/simonw/datasette-auth-github/issues/59
Add some plugins - vega and cluster-map and the like.
I have everything I need to write the action now.
I've added the necessary Cloud Run secrets to this repo.
Setting four more secrets:

- `BIGLOCAL_TOKEN` - for hitting their GraphQL API
- `GITHUB_CLIENT_SECRET`
- `GITHUB_CLIENT_ID`
- `DATASETTE_TOKEN` - the secret token we use for accessing the Datasette API

I saved copies of these in a secure note in 1Password.
I also rotated the `GITHUB_CLIENT_SECRET` for https://github.com/settings/applications/1263558
I had to go with `GH_CLIENT_ID` because `GITHUB_CLIENT_ID` is an "invalid key name" - GitHub reserves secret names that start with `GITHUB_`.
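For reference, the deploy step in the action could wire those secrets into the environment roughly like this (a sketch assuming GitHub Actions syntax; the step name and `deploy.sh` are made up, and the secret names follow the list above with `GH_CLIENT_ID` per the rename):

```yaml
# Hypothetical deploy step - exposes the secrets above as environment variables
- name: Deploy to Cloud Run
  env:
    BIGLOCAL_TOKEN: ${{ secrets.BIGLOCAL_TOKEN }}
    GITHUB_CLIENT_ID: ${{ secrets.GH_CLIENT_ID }}
    GITHUB_CLIENT_SECRET: ${{ secrets.GITHUB_CLIENT_SECRET }}
    DATASETTE_TOKEN: ${{ secrets.DATASETTE_TOKEN }}
  run: ./deploy.sh
```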
The good news: it worked! https://biglocal.datasettes.com/
I realize I need to download ALL of the current database files, not just the `biglocal.db` one. I can loop through the paths in https://biglocal.datasettes.com/-/databases.json and pull them one at a time.
```
$ cat databases.json | jq '.[].path' -r
biglocal.db
blah.db
```
```
cat databases.json | jq '.[].path' -r | while read path; do
  curl -s -H "Authorization: Bearer $DATASETTE_TOKEN" \
    "https://biglocal.datasettes.com/$path" -o "$path"
done
```
This works now.
Depends on #4.