exdeejay closed this 2 years ago
Oh! This is super cool! Thank you for the contribution
This is a little bit outside my knowledge. I get the gist of what's going on and the purpose (to cache pages for offline viewing), but I'm not sure how I'd go about testing it without just merging it in and rolling it back if it doesn't work. Were you able to test it locally, and if so, how?
Glad you like it! I figured it would be better to explain the whole process, just for the heck of it.
`sw-register.js` (which I put in the head of the document) registers the service worker script `sw.js`. Both are static files, so as long as the web server serves both files, that's all you need to make it work.

The worker then goes through the standard service worker lifecycle:

1. The install stage (the `'install'` handler in `sw.js`) initially caches the 404 page and the offline page. If the file `sw.js` has changed since the last time it was downloaded, it will run this stage again and pause here until all web pages for the domain are closed.
2. The activate stage deletes any old caches that don't match `CACHE_NAME`, and the worker is now active. It won't do anything until the next refresh/navigation.
3. From then on, requests go through the `'fetch'` event handler. Any new requests now pass through the service worker.

Essentially, this results in any cached version being returned and then updated simultaneously in the background, so that page loads seem instant after the first visit, while still being updated on the second refresh in case an update to a page is made.
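To make the lifecycle concrete, here's a minimal sketch of what a `sw.js` along these lines looks like. The cache name, the precached paths, and the `isCacheable` helper are my assumptions for illustration, not the exact code in this PR:

```javascript
// sw.js -- minimal stale-while-revalidate sketch (assumed names/paths).
const CACHE_NAME = 'site-cache-v1';
const PRECACHE_URLS = ['/404.html', '/offline.html']; // assumed paths

// Only plain GET requests are worth caching in this scheme.
function isCacheable(request) {
  return request.method === 'GET';
}

// Guarded so the file is inert outside a service worker context.
if (typeof self !== 'undefined' && 'caches' in self) {
  self.addEventListener('install', (event) => {
    // Install stage: precache the 404 and offline pages.
    event.waitUntil(
      caches.open(CACHE_NAME).then((cache) => cache.addAll(PRECACHE_URLS))
    );
  });

  self.addEventListener('activate', (event) => {
    // Activate stage: delete caches left over from older worker versions.
    event.waitUntil(
      caches.keys().then((keys) =>
        Promise.all(
          keys.filter((k) => k !== CACHE_NAME).map((k) => caches.delete(k))
        )
      )
    );
  });

  self.addEventListener('fetch', (event) => {
    if (!isCacheable(event.request)) return;
    // Stale-while-revalidate: serve from cache immediately if possible,
    // while refreshing the cached copy from the network in the background.
    event.respondWith(
      caches.open(CACHE_NAME).then((cache) =>
        cache.match(event.request).then((cached) => {
          const network = fetch(event.request).then((response) => {
            cache.put(event.request, response.clone());
            return response;
          });
          // Fall back to the precached offline page if we're offline
          // and nothing is cached yet.
          return cached || network.catch(() => cache.match('/offline.html'));
        })
      )
    );
  });
}
```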
I did test it locally using the Hugo development server, and I debugged it with the Service Workers pane under the Application tab of the dev tools (though if you need to debug it, I recommend putting print statements everywhere in the code). There's a bit of code in `sw-register.js` that disables it on `localhost` so that you don't have to do the whole refresh-refresh thing when developing locally; I disabled that check while I was testing the service worker. It should be as simple as just pushing it to the web server, hopefully. Bear in mind it only works over HTTPS, with the exception of the `localhost` origin, although that shouldn't be a problem in this case.
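For reference, the registration side with that localhost check can be sketched like this (the `shouldRegister` helper and the host list are assumptions for illustration; the actual `sw-register.js` may differ):

```javascript
// sw-register.js -- registration sketch with a dev-mode escape hatch.
const DEV_HOSTS = ['localhost', '127.0.0.1'];

// Skip registration on localhost so local development doesn't require
// the refresh-refresh dance to pick up changes.
function shouldRegister(hostname) {
  return !DEV_HOSTS.includes(hostname);
}

// Service workers only work over HTTPS (the localhost origin excepted),
// and older browsers lack support, hence the feature check.
if (typeof navigator !== 'undefined' && 'serviceWorker' in navigator) {
  if (shouldRegister(location.hostname)) {
    navigator.serviceWorker.register('/sw.js');
  }
}
```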
As for rolling it back if something goes wrong: service workers are a bit tough to roll back, since they're meant to be resilient to being removed or taken offline. I found this article that has some code that could just be dropped into `sw-register.js` (or any file) to delete all service workers on the next (or rather, next-next) visit to any page. Also, if anything with the cache goes wrong, you could just increment the version number in the cache name, effectively deleting any old caches.
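Both rollback options can be sketched roughly like this. `unregisterAll` and `bumpCacheVersion` are names I made up for illustration, not the article's or the PR's actual code:

```javascript
// Rollback option 1: increment the version suffix in the cache name so the
// activate stage of the next worker deletes all the old caches.
function bumpCacheVersion(cacheName) {
  // e.g. 'site-cache-v1' -> 'site-cache-v2'
  return cacheName.replace(/v(\d+)$/, (_, n) => 'v' + (Number(n) + 1));
}

// Rollback option 2: the kill switch. Unregister every service worker so
// that, after the next-next visit, pages load without any worker at all.
function unregisterAll() {
  if (typeof navigator === 'undefined' || !('serviceWorker' in navigator)) {
    return Promise.resolve([]);
  }
  return navigator.serviceWorker.getRegistrations().then((registrations) =>
    Promise.all(registrations.map((registration) => registration.unregister()))
  );
}

unregisterAll();
```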
Reference: https://web.dev/offline-cookbook/
Just tested it and it works beautifully!
Thank you for submitting a pull request!
Before I (Vega) can merge your content I just need to deal with how to license and credit your contribution.
First things first,
Not checking this box does not mean I will deny your pull request outright, it just means we may need to have a look at how likely we are to be called out on using whatever content you've used.
Pick one of these 3 Options:
1) Public Domain
[x] I agree to put my contribution into the Public Domain (or CC-0, if that doesn't apply where you are)
[x] (not required) I would like to be credited as: DJ on the OpGuides Homepage
And Done!
If you have any questions about the license or have any concerns about your PR, please don't hesitate to ask!