User tags a site (domain?) to save offline. Every time the user visits that site, the page and all assets that are fetched will be saved for offline viewing.
User can view list of sites saved offline and click to view a site.
Super Bonus: User can view previous versions of a site. Kinda like your own personal offline wayback machine!
For example, a user tags wikipedia.org for offline viewing. Then every page visited on that site can be saved for offline viewing. (Not sure if it'd be single URL -> dat or single domain -> dat.)
Saving the site to a Dat will make it so you can use the existing Dat viewer to view the site (I think you can use local assets in that viewer too?).
I did a prototype a few months ago with web-to-dat, minus the whole Beaker part. If you can get a list of URLs then you might be able to use url-dat to put them into an archive.
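One detail any web-to-dat approach has to settle is how fetched URLs map onto file paths inside the archive. Here's a minimal sketch of that mapping, assuming one archive per domain; `urlToArchivePath` is a hypothetical helper, not part of url-dat's actual API:

```javascript
// Map a crawled URL to a file path inside a per-domain archive.
// Hypothetical helper; not an existing url-dat/Beaker API.
function urlToArchivePath(urlString) {
  const u = new URL(urlString);
  let path = u.pathname;
  // Directory-style URLs get an index file so they're viewable as files.
  if (path.endsWith('/')) path += 'index.html';
  // Fold the query string into the filename so distinct resources don't collide.
  if (u.search) path += '__' + encodeURIComponent(u.search.slice(1));
  return path;
}

console.log(urlToArchivePath('https://wikipedia.org/'));         // "/index.html"
console.log(urlToArchivePath('https://wikipedia.org/wiki/Dat')); // "/wiki/Dat"
```

The query-string handling is one reason single URL -> dat vs. single domain -> dat matters: a domain-level archive has to disambiguate dynamic resources somehow, while a per-URL archive sidesteps it.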
It may be useful to share sites with peers. But it seems like you'd need a DNS-type thing for mapping dat links to web URLs (or manual sharing). I can't quite see this being really good until we do multi-writer hyperdrives (so both peers could update the offline site).
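At its simplest, that DNS-type thing could just be a local registry keyed by web origin. A rough sketch, where every name (`register`, `resolve`, the dat key) is hypothetical and not an existing Beaker/Dat API:

```javascript
// Local registry mapping a web origin to the dat link holding its offline copy.
// All names here are hypothetical placeholders, not a real Beaker/Dat API.
const registry = new Map();

function register(webOrigin, datLink) {
  // Normalize to the origin so any URL on the site resolves to the same archive.
  registry.set(new URL(webOrigin).origin, datLink);
}

function resolve(webUrl) {
  return registry.get(new URL(webUrl).origin) || null;
}

register('https://wikipedia.org', 'dat://example-archive-key');
console.log(resolve('https://wikipedia.org/wiki/Dat')); // "dat://example-archive-key"
console.log(resolve('https://example.com/'));           // null
```

For sharing between peers you'd want this mapping published somewhere both sides can see (or swapped manually), which is where the multi-writer limitation bites.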