jbenet opened 7 years ago
cc @ipfs/javascript-team
The better way is to use the URN notation approach rather than going through a gateway. URN is a valid (although I'm not sure how well supported in browsers) approach to linking. An example of a link would be:
<a href="urn:ipfs:<hash>">
rather than the gateway, i.e.:
<a href="https://ipfs.io/ipfs/<hash>">
This would require ipfs to be registered as a URN namespace with IANA, much like isbn is a registered URN namespace, and then registering the ipfs protocol to handle that namespace. See: https://www.iana.org/assignments/urn-namespaces/urn-namespaces.xhtml.
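Until such a handler exists, a page script would have to resolve these URNs itself. A minimal sketch (the helper name and gateway fallback are my own assumptions, since nothing is registered yet):

```javascript
// Hypothetical helper: turn a urn:ipfs:<hash> link into a gateway URL
// so browsers without a registered urn:ipfs handler can still resolve it.
function urnToGatewayUrl(urn, gateway = 'https://ipfs.io') {
  const match = /^urn:ipfs:(.+)$/i.exec(urn);
  if (!match) return null; // not an ipfs URN
  return `${gateway}/ipfs/${match[1]}`;
}
```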
Another way would be to add a data-ipfs attribute to an <a> tag and use that to "ipfsify" a download.
<a href="https://ipfs.io/ipfs/<hash-of-file-4>" data-ipfs>File 4</a>
Then the JS library would grab everything with the data-ipfs attribute and intercept clicks.
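A minimal sketch of that interception (function names are illustrative, and the actual js-ipfs fetch is left as a callback):

```javascript
// Extract the content hash from a gateway-style href such as
// https://ipfs.io/ipfs/<hash>; returns null for non-ipfs links.
function hashFromGatewayHref(href) {
  const match = /\/ip[fn]s\/([^\/?#]+)/.exec(href);
  return match ? match[1] : null;
}

// Grab every link marked with data-ipfs and intercept clicks,
// falling through to the normal href if we cannot handle it.
function ipfsifyLinks(fetchFromIpfs /* e.g. a js-ipfs-backed fetcher */) {
  document.querySelectorAll('a[data-ipfs]').forEach((a) => {
    a.addEventListener('click', (event) => {
      const hash = hashFromGatewayHref(a.href);
      if (!hash) return;      // let the browser follow the link
      event.preventDefault(); // we take over the download
      fetchFromIpfs(hash);
    });
  });
}
```

The href stays a plain gateway URL, so the link keeps working if the script never runs.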
> try to use a local ipfs daemon to fetch things (local to the browser or the OS)
Don't think we can do this unless folks run the daemon with an API that accepts requests from that origin, which I'm not sure is a good idea.
Otherwise I think this is an amazing idea and would make it very easy for people to use js-ipfs to download things straight in the browser. One question that comes to mind is how to handle the download UI. If we're using js-ipfs, we need to either 1) hook into the browser's download mechanism or 2) provide UI to show the download progress, since there won't be any loading indicator or such during the download.
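For option 2, one way to surface progress (a sketch; the byte stream here would be whatever the js-ipfs fetch hands back, and the progress callback is hypothetical) is to count bytes as chunks arrive:

```javascript
// Consume a ReadableStream of bytes, reporting progress after each
// chunk. onProgress receives the running byte count; the function
// resolves with the total number of bytes read.
async function downloadWithProgress(stream, onProgress) {
  const reader = stream.getReader();
  let received = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    received += value.length;
    onProgress(received); // e.g. update a progress bar element
  }
  return received;
}
```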
There are examples of a service worker catching requests to resources ending with .jpg, testing whether the browser accepts .webp, and rewriting the request to use .webp (that is, when you know your backend serves webp).
I could imagine something similar for ipfs links.
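A sketch of what that analogous rewrite might look like, assuming a local gateway at 127.0.0.1:8080 (the function and the fallback behaviour are illustrative, not an existing ipfs API):

```javascript
// Rewrite a public-gateway URL to a local gateway, preserving the
// /ipfs/ or /ipns/ path; returns the URL unchanged if it is not one.
function rewriteToLocalGateway(url, local = 'http://127.0.0.1:8080') {
  const match = /\/(ip[fn]s)\/(.+)$/.exec(new URL(url).pathname);
  return match ? `${local}/${match[1]}/${match[2]}` : url;
}

// In a service worker this could drive the interception, e.g.
// (error handling / fall-back to the network omitted for brevity):
//
// self.addEventListener('fetch', (event) => {
//   const rewritten = rewriteToLocalGateway(event.request.url);
//   if (rewritten !== event.request.url) {
//     event.respondWith(fetch(rewritten));
//   }
// });
```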
Can you link it?
I have no experience but read occasionally about it. Afaik:
- http://caniuse.com/#search=service%20worker
- http://caniuse.com/#feat=sharedworkers
- https://jakearchibald.github.io/isserviceworkerready/
- https://developer.mozilla.org/en-US/docs/Web/API/Service_Worker_API/Using_Service_Workers
Robust Links (https://github.com/mementoweb/robustlinks) does something similar to what @VictorBjelkholm mentioned but annotates links with the data-versionurl and data-versiondate HTML attributes to direct a user to an archived source of the URI for the scenario where the href'd URI is no longer available on the live Web.
As a caveat to @thisconnect's service worker idea, the initial scope dictates where the serviceWorker is applied. For example, if https://tripod.com/~someuser/mypage.html sets a service worker, the SW will not be run on anything outside of the scope of https://tripod.com/~someuser/. If a SW is set on https://tripod.com, it will also be run on https://tripod.com/~someuser/. SWs also run only on https* per https://developer.mozilla.org/en-US/docs/Web/API/Service_Worker_API.
* localhost is also allowed for SWs but not useful here. We use a localhost SW in https://github.com/oduwsdl/ipwb to get around the https requirement.
Any updates on this?
We've often discussed how to make it easy to download things with IPFS, particularly in the "almost all http" setting (eg a website served over http but that can serve js to a browser). Today I saw this archive: https://mcarchive.net/mods/ee which has "IPFS Download" buttons and figured that:
(1) it would be great if clicking that link would try to download the content with IPFS, falling back to the plain https://ipfs.io/ipfs/... link in case the browser does not run js (and so it still works with curl/wget archiving); and (2) we had a small library that did all of the above for users, so that users only had to include a js file and it would automagically try its best to download "ipfs links" with IPFS.
- https?://ipfs.io/ip[fn]s/
- or dweb://...
- or ipfs://...
- or fs://...

... etc. (we should make a short spec for this...)
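Absent that spec, a first approximation of a matcher for the shapes above (the protocol list is just what this thread mentions, nothing official):

```javascript
// Rough test for the "ipfs link" shapes listed above: ipfs.io gateway
// URLs plus the dweb://, ipfs://, and fs:// protocol forms.
const IPFS_LINK = /^(?:https?:\/\/ipfs\.io\/ip[fn]s\/|(?:dweb|ipfs|fs):\/\/)/i;

function isIpfsLink(href) {
  return IPFS_LINK.test(href);
}
```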