mx5kevin opened this issue 4 years ago · Status: Open
There are a few of these proxies in places, but they are few and far between.
It would be harder to create a central/default proxy for dynamic content, but this is something similar:
https://go.zeronet.io/#Site:1HeLLo4uzjaLetFx6NH3PMwFP3qbRbTf3D
If anyone has more working proxies, they can be added here: https://github.com/HelloZeroNet/zeronet.io/blob/master/go.html#L42
**Is your feature request related to a problem? Please describe.**
IPFS and Tor have gateway projects such as Tor2Web. These projects allow pages running on a closed network to be reachable from the open web as well, and the open web search engines (Google, Yahoo, DuckDuckGo, etc.) index the page content. It would be important for open web crawlers to index the content of zites too. Users should be able to visit zites without the ZeroNet software (for example a video, a document, etc.), and if the content is not available, they should be encouraged to download the ZeroNet software so they can get it later, when a user who has the file comes online.
This is what an IPFS page looks like:
https://ipfs.io/ipfs/QmXoypizjW3WknFiJnKLwHCnL72vedxjQkDDP1mXWo6uco/wiki/
This is what a Tor onion site looks like on the clearnet:
https://xmh57jrzrnw6insl.onion.ws/
**Describe the solution you'd like**
IPFS has a working solution: the point is that there is one central link, https://ipfs.io/ipfs/<hash>, and all IPFS pages can be accessed from the open net through that link. Even if a page consists of large files, everything works on it. Previously fetched content stays available for up to 24 hours even if the server the page runs from is offline, and the pages are indexed by the major search engines (Google, Yahoo). What is not available can be added to the client and thus becomes available later. There is a similar option on the Tor network via the .onion.to, .onion.cab and similar gateways.

There used to be similar solutions for ZeroNet, but on different links (https://website1.something/<hash>, https://website2.something/<hash>, etc.), and these servers shut down one after another and are often not available. We need a central link where all pages would be available, so they could be found with ordinary clearnet search engines and browsers. A working solution is important because people are interested in sites that are reachable from the open internet. Zeronet.link is good for recommending that people download the software, but none of the page content is indexed by normal internet search engines. We need a solution where, if the content is available, it is reachable through a central link on the open web; if it is not available (for example a large file), the visitor should be offered the ZeroNet software so they can get it later, when it becomes available. At this time, no content running on ZeroNet can be accessed from the open web.
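To make the idea concrete, here is a minimal sketch (not an existing ZeroNet feature) of what such a central gateway could do: a small clearnet HTTP server that forwards https://zeronet-gateway.example/<zite-address>/<path> to a local ZeroNet client on the default UI port 43110. The gateway host name, the listening port and the fallback message are placeholders/assumptions.

```python
# Minimal gateway sketch: relay clearnet requests to a local ZeroNet client.
# Assumes a ZeroNet client is running locally on the default UI port 43110;
# the listening port (8080) and the fallback text are placeholders.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

ZERONET_UI = "http://127.0.0.1:43110"  # local ZeroNet client UI

class GatewayHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Map /<zite-address>/<path> onto the local ZeroNet UI, which
        # serves zites under the same URL layout.
        req = Request(ZERONET_UI + self.path, headers={"Accept": "text/html"})
        try:
            with urlopen(req, timeout=30) as resp:
                body = resp.read()
                self.send_response(resp.status)
                self.send_header("Content-Type",
                                 resp.headers.get("Content-Type", "text/html"))
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
        except HTTPError as err:
            # Content the local client has not downloaded (yet): instead of a
            # bare error, point the visitor at the ZeroNet software.
            msg = b"Content unavailable here - install ZeroNet to fetch it later."
            self.send_response(err.code)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(msg)))
            self.end_headers()
            self.wfile.write(msg)
        except URLError:
            self.send_error(502, "Local ZeroNet client is not reachable")

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), GatewayHandler).serve_forever()
```

In practice the gateway would probably also need to strip or pre-render the ZeroNet wrapper so crawlers see the real page content, but the central routing idea is the same.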
**Problems with the current system**
- https://website1.something/<hash>, https://website2.something/<hash>, etc. are often not available; there is no stable central link through which web pages can be indexed uniformly.
- Currently, out of the many such links, none of them work.
- They cannot serve larger pages or files.
- The HTML does not appear correctly (title, description) in the clearnet search engines (see the small check sketched after this list).
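On the last point, one way to see the problem is to fetch a page the way a crawler would and check whether the served HTML actually contains a `<title>` and a meta description. A small sketch of such a check; the gateway URL below is a hypothetical placeholder:

```python
# Fetch a page as a crawler would and report whether the HTML carries
# a <title> and a meta description. The URL is a placeholder.
from html.parser import HTMLParser
from urllib.request import urlopen

class MetaCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if (a.get("name") or "").lower() == "description":
                self.description = a.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

url = "https://zeronet-gateway.example/1HeLLo4uzjaLetFx6NH3PMwFP3qbRbTf3D/"
parser = MetaCheck()
with urlopen(url, timeout=30) as resp:
    parser.feed(resp.read().decode("utf-8", errors="replace"))

print("title:", parser.title or "(missing)")
print("description:", parser.description or "(missing)")
```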
**Describe alternatives you've considered**
The IPFS gateway and the Tor2Web project.
https://www.tor2web.org/
Or see the IPFS decentralized Wikipedia
https://blog.ipfs.io/24-uncensorable-wikipedia/