ghost closed this issue 2 years ago
Can you maybe explain what all those hash values are and exactly how you created that list?
They're content IDs (CIDs) autogenerated by IPFS when you push data to the network. Anyone with the CID can access the files (https://ipfs.io/ipfs/QmZbSqT3mybJcUdkwsA1knNyrZwFcgfY9n8ftPyrk5kQzo). The IPFS daemon will mount the directory via FUSE, and Overpass will treat the files as though they were on disk.
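For concreteness, the flow looks roughly like this. This is a sketch, assuming a go-ipfs install with FUSE support and a running daemon; the `./overpass-db` directory name is just a placeholder:

```shell
# Add a directory to IPFS; -r recurses, -Q prints only the root CID.
CID=$(ipfs add -rQ ./overpass-db)   # ./overpass-db is a placeholder path
echo "$CID"                         # a 46-char base58 CIDv0, e.g. Qm...

# Mount the network read-only under /ipfs and /ipns via FUSE
# (requires the daemon to be running: `ipfs daemon`).
ipfs mount

# Anyone who knows the CID can now read the files as ordinary paths:
ls /ipfs/$CID
```

The same content is also reachable through any public gateway, which is where the link above comes from.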
That doesn't really answer my question. I wanted to know which data you pushed to the network here. Also, how do you expect this to be updated every day?
The list of files appears to be somewhat incomplete. I don't see any attic or meta files. Also, it's not clear when the data was created or who would take care of updating it.
In general, I don't think it's a good idea to refer to any data in this repo that's being managed by a third party. It will only result in lots of maintenance headaches.
Maybe this would be better served in your own repo, after all.
Yeah, the data is from a recent clone of https://dev.overpass-api.de/clone/2021-11-05.
More generally, one can publish the most recent version to their node via IPNS, so that the address won't change even if the underlying data gets updated daily. Hosting on IPFS would also be less resource-heavy for the maintainer, as the files get cached on more nodes the more people access them.
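A sketch of how that IPNS flow could work (assumes a running ipfs daemon; the snapshot path is illustrative, not the actual one used here):

```shell
# After each daily update, re-add the snapshot and republish it under
# this node's key. The new content gets a new CID, but the IPNS name
# (derived from the peer ID) stays constant across updates.
CID=$(ipfs add -rQ ./overpass-db)   # fresh CID after the daily refresh
ipfs name publish "/ipfs/$CID"

# Consumers always use the same stable address, /ipns/<peer-id>:
ipfs name resolve "/ipns/$(ipfs config Identity.PeerID)"
```

So the repo would only ever need to list one `/ipns/...` path, regardless of how often the data changes.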
Since cloning the existing database can take quite a while, it would be cool to host a copy of the DB on IPFS. Users with IPFS mounted can read directly from
/ipfs
and run queries without the need for a full transfer.
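As a sketch of what a consumer would do (assuming FUSE-mounted IPFS and a local Overpass install; `osm3s_query --db-dir` points the query tool at an arbitrary database directory, and the CID is the one linked above):

```shell
# Mount IPFS read-only via FUSE (daemon must be running).
ipfs mount

# Point Overpass's query tool straight at the snapshot on IPFS;
# blocks are fetched lazily from the network as they are read,
# so no full clone of the database is needed up front.
echo '[out:json];node(50.7,7.1,50.8,7.2);out;' | \
  osm3s_query --db-dir=/ipfs/QmZbSqT3mybJcUdkwsA1knNyrZwFcgfY9n8ftPyrk5kQzo
```

First queries would be slow while blocks are fetched, but repeated access benefits from local and network caching.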