Closed tomlinton closed 5 years ago
@mouazq Would you be interested in working on this issue ? It would be pretty impactful since it will allow us to pin (e.g. "replicate") content that gets uploaded to our IPFS cluster into a separate 3rd party cluster that the Protocol Labs team made available for us.... Let us know :)
Hello, new here, would love to contribute and work on this issue :)
@tomlinton just a few questions.
You mentioned "any" IPFS hashes. So, is it possible to have empty media arrays? If the url does exist, will it always be an IPFS location, or can it be of another format? I saw references to allowing `data` and `dweb` urls as well.
For the script itself, will the HTTP endpoint of the cluster (e.g. 127.0.0.1:9094) be an input to the script (e.g. as an environment variable)? If not, is there a default endpoint to assume? Just realized Protocol Labs provides a url, so I will access it through an env variable.
We currently allow data urls in user profiles, but in listings, the media urls should only be IPFS.
Also on the URL front: it uses HTTP basic auth, so there should be env vars for the username/password. Thanks for your work on this, looking great so far!
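The basic-auth requirement above could be handled roughly like this. This is only a sketch; the env var names (`IPFS_CLUSTER_USERNAME`, `IPFS_CLUSTER_PASSWORD`) are assumptions, not the project's actual configuration:

```javascript
// Hypothetical sketch: build the HTTP basic auth header for the
// ipfs-cluster REST API from environment variables.
// Env var names are assumed, not confirmed by the thread.
function clusterAuthHeader(env = process.env) {
  const user = env.IPFS_CLUSTER_USERNAME;
  const pass = env.IPFS_CLUSTER_PASSWORD;
  if (!user || !pass) {
    throw new Error('IPFS cluster credentials are not configured');
  }
  // HTTP basic auth is base64("user:pass")
  const token = Buffer.from(`${user}:${pass}`).toString('base64');
  return { Authorization: `Basic ${token}` };
}
```

The returned object can be spread into the headers of whatever HTTP client the script ends up using.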
@rohitverma007
Did you work on any part of it yet? We can pair-program it if you want.
@mouazq Yes, I did get started on it, although have been delayed a bit due to holidays. I hope to finish it off or get majority of it done in the next few days.
So far, I added a new file under the utils folder in the deployment folder. This service connects to the ipfs-cluster API, and I've added the "pinning" functionality: https://github.com/OriginProtocol/origin/commit/5adab2c51ccd2b9e3abe046662916d0576b6d196 I'm currently getting familiar with Google Cloud Functions and Pub/Sub and getting started on that.
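For reference, the two cluster operations the script needs map onto the ipfs-cluster REST API as `POST /pins/{hash}` (pin) and `DELETE /pins/{hash}` (unpin). A minimal sketch of building those requests, assuming a hypothetical `IPFS_CLUSTER_URL` env var for the endpoint:

```javascript
// Hypothetical sketch: describe a pin or unpin request against the
// ipfs-cluster REST API. IPFS_CLUSTER_URL is an assumed env var name;
// the default matches the example endpoint mentioned earlier in the thread.
function pinRequest(hash, method, env = process.env) {
  const base = env.IPFS_CLUSTER_URL || 'http://127.0.0.1:9094';
  return {
    url: `${base}/pins/${hash}`,
    method, // 'POST' to pin, 'DELETE' to unpin
  };
}
```

The actual implementation would pass this, along with the basic-auth header, to an HTTP client and check the response status.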
Hi all,
Just an update: the feature is almost done.
Here's the latest code: https://github.com/OriginProtocol/origin/compare/rohit/ipfs-cluster-api
I made an assumption about the data sent by the publisher: that the publisher simply passes the raw listing json to the function, one listing per publish. The following shows the data that will be sent to the topic: https://github.com/OriginProtocol/origin/blob/af1d3339bab59aaaf028e791ace5e424c4034bb6/deployment/utils/gcf-pin-ipfs-hash/publisher.js#L11-L12
Let me know if that's incorrect so I can modify it accordingly. Also, let me know if there's anything else missing or anything else that doesn't look right.
Hi @rohitverma007 - Thanks for your work on this! :)
Also, feel free to start a PR so that it is easier to comment on the code. Thanks again...
Thanks for the feedback @franckc .
Updated the incoming data format https://github.com/OriginProtocol/origin/blob/5fb19090e684c6f503b8ebd2e55c13377163d537/deployment/utils/gcf-pin-ipfs-hash/index.js#L9-L18 as an attempt to generalize it more; hopefully it covers all cases. The listing/offer hash can be placed in `ipfsHash` and the listing/offer json can be placed in `rawData`.
Created the pr #1215, feel free to comment there.
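For anyone following along: a Pub/Sub-triggered Cloud Function receives its payload base64-encoded in the message's `data` field, so decoding the `{ ipfsHash, rawData }` format above looks roughly like this (the field names follow the linked code; the rest is a sketch):

```javascript
// Sketch: decode the { ipfsHash, rawData } payload from a Google Cloud
// Pub/Sub message. Pub/Sub delivers the payload base64-encoded in
// message.data; field names follow the linked commit.
function parsePubSubMessage(message) {
  const json = Buffer.from(message.data, 'base64').toString('utf8');
  const { ipfsHash, rawData } = JSON.parse(json);
  return { ipfsHash, rawData };
}
```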
Done #1215, thanks @rohitverma007!
We need a script that can process a listing json and extract any IPFS hashes, and then use a REST API provided by `ipfs-cluster` to pin the hashes. The documentation for an `ipfs-cluster` API client written in Go is available here, but we will need a JavaScript implementation that supports the `pin` and `unpin` methods. Some API documentation for `ipfs-cluster` is available here.

The script will run as a Google Cloud Function and be passed JSON data with IPFS hashes from the `event-listener` via Google Cloud Pub/Sub. We have an example of a similar function here which does Discord announcements from Google Cloud Build events. This might help someone who wants to take this on.

Related discussion #912.