While I think that IPFS is an incredibly interesting idea, I don't think that this is something that makes a lot of sense for pip itself. As I understand it, it would require some sort of persistent pip process to handle the p2p nature of it (since without a running process, someone can't be a peer and upload to others), but pip doesn't run as a persistent process; it only runs temporarily.
Consider that the running pip would fetch from an IPFS link, which would be the same as a URL link with an associated file hash.
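As a rough illustration of that idea, here is a minimal sketch that fetches a distribution through a public IPFS HTTP gateway over plain HTTP and verifies it against a pinned sha256 hash, much like pip's existing `--hash` checking; no persistent daemon is involved. The gateway URL, CID, and hash below are placeholders, not real artifacts.

```python
# Sketch: fetch a file addressed by an IPFS CID via an HTTP gateway and
# verify it against an expected sha256, the way pip verifies hashed
# requirements. CID and hash are hypothetical placeholders.
import hashlib
import urllib.request

GATEWAY = "https://ipfs.io/ipfs/"         # any public or local gateway
CID = "QmExampleCidOfAWheelFile"          # hypothetical content identifier
EXPECTED_SHA256 = "0" * 64                # hypothetical pinned hash

def fetch_and_verify(cid: str, expected_sha256: str) -> bytes:
    """Download the content for a CID and check its sha256 digest."""
    with urllib.request.urlopen(GATEWAY + cid) as resp:
        data = resp.read()
    digest = hashlib.sha256(data).hexdigest()
    if digest != expected_sha256:
        raise ValueError(f"hash mismatch: got {digest}, expected {expected_sha256}")
    return data

# wheel_bytes = fetch_and_verify(CID, EXPECTED_SHA256)
```

Because the gateway speaks ordinary HTTP, this style of fetch fits pip's existing one-shot, non-daemon model; only hosts that choose to run an IPFS node would act as peers.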
Just for reference, I am starting an initiative to bring IPFS to the Python packaging ecosystem. Any feedback or help is welcome!
The initial proposal is here: https://github.com/AuHau/dpip/issues/1
Just out of curiosity, what do you think about directly depending on pip as a package and extending its capabilities, versus calling pip as an external dependency (basically spawning a subprocess)?
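For context, a minimal sketch of the two integration styles being asked about is below. pip's internals are not a supported, stable API (the internal module path used here has moved between releases), and pip's documentation recommends invoking it as a subprocess via `python -m pip`; the import route is shown only for comparison.

```python
# Sketch comparing the two approaches: shelling out to pip vs. importing it.
import subprocess
import sys

def install_via_subprocess(package: str) -> None:
    """Invoke pip as an external process (the supported approach)."""
    subprocess.check_call([sys.executable, "-m", "pip", "install", package])

def install_via_import(package: str) -> None:
    """Call pip's internal entry point directly (unsupported; may break
    between pip releases, and pip is not designed to be re-entrant)."""
    from pip._internal.cli.main import main as pip_main
    pip_main(["install", package])

# install_via_subprocess("requests")
```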
Did anyone ever consider using ipfs.io (the InterPlanetary File System) as the backbone of pip?
IPFS has the capabilities of git but adds p2p into the mix, so it effectively swarms the availability of packages. https://ipfs.io/
It would enable
It would probably make more sense for Docker to integrate this under the hood, as that would nip the problem in the bud, but a local pip-based install which could be enabled would revolutionize system setups. https://github.com/ipfs/ipfs
It would probably resemble the Transmission UI, as the host computer would become a seeder. Needs planning / thinking through.
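For the seeding side, a minimal sketch follows, assuming a local IPFS daemon (such as Kubo/go-ipfs) is running with its HTTP API on the default port 5001. Pinning a CID is what would keep a host re-serving a package to other peers; the CID here is again a placeholder.

```python
# Sketch: ask a local IPFS daemon to pin a CID so this host keeps the
# content available to other peers (the "seeder" role described above).
# Assumes the daemon's HTTP API is at 127.0.0.1:5001.
import urllib.request

API = "http://127.0.0.1:5001/api/v0"
CID = "QmExampleCidOfAWheelFile"  # hypothetical content identifier

def pin_package(cid: str) -> bytes:
    """Pin the given CID via the daemon's HTTP API (POST /pin/add)."""
    req = urllib.request.Request(f"{API}/pin/add?arg={cid}", method="POST")
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# print(pin_package(CID))
```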