ipfs / infra

Tools and systems for the IPFS community
MIT License

Deploy npm-on-ipfs to container service #423

Closed achingbrain closed 6 years ago

achingbrain commented 6 years ago

The npm-on-ipfs project has a server component that users can run locally to pull their dependencies from IPFS, but a better user experience would be to let them simply do:

$ npm config set registry https://registry.ipfs.io
$ npm install

In that case the server component needs to be deployed somewhere. Ideally we'd have a set of Docker images of the server running on some sort of container service (ECS, for example), with DNS set up to resolve registry.ipfs.io or similar, and a load balancer in front of it all to terminate SSL and distribute requests across the containers.
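As a rough sketch of the load-balancer piece, an nginx config that terminates SSL and round-robins requests across registry containers could look like this (the upstream addresses, port, and certificate paths are all placeholders, not anything deployed):

```nginx
# Hypothetical SSL-terminating load balancer for registry.ipfs.io.
upstream registry_containers {
    # Addresses of the npm-on-ipfs server containers (placeholders).
    server 10.0.0.10:8080;
    server 10.0.0.11:8080;
}

server {
    listen 443 ssl;
    server_name registry.ipfs.io;

    # Certificate paths are placeholders.
    ssl_certificate     /etc/ssl/registry.ipfs.io.crt;
    ssl_certificate_key /etc/ssl/registry.ipfs.io.key;

    location / {
        proxy_pass http://registry_containers;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```

With this in front, the containers themselves only ever speak plain HTTP and can be scaled by adding entries to the upstream block (or by pointing it at a service-discovery mechanism once one exists).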

ghost commented 6 years ago

Agreed! Let's get a continuously updated NPM dataset up first though, and make sure that part is reasonably stable and fast (unless you already have other plans).

achingbrain commented 6 years ago

The server components update their datasets but don't publish anything yet, as the necessary support isn't in js-ipfs. Once it lands, they'll publish their hashes.

achingbrain commented 6 years ago

In the meantime it'd be really helpful if we could get this deployed somewhere.

achingbrain commented 6 years ago

This is blocking one of my Q3 OKRs. Do we have an account with a container service somewhere? Or can I set one up?

ghost commented 6 years ago

We'll have something for you by the end of the week. There are multiple people who need hosts "right now" :) and we're figuring out how to do this sustainably.

ghost commented 6 years ago

David said you need a lot of storage, but you said you only wanna run the registry container, so could you clarify what you need? Do you want to build the NPM dataset on that host and serve it from there as well? (The two don't have to be on the same machine.)

achingbrain commented 6 years ago

Each instance has a blob store that manages its own IPFS node, which stores and updates the dataset; that lives in the container by default but could live on S3. The instances could all talk to a remote IPFS daemon instead, but that won't scale as well.

ghost commented 6 years ago

Created an i3.2xlarge instance in AWS: 35.178.192.119 (with TCP ports 22, 80, 443, 4001 open)

In the not-too-distant future we'll have container infrastructure and proper Terraform templates and automation, but in the meantime this was done by hand.
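Once that automation lands, the hand-built host above could be captured in a Terraform sketch along these lines. Only the instance type and the open ports come from this thread; the region, AMI, and resource names are assumptions for illustration:

```hcl
# Sketch of the hand-built host as Terraform (AMI, region, names are placeholders).
provider "aws" {
  region = "eu-west-2" # assumption only; not stated in the thread
}

resource "aws_security_group" "registry" {
  name = "npm-registry"

  # The ports opened on the hand-built instance: SSH, HTTP, HTTPS, IPFS swarm.
  dynamic "ingress" {
    for_each = [22, 80, 443, 4001]
    content {
      from_port   = ingress.value
      to_port     = ingress.value
      protocol    = "tcp"
      cidr_blocks = ["0.0.0.0/0"]
    }
  }

  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}

resource "aws_instance" "registry" {
  ami                    = "ami-00000000" # placeholder AMI
  instance_type          = "i3.2xlarge"
  vpc_security_group_ids = [aws_security_group.registry.id]
}
```

An i3.2xlarge's local NVMe storage suits the "dataset lives in the container" setup described above; moving the blob store to S3 would let a smaller instance type serve the same role.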