RichardLitt / 58-liters

A place to talk about what's in your bag
https://58liters.com
MIT License

Implement image storage system #30

Open RichardLitt opened 7 years ago

RichardLitt commented 7 years ago

Is a static site the best way to store shit tonnes of photos? Or is there a better way?

hackergrrl commented 7 years ago

I like to keep big image data separate from the git repo, to make clones fast even on slow connections.

Does IPFS make sense? In the future, maybe if we ran an IPFS daemon with a webhook that looked for new IPFS hashes in the repo and automatically mirrored them.

RichardLitt commented 7 years ago

Agreed. Another good reason to remove these from the .git repo.

IPFS would make sense. You'd know how to set that up better than me, I think. The only issue is pinning - but I guess we can have a backup bucket on AWS or something to make sure it doesn't go down?

hackergrrl commented 7 years ago

Hm, good point: maybe something simpler would make more sense. Any solution so advanced that not everyone can readily maintain it will probably fall apart. Are you running an IPFS node of your own? Maybe you and I can manually pin hashes for the time being.

(Apparently there's Git Large File Storage, but it's hardly trivial to set up either.)
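For what it's worth, the LFS side of the repo itself is small: running `git lfs track "*.jpg"` just writes filter rules into `.gitattributes`, along these lines (the file patterns here are examples, not what the repo currently tracks):

```
*.jpg filter=lfs diff=lfs merge=lfs -text
*.png filter=lfs diff=lfs merge=lfs -text
```

The non-trivial part is that everyone who clones also needs the `git-lfs` extension installed, or they get pointer files instead of images.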

RichardLitt commented 7 years ago

I'm not running a node; I'm a bit worried that only two copies may not be reliable enough.

How do normal companies do it? Isn't it just AWS? I guess that costs money - is that the main issue?

hackergrrl commented 7 years ago

Yeah, tossing everything into S3 would work. I'm usually fine with having only one or two 9s of uptime for my own things.

RichardLitt commented 7 years ago

Although, Netlify might make it easier than S3. We could just put all of the images in a submodule - we already have one for the theme, why not one for images?

https://www.netlify.com/blog/2015/03/06/netlify-vs-amazon-s3/
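If we go the submodule route, it would just be a second entry in `.gitmodules` next to the theme, something like this (the images repo name and URL below are hypothetical):

```
[submodule "images"]
	path = images
	url = https://github.com/RichardLitt/58-liters-images.git
```

Clones of the main repo stay small, since the images only come down when someone runs `git submodule update --init`.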

hackergrrl commented 7 years ago

Nice: that sounds much easier than mucking with S3 directly.