🛑DEPRECATED🛑 - Repo no longer being maintained #46

Closed · jeremyzilar closed this 2 years ago
We currently have around 40,500 images in DigitalGov, and they will need to live somewhere.

Update — digitalgov.gov currently has an s3 bucket through sites. We need to explore being able to use that bucket. (@JJediny)

Options:

- Cloud.gov S3 + CDN: https://cloud.gov/docs/services/
- Git Large File Storage: https://git-lfs.github.com/

How will we get our images to this CDN or S3? Probably via Gulp (see the sketch below), though we are open to other options.

In the end, we will need a predictable path for hosting these images that we can bake into our migration scripts, and we will need a path for editors to be able to add images to the site.
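As a rough sketch of the "probably via Gulp" idea (the bucket name, globs, and paths below are placeholder assumptions, not settings from this repo), a task like this could push images into the S3 bucket with the AWS SDK:

```ts
// gulpfile.ts sketch: "digitalgov-images" and the globs are placeholders.
import { src } from "gulp";
import { S3 } from "aws-sdk";
import * as path from "path";

const s3 = new S3(); // picks up credentials from the environment
const BUCKET = "digitalgov-images"; // hypothetical bucket name

// Upload every image under static/img, keeping its relative path as the
// S3 key so the hosted URL stays predictable for the migration scripts.
export function publishImages() {
  return src("static/img/**/*.{png,jpg,gif,svg}").on("data", (file: any) => {
    const key = path.relative(file.base, file.path).split(path.sep).join("/");
    s3.upload({ Bucket: BUCKET, Key: key, Body: file.contents as Buffer })
      .promise()
      .then(() => console.log(`uploaded ${key}`))
      .catch((err: Error) => console.error(`failed ${key}:`, err));
  });
}
```

Running `npx gulp publishImages` would mirror the images up; the same `upload` call can also set a `CacheControl` header if the bucket ends up fronted by a CDN.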
Ok, it looks like the point of Git LFS is to keep the repo from carrying multiple revisions of a large file. So if we have 40k images that we need for our site, those would still be in the repo and would need to be checked out each time there is a build. In short, Git LFS is not a CDN. https://about.gitlab.com/2017/01/30/getting-started-with-git-lfs-tutorial/
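For context, LFS tracking is configured per file pattern in `.gitattributes`; the patterns here are illustrative, not this repo's actual config:

```
# Store matching files in LFS instead of the Git object database
*.png filter=lfs diff=lfs merge=lfs -text
*.jpg filter=lfs diff=lfs merge=lfs -text
```

Even so, a fresh clone or CI build still downloads the current copy of every tracked image, which is the concern above.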
How do images work right now between GH and FED? Are they copied over into a FED AWS bucket each time we spin up a new instance? I guess I never really thought about this!

Everything currently gets unpacked from the repo and stuffed into an S3 bucket. So I understand that to mean that ALL the images would need to be in the repo and would have to be unpacked.

That's a lot of files moving back and forth for each build!