Closed: robzienert closed this 4 years ago
So I was actually thinking about doing this; the reason I hadn't yet was primarily because I like the 'portability' of a single bash script vs. having to download multiple separate files. Not sure what the 'correct' path is here, tbh.
Oh, I see. You basically `wget` the `all.sh` script from a repo onto an instance and run it.
An option: we could set this repo up with a GitHub Action that, on every commit, creates a new GitHub release (doesn't need to be semver or anything, of course; it could just be the SHA) that tars everything up. As far as an installation process goes, it'd just be a `wget` from a different URL and a `tar -xzf`. WDYT?
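To make the tarball idea concrete, here's a hedged sketch of what that install flow could look like. The repo slug, SHA tag, and tarball name are all placeholders, since the actual URL would depend on how the GitHub Action names its release artifacts:

```shell
# Sketch of the proposed install flow once a GitHub Action publishes a
# tarball release per commit. REPO, SHA, and the tarball name are
# placeholders -- the real URL depends on the Action's artifact naming.
REPO="example-org/example-repo"   # placeholder repo slug
SHA="abcdef0"                     # placeholder release tag (the commit SHA)
TARBALL_URL="https://github.com/${REPO}/releases/download/${SHA}/release.tar.gz"
echo "${TARBALL_URL}"

# On the target VM, the whole install would then be:
#   wget -O release.tar.gz "${TARBALL_URL}"
#   tar -xzf release.tar.gz
#   ./all.sh
```

The nice property is that the VM needs only `wget` and `tar`, not `git`.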
Another option, depending on your appetite for exec'ing from `curl`... we could `curl` an installer script (which would be `all.sh` in this case) that then downloads everything you need in-process.
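For comparison, a hedged sketch of the "exec from curl" pattern; the raw-file URL here is a placeholder:

```shell
# Sketch of the curl-an-installer option: fetch the installer (all.sh here)
# and pipe it straight to bash; the script itself then downloads everything
# else it needs in-process. The URL is a placeholder.
INSTALLER_URL="https://raw.githubusercontent.com/example-org/example-repo/master/all.sh"
INSTALL_CMD="curl -fsSL ${INSTALLER_URL} | bash"
echo "${INSTALL_CMD}"
# Note: piping curl to bash trades auditability for convenience; pointing the
# URL at a commit SHA rather than a branch mitigates the moving-target risk.
```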
I like the idea of something self-contained (the eventual goal is an airgapped-ish environment, although that's an order of magnitude harder). I do kind of like the idea of the tarball; you're thinking extract the tarball and run the script?
Yeah, wouldn't be much to do. If you're cool with that pattern, I'll open a separate PR to do this flow before this is merged?
Yeah - that sounds really cool. Let me know if you need help with this!
Let's 🚢 this PR and create a new PR for the GitHub Action to create a tarball as a release. I really like that idea, since you don't need git on the VM you are deploying to.
Okay, merged the target branch of this PR into master, so we'll keep the `d4m` branch as a working branch (and will merge this PR).
I'm looking to make some changes to better support D4M and just wanted to get some baseline stuff in place:

- Make `all.sh` executable in git, rather than forcing end-users to `chmod +x` it.
- Split `all.sh` into actual files that are just copied into place.

The reorganization just helps me compartmentalize what I have in my head a little better. :)