PixelogicDev / gruveebackend

This is the Backend work for Gruvee
MIT License

[Enhancement] Create Deploy Script for Functions #45

Closed adilanchian closed 4 years ago

adilanchian commented 4 years ago

Deploying functions manually blows... There are over 10, and the more we add, the more difficult it will become. I think creating a Python script like the one below could be really useful:

https://stackoverflow.com/questions/54237969/deploy-multiple-functions-to-google-cloud-functions-in-the-terminal
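
For illustration, a minimal sketch of what such a script could boil down to, written here as a shell loop rather than Python; the cmd/<function> layout, runtime, and gcloud flags are assumptions, not the repo's actual deploy commands:

```sh
#!/usr/bin/env bash
# Hypothetical sketch: deploy every function under cmd/ in one pass.
# Folder layout, runtime, and trigger flags are assumptions.
set -euo pipefail

for dir in cmd/*/; do
  fn="$(basename "$dir")"
  echo "Deploying ${fn} from ${dir}..."
  gcloud functions deploy "$fn" \
    --source "$dir" \
    --runtime go113 \
    --trigger-http
done
```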

brettski commented 4 years ago

As mentioned on the stream, some thoughts around using GitHub Actions. I guess what will be helpful is to see your current deploy requirements, e.g. you mentioned tagging.

The THAT Conference open source projects are all using Actions now and many run out of GCP. We used to deploy through Firebase, though now everything goes through GCP. I am still a little unclear on whether there are differences there.

An example of using Actions on a multi-function project: https://github.com/ThatConference/that-api-functions/tree/master/.github/workflows. A YAML file is used on PR to run linting, tests, etc. It could push to staging at this point if required. On push to master, it runs tests and linting again and deploys to prod. Note this is also triggered per function, so only function paths which have changed are run.

Please define your deployment requirements and we can see how well it will fit into an Actions workflow.

adilanchian commented 4 years ago

This is a good call out here. Let me describe the flow we go through for deployment:

  1. Make changes to some function
  2. Rev the .version file
  3. Create a PR to merge into master
  4. Once merged into master, we need to tag each new rev'ed function with the proper version number
  5. Push tags to origin master
  6. Run deployment via cmd (all functions have a .deployment file for the exact command; steps 4-6 are sketched below)
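
For reference, steps 4-6 might look roughly like this on the command line; the tag format follows the examples later in this thread, and treating .deployment as a runnable command is an assumption:

```sh
# Hypothetical walk-through of steps 4-6 for one rev'ed function.
git tag cmd/functionA/v1.0.0-beta.2   # step 4: tag the function with its new version
git push origin master --tags         # step 5: push tags to origin master

# step 6: run the exact command from the function's .deployment file, e.g.
sh ./cmd/functionA/.deployment        # assumption: the file holds a runnable gcloud command
```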

So I think we should be good to go with this sort of scenario for Actions? Let me know your thoughts.

brettski commented 4 years ago

What I have started:

  1. trigger on merge to master
  2. check out code
  3. grab version from .version file (this took a lot of fiddling from bash 🙂; sketched below)
  4. write a new tag based on version on HEAD
  5. commit tags
  6. deploy to gcp

I stopped at step #3 tonight.
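
For illustration, steps 3-5 could boil down to shell along these lines; the .version location and tag prefix are assumptions based on this thread:

```sh
# Hypothetical sketch of steps 3-5 for one function folder (paths assumed).
FUNCTION_DIR="cmd/appleauth"

# step 3: grab the version from the function's .version file
VERSION="$(tr -d '[:space:]' < "${FUNCTION_DIR}/.version")"

# step 4: write a new tag based on that version at HEAD
TAG="${FUNCTION_DIR}/v${VERSION}"
git tag "$TAG"

# step 5: push the new tag back to the repo
git push origin "$TAG"
```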

We'll need a GCP service account set up which has permissions to push (CRUD) to GCP functions. If you're unfamiliar we can walk through it together so we have least privilege on the account. I think all it needs is "Cloud Build Editor" and "Cloud Functions Developer", though I need to double check.
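
For reference, creating such a service account with those two roles could look like the following; the account name and project id are placeholders, and whether GCLOUD_AUTH wants the key base64 encoded is an assumption:

```sh
# Hypothetical sketch: least-privilege deploy service account (names are placeholders).
PROJECT_ID="my-gcp-project"

gcloud iam service-accounts create github-deployer \
  --project "$PROJECT_ID" \
  --display-name "GitHub Actions deployer"

# Grant the two roles mentioned above.
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member "serviceAccount:github-deployer@${PROJECT_ID}.iam.gserviceaccount.com" \
  --role "roles/cloudbuild.builds.editor"
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member "serviceAccount:github-deployer@${PROJECT_ID}.iam.gserviceaccount.com" \
  --role "roles/cloudfunctions.developer"

# Create a key to store (base64 encoded) in the GCLOUD_AUTH secret.
gcloud iam service-accounts keys create key.json \
  --iam-account "github-deployer@${PROJECT_ID}.iam.gserviceaccount.com"
base64 key.json
```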

Once this service account is created, its key needs to be added to GitHub Secrets as GCLOUD_AUTH, along with another variable for the GCP project id as GCP_PROJECT_ID.

If you don't want to use GitHub Secrets, there are GCP secrets, and I can explain a workflow for that route if you prefer it.

We'll need to figure out what to do with ../../internal/config.yaml as well. One solution is to base64 encode the file and put it in a secret. The secret can be written out when used and then destroyed when the deploy is over. As long as we're not creating a downloadable package this will be safe, and none of the values will show in logs.
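
A minimal sketch of that round trip, assuming the secret is named CONFIG_YAML (a made-up name) and the deploy step runs shell:

```sh
# One time, locally: encode the config file and paste the output into a GitHub
# secret (CONFIG_YAML is a hypothetical name).
base64 internal/config.yaml

# In the deploy step: write the secret back out, deploy, then destroy it.
echo "$CONFIG_YAML" | base64 --decode > internal/config.yaml
# ... run the gcloud deploy here ...
rm internal/config.yaml
```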

brettski commented 4 years ago

@adilanchian I see a potential issue with creating a tag on merge to master. Since this is a mono-repo for all of the functions (which is fine), if two or more functions are updated in that commit, there will be multiple tags on the commit, one for each of the updated functions. Will this create an issue with Go's module system?

adilanchian commented 4 years ago

@brettski

Hey good call out here! Let me explain a bit about how the current flow works:

Since this is a mono repo and sometimes we do update multiple functions at a time, I do currently tag the same commit with multiple tags.

This is something I have been battling with for a bit, given the nature of wanting a monorepo, Firebase functions, and Golang haha.

So when I update Function A and Function B they are tagged the following way in master:

cmd/functionA/v1.0.0-beta.1
cmd/functionB/v1.0.0-beta.3
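
For illustration, both tags can sit on the same merge commit without clashing, since each one carries its own path prefix (the versions here are just the ones above):

```sh
# Both tags point at the same merge commit on master.
git tag cmd/functionA/v1.0.0-beta.1
git tag cmd/functionB/v1.0.0-beta.3
git push origin --tags

# Listing the tags on HEAD shows both.
git tag --points-at HEAD
```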

Definitely open to ideas/thoughts around structuring and such. This was what I came up with, being my first time working with Golang and Firebase functions!

brettski commented 4 years ago

Based on how this is being structured, a tag will be created (based on the .version file) for each function folder which contains changes in the commit. So we should be good there.
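
As a rough sketch, detecting which function folders changed in the merge commit could look something like this (assuming the cmd/<function> layout; the actual workflow may rely on path filters instead):

```sh
# Hypothetical: list the cmd/<function> folders touched by the last commit,
# so tagging/deploying only happens for functions that actually changed.
git diff --name-only HEAD~1 HEAD \
  | grep '^cmd/' \
  | cut -d/ -f1-2 \
  | sort -u
```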

brettski commented 4 years ago

Good progress tonight. The POC file I have been working on can be read here: https://github.com/brettski/gruveebackend/blob/master/.github/workflows/appleauth_pushMaster.yml

This file is for the appleauth function only. There will be a file for each function. This separation keeps things clean and allows multiple tags per commit based on the updated function(s) in the commit. This deploy file is pretty easy to copy for other functions; there is a name update in only three places (line 1, line 8, and line 17).

Secrets needed (set in GitHub secrets): GCLOUD_AUTH and GCP_PROJECT_ID, as described above.

My next step with this is testing the actual deployments into GCP, which I'll do against my account. I ran out of steam tonight. There are a few changes which will be made when complete to point to the PixelogicDev repo with cURL.

If you don't squash your commits when merging a PR this can cause some double tagging of commits if there is a function change in two different commits in the same PR. I am assuming this may cause some issues or not tag the commits as you expect.

Oh and we should post to Discord on deploy too.
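
If that goes through a Discord webhook, it could be as small as a curl call at the end of the deploy job; DISCORD_WEBHOOK_URL is a made-up secret name and the message is just an example:

```sh
# Hypothetical: announce the deploy via a Discord incoming webhook.
curl -sS -X POST "$DISCORD_WEBHOOK_URL" \
  -H "Content-Type: application/json" \
  -d '{"content": "Deployed appleauth v1.0.0-beta.2 to GCP"}'
```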

adilanchian commented 4 years ago

@brettski this is freaking awesome. I'm really excited for this!

So we need to add these secrets in GitHub secrets in order to set this up? Also - do we have to create an action for every function that we create, or will this be generic enough that we don't have to worry about that?

Really appreciate all your work on this. Great contribution!

brettski commented 4 years ago

The GitHub secrets will be required when this stuff goes live. It will be a few days before I am ready; I still want to do some more verifications.

I haven't looked through the functions in detail, but from what I have seen it can be copied to a new file with the three line items updated (line 1, line 8, and line 17) for the function path. That's the goal anyway :)

adilanchian commented 4 years ago

Cool ya no worries there! I can get that set up.

I was wondering if, with these changes, you could also update the README with the flow of how this will work. Want to make sure we keep that documented as well.

This is going to rock

brettski commented 4 years ago

@adilanchian Here is how GitHub Actions redacts secrets in the logs. I explicitly cat the file to stdout and all that came back are *** values. Makes it pretty difficult to leak secrets. There are no build artifacts, etc. being saved, so there is nothing available to download either.

(screenshot: Actions log output with the secret values redacted as ***)

adilanchian commented 4 years ago

Awesome looks great man! Keep it up \m/

brettski commented 4 years ago

In the home stretch here. A little clean up and multi-commit tests and I'll submit my PR (probably over the weekend).

One thought I had: perhaps we should roll this out on a few of the functions, not all of them. What are your thoughts on this? If you agree, provide me with a few functions you want included in the initial PR.

One additional question, where in the README would you like the function tagging and deployment details? New section?

Brett

adilanchian commented 4 years ago

TODO: Add secret keys in Github before closing :)

adilanchian commented 4 years ago

@all-contributors please add @brettski for code and documentation

allcontributors[bot] commented 4 years ago

@adilanchian

I've put up a pull request to add @brettski! :tada:

adilanchian commented 4 years ago

FINALLY added the secrets :)

Closing this out. Thanks again!