oleeskild / digitalgarden


Overcoming Vercel daily 100 deployment limit #170

Open ransurf opened 1 year ago

ransurf commented 1 year ago

I still have yet to run it; I'll leave another comment once I do. Edit: I've gotten it to work consistently for a while without any issues; an updated version is in the comments below.

I just wanted to share a workaround I'm trying out that uses GitHub Actions to hold off deployment until midnight each day, since the one-commit-per-file publishing workflow has already pushed me over the limit once.

I created a new "master" branch, set it as the deployment branch in the Vercel settings, and then disabled auto-deployment of main with the following configuration:

vercel.json

{
  "git": {
    "deploymentEnabled": {
      "main": false
    }
  },
  "outputDirectory": "dist",
  "installCommand": "npm install",
  "buildCommand": "npm run build",
  "devCommand": "npm run start",
  "routes": [
    { "handle": "filesystem" },
    { "src": "/(.*)", "status": 404, "dest": "/404" }
  ]
}
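
For completeness, the "master" deployment branch only needs to be created and pushed once. A minimal sketch, assuming your default branch is named main and that you then switch the Production Branch to master in the Vercel project settings:

# create master from the current tip of main and publish it
git checkout main
git pull origin main
git checkout -b master
git push -u origin master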

Here is the GitHub Actions workflow I'm using as well.

.github/workflows/auto_merge.yml

name: 'Daily auto-merge'

on:
  schedule:
    # * is a special character in YAML so you have to quote this string
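    # note: GitHub Actions evaluates cron schedules in UTC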
    - cron:  '0 0 * * *'

jobs:
  merge:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v2
        with:
          fetch-depth: 0

      - name: Setup Git
        run: |
          git config --global user.name 'github-actions'
          git config --global user.email 'github-actions@github.com'

      - name: Merge and push
        run: |
          git fetch origin master:master
          git checkout master
          git merge main --no-ff -m "Merge main to master"
          git push origin master
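
To sanity-check the merge before trusting the schedule, roughly the same sequence can be run locally (a sketch assuming you have push access and both branches already exist on the remote):

git fetch origin
git checkout -B master origin/master
git merge origin/main --no-ff -m "Merge main to master"
git push origin master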

adrianghnguyen commented 1 year ago

@ransurf Have you had success with this method yet?

ransurf commented 1 year ago

Yes, I had to update the gist and add a new MY_GITHUB_TOKEN secret to the repository (a personal access token the workflow uses to push).
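
If you prefer the command line, the secret can also be added with the GitHub CLI (a sketch assuming gh is installed and MY_GITHUB_TOKEN is a personal access token with push access to the repository):

# store the token as a repository secret that the workflow can read
gh secret set MY_GITHUB_TOKEN --body "<your personal access token>"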

Here is the new gist; be sure to change the placeholders surrounded by ** on the last line to your own values.

name: 'Daily auto-merge'

on:
  schedule:
    # * is a special character in YAML so you have to quote this string
    # daily
    # - cron:  '0 0 * * *'
    # hourly at xx:50 time intervals, replace 50 with your desired minute
    - cron:  '50 * * * *'

jobs:
  merge:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v3
        with:
          token: ${{ secrets.MY_GITHUB_TOKEN }}
          fetch-depth: 0

      - name: Setup Git
        run: |
          git config --global user.name 'john doe'
          git config --global user.email 'example_email@gmail.com'

      - name: Merge and push
        run: |
          git fetch origin master:master
          git checkout master
          git merge main --no-ff -m "Merge main to master"
          git push https://${{ secrets.MY_GITHUB_TOKEN }}@github.com/**your_github_user**/**your_repo_name**.git master

wheresalice commented 1 year ago

I really like this, but isn't the vercel.json file going to be overwritten the next time you generate a PR to update to the latest template?

The only times I've hit this limit have been when doing some major reorganizing of my folder layouts, in which case I've published all the changes at once and expected a single build to kick off, not hundreds. I wonder if there's a way of optimizing that as an alternative approach to this.

ransurf commented 1 year ago

> I really like this, but isn't the vercel.json file going to be overwritten the next time you generate a PR to update to the latest template?
>
> The only times I've hit this limit have been when doing some major reorganizing of my folder layouts, in which case I've published all the changes at once and expected a single build to kick off, not hundreds. I wonder if there's a way of optimizing that as an alternative approach to this.

I will personally just stash the changes, and I'm not even 100% sure it would be overwritten anyway.

The problem is that each file gets its own commit, and every update to the branch triggers a new build. By having your commits land on a separate branch, with a controlled merge (and build) every x interval as this workflow suggests, you are able to overcome it.

I'm assuming that changing the plugin's update methodology to batch all file updates into a single commit would be quite the rework, and I think this is a better alternative (since it may not even be necessary for everyone).

wheresalice commented 1 year ago

Yeah fair point - I'm just thinking about making this more user-friendly. An option in the plugin to push commits to a different branch would go a long way towards that, and doesn't sound too difficult.

But maybe users of this approach are already comfortable enough with applying manual changes, as you have.

dipanjal commented 4 months ago

Just go for manual deployment. It gives you a lot more control and is easier.
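
For anyone taking this route, a production deployment can be kicked off on demand with the Vercel CLI. A minimal sketch, assuming the CLI is installed and the project has been linked to your Vercel account:

npm install -g vercel   # install the Vercel CLI
vercel login            # authenticate once
vercel --prod           # build and deploy the current directory to production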