actions / deploy-pages

GitHub Action to publish artifacts to GitHub Pages for deployments
https://pages.github.com

Deploy website to custom url route #73

Closed willeppy closed 1 year ago

willeppy commented 1 year ago

I am building a site for my repo using a GitHub Actions workflow. To deploy the website, I run the action below. How can I change the URL where this site is deployed?

Specifically, I would like to add a route to the end so that instead of deploying to example.com/{repo_name} it deploys to example.com/{repo_name}/notebook. The URL seems to be set by ${{steps.deployment.outputs.page_url}}. I tried appending notebook to it, i.e. ${{steps.deployment.outputs.page_url}}notebook, with no luck.

Here is the deploy job that runs after build in my workflow YAML:

  deploy:
    needs: build
    if: github.ref == 'refs/heads/main'
    permissions:
      pages: write
      id-token: write

    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }} # How can I change this URL??

    runs-on: ubuntu-latest
    steps:
      - name: Deploy to GitHub Pages
        id: deployment
        uses: actions/deploy-pages@v1

It seems this is not supported out of the box right now, but it would be a very useful feature in the future.

yoannchaudet commented 1 year ago

👋 Just to confirm, this is not supported today. Deployments are "atomic operations" currently. We may expand on this in the future, but there is no concrete plan at the moment.

Your best bet today would be to leverage our path-based routing mechanism.

The root of your website can be tracked in a <username>.github.io repository. Anything you want to live at /<path> on that site just needs to go in a repository named after the path; e.g. a notebook repository would publish content under <username>.github.io/notebook. If you have a custom domain set for your <username>.github.io repository, this will work accordingly.
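
To illustrate, a minimal workflow in a hypothetical notebook repository could look like the sketch below. The repository name, the ./public output directory, and the trigger branch are placeholder assumptions; once deployed, the content would be served under <username>.github.io/notebook.

# .github/workflows/pages.yml in the hypothetical "notebook" repository
name: Deploy notebook to Pages

on:
  push:
    branches: [main]

permissions:
  contents: read
  pages: write
  id-token: write

jobs:
  deploy:
    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      # Build the site into ./public here (static site generator, notebook export, etc.)

      - name: Upload site artifact
        uses: actions/upload-pages-artifact@v1
        with:
          path: public/

      - name: Deploy to GitHub Pages
        id: deployment
        uses: actions/deploy-pages@v1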

briantist commented 1 year ago

Adding another voice to this request.

I have branch-based workflows now and want to publish partial changes: basically, I want an "atomic" update but to a specific directory, while not affecting other directories.

One way support for this could be added without changing the publish process is to give us a way to download the entire currently published site as an archive.

That could allow us to handle this in the "build" portion of the workflow, and leave the publish process unchanged.

It might look something like this:

jobs:
  # Build job
  build:
    permissions:
      pages: read  # ???
      contents: read
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Pull existing site
        id: read_pages
        uses: actions/read-pages@v1  # ???

      - name: Extract existing site
        run: |
          mkdir -p site
          tar -xvf "${{ steps.read_pages.outputs.archive }}" -C site/

      - name: Build process
        working-directory: site
        run: |
          # do build process to modify just what needs to be changed

      # Upload artifact of whole site
      - name: Upload site artifact
        uses: actions/upload-pages-artifact@v1
        with:
          path: site/

  # Deploy job (copied directly from README, no modification here)
  deploy:
    # Add a dependency to the build job
    needs: build

    # Grant GITHUB_TOKEN the permissions required to make a Pages deployment
    permissions:
      pages: write      # to deploy to Pages
      id-token: write   # to verify the deployment originates from an appropriate source

    # Deploy to the github-pages environment
    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}

    # Specify runner + deployment step
    runs-on: ubuntu-latest
    steps:
      - name: Deploy to GitHub Pages
        id: deployment
        uses: actions/deploy-pages@v1

JamesMGreene commented 1 year ago

@briantist As Yoann mentioned, incremental deployments are not a planned feature at the moment for GitHub Pages. There was some notion of this in the past but it was a source of security vulnerabilities exploited by bad actors.

As for a recommended workaround, I might suggest having your "build" workflow operate on your default branch (e.g. main), but then push its built changes into a second branch/tag/Release to maintain the assets for the published site. Resolving any necessary merging of assets would be up to you. You could either pull the previously published assets into your build workspace before or after running the build of your latest code. 🤷🏻‍♂️
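
A rough sketch of that workaround, assuming a pre-existing site-assets branch that holds the published content (the branch name, build command, and paths are all placeholders, not anything the Pages team prescribes), might be:

jobs:
  build:
    runs-on: ubuntu-latest
    permissions:
      contents: write  # needed so GITHUB_TOKEN can push to the assets branch
    steps:
      - uses: actions/checkout@v3

      # Pull the previously published assets from the dedicated branch
      - uses: actions/checkout@v3
        with:
          ref: site-assets
          path: site

      # Build only what changed, writing output into ./site (placeholder command)
      - name: Build
        run: ./build.sh --out site

      # Commit the merged assets back to the assets branch
      - name: Push merged assets
        working-directory: site
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git add -A
          git commit -m "Update published assets" || echo "Nothing to commit"
          git push origin HEAD:site-assets

The site-assets branch can then feed whatever Pages publishing mechanism you prefer, whether classic branch publishing or a separate upload-pages-artifact/deploy-pages job that checks it out.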

briantist commented 1 year ago

Hey @JamesMGreene, this is currently what I do. The exciting thing about this newer actions-based deployment model is not having to commit and push, which requires permissions I don't want and produces commit history I don't need. Pushing the assets into a branch seems to be re-implementing the current gh-pages-based workflow.

Anyway, while I understand that incremental deployments are not a planned feature, the workaround I suggested does not require you to implement any special support for incremental deploys at all; it only requires one thing: let us download the full contents of the currently published site.

That would let us access the previously published assets and merge them as needed (or do whatever with them).

Where would be the best place to request a very simple action to download the site contents?

- name: Pull existing site
  id: read_pages
  uses: actions/read-pages@v1  # ???

tschaub commented 1 year ago

I also came here looking for a place to ask about a download-pages-artifact type action to complement the upload-pages-artifact action. I think this is the same as @briantist's proposed solution – allowing users to decide how to merge in new content in their build step.

Is there a way to get all the content of a current pages deployment?

JamesMGreene commented 1 year ago

@briantist @tschaub

> Is there a way to get all the content of a current pages deployment?

There is not currently a way to retrieve or reconstruct a Pages site's artifact after it has been deployed (or at least not once the artifact's retention period has expired).

💡 It is an interesting idea, though! Definitely open to considering adding backend support and a separate Action for it in the future, but it's not something we can prioritize right now. 😕

If you wanted to work around it for now (without using a gh-pages style branch 😓), you could probably also create a less performant option by:

  1. Including some sort of manifest file (JSON, YAML, XML, whatever) cataloguing all of the files to be included in your Pages artifact, and then including that file in the artifact as well
  2. Creating an Action or script to subsequently download that manifest file, and then download each of the catalogued assets (plus creating directory structures, etc.)

It's not pretty, but just wanted to pitch the idea in case it helps. 🤞🏻 🤷🏻
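
As a very rough sketch of those two pieces, the first step would live in the job that builds the artifact and the second in a later workflow that wants the previous content. The manifest.txt name, the _site build directory, and the SITE_URL value are all placeholder assumptions:

      # 1. At build time: catalogue every file in the artifact and ship the manifest too
      - name: Generate manifest
        working-directory: _site
        run: find . -type f | sed 's|^\./||' > manifest.txt

      # 2. Later: fetch the manifest from the live site, then download each catalogued file
      - name: Reconstruct published site
        env:
          SITE_URL: https://<username>.github.io/<repo_name>  # placeholder
        run: |
          mkdir -p previous-site
          cd previous-site
          curl -fsSL "$SITE_URL/manifest.txt" -o manifest.txt
          while read -r path; do
            mkdir -p "$(dirname "$path")"
            curl -fsSL "$SITE_URL/$path" -o "$path"
          done < manifest.txt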

JamesMGreene commented 1 year ago

FWIW, I've added an item to the Pages team backlog to consider this in the future. No promises that it will happen or any forecasted timeline, though. ❤️

JamesMGreene commented 1 year ago

Related: