Closed: han16nah closed this issue 2 years ago.
Hmm, I thought the idea of this test was to check whether a remote server (i.e., not 127.0.0.1) can be used to run pytreedb. Or are we considering this covered sufficiently in the other CI pipeline? You will also want to point the CI to the 'normal' PyPI instead of test.pypi.org at some point.
@lwiniwar Shall we replace the test with one of the other server tests that we have in python-app.yml, e.g. https://github.com/3dgeo-heidelberg/pytreedb/blob/06b56c6732da1627f26cc75e0ac77da2bfc600f3/.github/workflows/python-app.yml#L78
And sure, once you approve, we can exchange the link so that publishing goes to the 'normal' PyPI.
I would say either (a) run all the tests that are also in the other CI, or (b) run no tests in this script and rely on the user not to push untested versions (or versions where the tests fail) to PyPI. Can we 'include' the other file here? Something like what is explained here, maybe: https://docs.github.com/en/actions/using-workflows/triggering-a-workflow#triggering-a-workflow-from-a-workflow
I agree. At the current stage it makes no sense, in my view, to run different tests for CI than for publishing. Including the existing tests would be a good option because then there is only one place to maintain. If this is not possible and a redundant test implementation would be required, I would opt for skipping the tests here, because CI should guarantee consistency anyway.
We could try something like:

```yaml
on:
  workflow_run:
    workflows: [Test with MongoDB]
    types: [completed]
    branches: [main]

jobs:
  build:
    runs-on: ubuntu-latest
    if: ${{ github.event.workflow_run.conclusion == 'success' }}
    steps:
      ...
```
(see: https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows#workflow_run)
This should only build if we push to main and if the tests in the other file have succeeded.
Further questions
We currently have a commit hash hardcoded in python-publish.yml. Would we simply update this commit hash in the .yml each time we want to release on PyPI? Is this better than pointing to a branch (e.g. main)? Or shall we use tags, e.g.
```yaml
- name: Publish package
  if: github.event_name == 'push' && startsWith(github.ref, 'refs/tags')
  uses: pypa/gh-action-pypi-publish@release/v1
  with:
    user: __token__
    password: ${{ secrets.PYPI_API_TOKEN }}
```

(taken from: https://github.com/marketplace/actions/pypi-publish)
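For the tag-based condition in that step to ever fire, the workflow would also need to be triggered on tag pushes. A minimal sketch of such a trigger section (the `v*` tag pattern is an assumption, to be adjusted to whatever tagging scheme the repo adopts):

```yaml
# Hypothetical trigger: run the workflow when a version tag is pushed.
on:
  push:
    tags:
      - "v*"  # assumed naming scheme, e.g. v1.2.3
```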
Other ways of triggering PyPI releases
In py4dgeo, releases to PyPI/Test PyPI are done manually (using a workflow_dispatch event) and based on some user input:
```yaml
on:
  workflow_dispatch:
    inputs:
      deploy_to_testpypi:
        description: "Whether the build should be deployed to test.pypi.org"
        required: true
        default: "false"
      deploy_to_pypi:
        description: "Whether the build should be deployed to pypi.org"
        required: true
        default: "true"
```
and then, later:
```yaml
upload_pypi:
  needs: [build-sdist, build-wheels, upload_testpypi]
  runs-on: ubuntu-20.04
  if: github.repository_owner == 'ssciwr'
  steps:
    - uses: actions/download-artifact@v2
      with:
        name: artifact
        path: dist
    - uses: pypa/gh-action-pypi-publish@master
      if: github.event.inputs.deploy_to_pypi == 'true'
      with:
        user: __token__
        password: ${{ secrets.PYPI_API_TOKEN }}
```
(see https://github.com/ssciwr/py4dgeo/blob/main/.github/workflows/pypi.yml). What do you think of this?
We could also run the workflow on a release event:

```yaml
on:
  release:
    types:
      - published
```
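Put together, a release-triggered publish workflow could look like the following minimal sketch. This is an illustration only: the workflow name, action versions, and the `PYPI_API_TOKEN` secret name are assumptions, not taken from the repository.

```yaml
# Hypothetical workflow: build and publish to PyPI when a GitHub release is published.
name: Publish on release

on:
  release:
    types: [published]

jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: "3.x"
      - name: Build distributions
        run: |
          python -m pip install build
          python -m build
      - name: Publish package
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          user: __token__
          password: ${{ secrets.PYPI_API_TOKEN }}
```

One advantage of the release trigger over a hardcoded commit hash is that the published version is exactly what the release points at, with no manual edit to the workflow file per release.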