diffpy / diffpy.pdffit2

real space structure refinement to atomic PDF

try docker release workflow #67

Open 8bitsam opened 1 month ago

8bitsam commented 1 month ago

Add a release.yml workflow file that should use Docker to release on PyPI. I also fixed a problem where the stage in pre-commit-config.yaml wasn't right.

8bitsam commented 1 month ago

This should build using GitHub's Docker setup, which might be easier than building our own for now.
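For reference, here is a minimal sketch of what such a release.yml could look like, assuming a cibuildwheel-based build (the trigger, action versions, and the `PYPI_API_TOKEN` secret name are illustrative assumptions, not necessarily what this PR contains); on Linux, cibuildwheel runs the builds inside the manylinux Docker images that GitHub-hosted runners can pull:

```yaml
name: Release

# Trigger and versions below are illustrative only.
on: push

jobs:
  publish-pypi:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"

      # Build manylinux wheels inside the official Docker images.
      - uses: pypa/cibuildwheel@v2.16.5
        with:
          output-dir: dist

      # Build the source distribution with the PEP 517 build frontend.
      - run: |
          python -m pip install build
          python -m build --sdist --outdir dist

      # Upload everything in dist/ to PyPI.
      - uses: pypa/gh-action-pypi-publish@release/v1
        with:
          password: ${{ secrets.PYPI_API_TOKEN }}  # hypothetical secret name
```

A Docker-backed wheel build is relevant here because pdffit2 ships a compiled extension, so the wheels have to be built per platform rather than as pure Python.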

Sparks29032 commented 1 month ago

Thanks Sam, this is great.

Please see the comment on the pre-commit config.

This looks good. @Sparks29032 please can you review?

I think we will need an action for building the changelog from the news. Is there anything else that is being handled by the bash scripts that should go here?

I think we discussed before whether we should build and deploy from actions and decided against it for some reason but I don't remember why. Andrew, do you remember? If this works for pdffit2, should we do all the releasing from here?

This seems to run on push. Does that mean a new release is made every time we merge into main? How do we specify the new release number?

The bash scripts will still need to be run to create the tag and upload to a GitHub release. This seems to only handle PyPI releases, correct? We would need to put the secrets (PyPI user and password) into each GitHub repository we want to release with this approach, which seems like a pain: updating the password on PyPI would then require changing the password secret on every one of those repositories. If we are able to, let's try to either add a workflow like this to the release scripts or make the Docker image ourselves.

I am still setting up my Ubuntu machine, but this seems to depend on setup.py. Can we replace python setup.py sdist bdist_wheel with some variation of python -m build?
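For concreteness, the swapped step could look something like this (a sketch of one possible step, not the literal diff in this PR):

```yaml
      # Replaces `python setup.py sdist bdist_wheel`: the PEP 517 build frontend
      # reads pyproject.toml/setup.py and writes both the sdist and the wheel to dist/.
      - name: Build distributions
        run: |
          python -m pip install --upgrade build
          python -m build
```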

Also, has this been tested? It seems we will need to make a dummy repository and branch that sends items to PyPI in order to test this workflow.

8bitsam commented 1 month ago

I switched it to use build, and I could also make a repo to test it on. We could change the on: push part to on: workflow_dispatch so it could be run from the CLI. I think that might be a better option since it is more controlled.

Also, yes, this only handles PyPI releases. I figured it is easier than making a whole Docker setup locally, since the GitHub Actions OS matrix seems like it should be able to handle that effectively.
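The trigger change mentioned above could look roughly like the following sketch (the `version` input is hypothetical, just to show how a release number might be passed in):

```yaml
on:
  workflow_dispatch:
    inputs:
      version:
        description: "Version to release (hypothetical input)"
        required: false
```

A run could then be started from the command line with `gh workflow run release.yml -f version=1.2.3`, or from the Actions tab in the web UI.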

sbillinge commented 1 month ago

> I switched it to use build, and I could also make a repo to test it on. We could change the on: push part to on: workflow_dispatch so it could be run from the CLI. I think that might be a better option since it is more controlled.
>
> Also, yes, this only handles PyPI releases. I figured it is easier than making a whole Docker setup locally, since the GitHub Actions OS matrix seems like it should be able to handle that effectively.

Don't we want the GitHub release payload to also have all the wheels though? Does this actually solve our problem?

I am OK with a GitHub workflow approach in general, but only if it is the best solution.

I don't think we want to keep PyPI secrets in the repo. Isn't this a security issue?

I didn't spend a lot of time with it, but it seems that actions/upload-artifact is based on actions/artifact, which maybe builds the artifacts but doesn't upload them? That would seem to be a better match for what we want. It could even run on push if it just builds the wheels but doesn't do the release on push. Then, if our current release workflow knows where to find the pre-built wheels, it wouldn't try to build them, but would put them into a payload for GitHub and then for PyPI. Something like that could possibly be attractive.
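To make that idea concrete, a build-only workflow could run on every push and just stash the wheels as workflow artifacts, leaving the actual upload to whatever release machinery we keep (a sketch under those assumptions; action versions are illustrative):

```yaml
on: push

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: |
          python -m pip install build
          python -m build
      # Store the built files with the workflow run; nothing is published here.
      - uses: actions/upload-artifact@v4
        with:
          name: dist
          path: dist/
```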

8bitsam commented 1 month ago

> Don't we want the GitHub release payload to also have all the wheels though? Does this actually solve our problem?
>
> I am OK with a GitHub workflow approach in general, but only if it is the best solution.

After doing some research, it seems like this is probably the most efficient way to do the releases. I changed the workflow so it can only be triggered manually; that way we can choose exactly when to release. I also added a second job that publishes the release to PyPI.
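For reference, the overall shape would be something like the sketch below: a manually triggered workflow whose second job publishes only after the build job succeeds (action versions and the `PYPI_API_TOKEN` secret name are assumptions, not the literal contents of this PR):

```yaml
name: Release (manual)

on: workflow_dispatch        # run only when triggered by hand

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: python -m pip install build && python -m build
      - uses: actions/upload-artifact@v4
        with:
          name: dist
          path: dist/

  publish:
    needs: build             # second job: only runs after the build succeeds
    runs-on: ubuntu-latest
    steps:
      # Fetch the wheels/sdist stored by the build job.
      - uses: actions/download-artifact@v4
        with:
          name: dist
          path: dist/
      # Upload everything in dist/ to PyPI.
      - uses: pypa/gh-action-pypi-publish@release/v1
        with:
          password: ${{ secrets.PYPI_API_TOKEN }}  # hypothetical secret name
```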

> I don't think we want to keep PyPI secrets in the repo. Isn't this a security issue?

The secrets shouldn't be a security issue if the proper workflow for setting them up is followed here.
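As a side note (not something claimed to be in this PR): if we ever want to avoid storing a PyPI token in repository secrets at all, pypa/gh-action-pypi-publish also supports PyPI's trusted publishing, where the workflow is registered as a trusted publisher on the PyPI side and authenticates via OIDC instead of a stored secret. The publish job sketched above would then look roughly like this:

```yaml
  publish:
    needs: build
    runs-on: ubuntu-latest
    permissions:
      id-token: write   # lets PyPI verify this workflow via OIDC; no stored token needed
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: dist
          path: dist/
      - uses: pypa/gh-action-pypi-publish@release/v1  # no password input with trusted publishing
```

Otherwise, an organization-level secret would at least mean the token only has to be rotated in one place rather than in every repository.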

I think we can also have this workflow create a new tag for us with the GitHub deploy action, so we don't need to use the bash script.
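One possible way to do that (a sketch using one commonly used community action, not necessarily the one we'd pick) would be an extra step in the publish job, which would also need `contents: write` permission; the tag name here assumes a hypothetical workflow_dispatch input:

```yaml
      # Create the git tag and GitHub release and attach the built archives.
      - uses: softprops/action-gh-release@v1
        with:
          tag_name: ${{ github.event.inputs.version }}  # hypothetical input
          files: dist/*
```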

8bitsam commented 1 month ago

> I didn't spend a lot of time with it, but it seems that actions/upload-artifact is based on actions/artifact, which maybe builds the artifacts but doesn't upload them? That would seem to be a better match for what we want. It could even run on push if it just builds the wheels but doesn't do the release on push. Then, if our current release workflow knows where to find the pre-built wheels, it wouldn't try to build them, but would put them into a payload for GitHub and then for PyPI. Something like that could possibly be attractive.

I think that by using upload-artifact we can store the wheel files, whereas just using artifacts would not allow us to do that. This way we can also handle the actual publishing to PyPI in the same workflow.

Another thing we could do is make a similar workflow to release on conda-forge; that way all the releasing could be handled by two GitHub workflows.

sbillinge commented 1 month ago

@Sparks29032 please chime in. I know we discussed this before and chose not to do it this way for some reason.

Also, I don't see the news workflow in the current action. Is that possible if we switch to GH Actions? Let's keep the convo going. We worked very hard to get the shell scripts working, and they are working, so there has to be a significant benefit to justify the work of switching. I expect this work would fall to Sam...

Sparks29032 commented 1 month ago

I am a bit biased toward scripts, as we can be more expressive in Python. With these workflows we would have to update every package whenever we want a change in behavior, whereas the release repository keeps things centralized.

We will end up having to use scripts anyway for the news, automatic version control, and building the API documentation.

The conda-forge workflow is a bit different from the PyPI one, since the only thing uploaded is the meta.yaml, but there may be workflows for that as well.

Let's also test that this upload pathway works with a dummy repository. I have one repo set up already, but my computer is still not fully set up.

sbillinge commented 1 month ago

Let's not pursue this further for doing releases at this point. We worked hard to get things automated (that was the goal of this entire project), so I am reluctant to change to a different way of doing it unless it is absolutely clear that it is superior. If, after a couple of years, we find issues with the bash scripts, we can revisit this then.

If there are useful matrix workflows for building wheels on different platforms that we could then reuse in the scripts, that would seem to be of some value. We need to sort out the snake-nest for the getx3 releases in any case, so I guess if we figure this out here, we can reuse a lot of that work for the getx3 releases.
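For the record, the usual shape of such a matrix wheel build is roughly the following (the OS list and action versions are placeholders); the uploaded artifacts could then be pulled down and fed to the existing release scripts rather than published from the action itself:

```yaml
jobs:
  build-wheels:
    strategy:
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      # cibuildwheel builds wheels for each platform; on Linux it runs the
      # builds inside the manylinux Docker images.
      - uses: pypa/cibuildwheel@v2.16.5
      - uses: actions/upload-artifact@v4
        with:
          name: wheels-${{ matrix.os }}
          path: wheelhouse/*.whl
```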