python / core-workflow

Issue tracker for CPython's workflow
https://mail.python.org/mailman/listinfo/core-workflow

Fork workflow breaks when branch is in upstream #481

Open jaraco opened 1 year ago

jaraco commented 1 year ago

Today I was working on an issue in miss-islington, in particular on https://github.com/python/miss-islington/pull/590. Per the devguide, I have the repo configured with a fork:

 miss-islington main $ git remote -v
origin  https://github.com/jaraco/miss-islington.git (fetch)
origin  https://github.com/jaraco/miss-islington.git (push)
upstream        https://github.com/python/miss-islington (fetch)
upstream        https://github.com/jaraco/miss-islington.git (push)
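
For reference, that layout matches the devguide's fork workflow; it can be produced with something along these lines (a sketch adapted to this repo rather than quoted from the devguide):

 # clone the fork as "origin"
 git clone https://github.com/jaraco/miss-islington.git
 cd miss-islington
 # add the real repo as "upstream" for fetching
 git remote add upstream https://github.com/python/miss-islington
 # point upstream's push URL back at the fork, so a plain "git push" can't touch the real repo
 git remote set-url --push upstream https://github.com/jaraco/miss-islington.git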

I used the gh tool to check out the pull request:

 miss-islington main $ gh pr checkout 590
branch 'update-ci-py-versions' set up to track 'upstream/update-ci-py-versions'.
Switched to a new branch 'update-ci-py-versions'

I made a revision and then wanted to push it back to the branch I had pulled it from. However, because upstream's push URL is configured to point at jaraco/miss-islington, the push didn't go to the original branch but to my fork instead:

 miss-islington update-ci-py-versions $ git commit -a -m "Pin kombu on Python 3.12. Workaround for celery/kombu#1600"
[update-ci-py-versions 8a06689] Pin kombu on Python 3.12. Workaround for celery/kombu#1600
 1 file changed, 3 insertions(+)
 miss-islington update-ci-py-versions $ git push
Enumerating objects: 5, done.
Counting objects: 100% (5/5), done.
Delta compression using up to 8 threads
Compressing objects: 100% (3/3), done.
Writing objects: 100% (3/3), 1.03 KiB | 1.03 MiB/s, done.
Total 3 (delta 2), reused 0 (delta 0), pack-reused 0
remote: Resolving deltas: 100% (2/2), completed with 2 local objects.
To https://github.com/jaraco/miss-islington.git
   c01dfc5..8a06689  update-ci-py-versions -> update-ci-py-versions

I guess that makes sense. A branch only tracks its configured "remote", not the actual repository the PR came from, and pushes follow that remote's push URL.
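
One way to see ahead of time where a plain git push will actually go is to inspect the branch's remote and that remote's push URL (standard git commands; the values in the comments are what this setup produces):

 git config branch.update-ci-py-versions.remote   # -> upstream
 git remote get-url --push upstream               # -> https://github.com/jaraco/miss-islington.git
 git push --dry-run                               # shows the destination and refs without transferring anything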

As a result, I needed to manually push the changes to the actual upstream, which led to another problem:

 miss-islington update-ci-py-versions $ git push gh://python/cpython
Enumerating objects: 1390, done.
Counting objects: 100% (1390/1390), done.
Delta compression using up to 8 threads
Compressing objects: 100% (499/499), done.
Writing objects: 100% (1390/1390), 318.04 KiB | 318.04 MiB/s, done.
Total 1390 (delta 865), reused 1384 (delta 862), pack-reused 0
remote: Resolving deltas: 100% (865/865), done.
remote: 
remote: Create a pull request for 'update-ci-py-versions' on GitHub by visiting:
remote:      https://github.com/python/cpython/pull/new/update-ci-py-versions
remote: 
To https://github.com/python/cpython
 * [new branch]      update-ci-py-versions -> update-ci-py-versions
 miss-islington update-ci-py-versions $ git push gh://python/cpython :update-ci-py-versions
To https://github.com/python/cpython
 - [deleted]         update-ci-py-versions

Because I needed to manually type the URL, I mistakenly used python/cpython (out of habit) instead of the correct repo (python/miss-islington), so I ended up pushing draft commits for miss-islington to cpython (oops). I hope the deletion effectively undid that change. I then pushed the changes to the intended target.
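
For the record, the ":update-ci-py-versions" refspec (empty source) tells git to delete that branch on the remote, so the accidental branch itself is gone and its commits are left unreferenced. The same deletion can be spelled more explicitly as:

 # equivalent to pushing an empty source to the branch
 git push https://github.com/python/cpython --delete update-ci-py-versions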

 miss-islington update-ci-py-versions $ git push gh://python/miss-islington
Enumerating objects: 5, done.
Counting objects: 100% (5/5), done.
Delta compression using up to 8 threads
Compressing objects: 100% (3/3), done.
Writing objects: 100% (3/3), 1.03 KiB | 1.03 MiB/s, done.
Total 3 (delta 2), reused 0 (delta 0), pack-reused 0
remote: Resolving deltas: 100% (2/2), completed with 2 local objects.
To https://github.com/python/miss-islington
   c01dfc5..8a06689  update-ci-py-versions -> update-ci-py-versions

That worked and triggered the CI run again.

What I wonder is: is there a way to configure the fork workflow such that when one checks out a PR from the upstream repo, the branch is tracked (and pushed back) properly?

I imagine one could configure upstream to also push to the upstream repo, but that also increases the risk someone would accidentally push a branch to upstream.
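
Concretely, I can see two shapes this could take (sketches; the upstream-rw name is just an example):

 # Option A: let "upstream" push to the real repo again
 # (undoes the safety that the devguide's set-url --push provides)
 git remote set-url --push upstream https://github.com/python/miss-islington

 # Option B: keep upstream as-is and add a separate, deliberately named push-capable remote,
 # so pushing to the real repo always has to be spelled out
 git remote add upstream-rw https://github.com/python/miss-islington
 git push upstream-rw update-ci-py-versions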

CAM-Gerlach commented 1 year ago

I imagine one could configure upstream to also push to the upstream repo

Personally, as someone who's always used the fork workflow, I just keep my upstream pointing to the actual upstream. However, I have upstream set up to use HTTPS while pushes to my fork go over SSH, so just in case I do make a mistake, I'm prompted for my username and password before anything happens, making it clear I've made a mistake.
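
That arrangement looks roughly like this (a sketch; <username> is a placeholder):

 git remote -v
 origin    git@github.com:<username>/miss-islington.git (fetch)
 origin    git@github.com:<username>/miss-islington.git (push)
 upstream  https://github.com/python/miss-islington (fetch)
 upstream  https://github.com/python/miss-islington (push)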

but that also increases the risk someone would accidentally push a branch to upstream.

Actually, it's no longer possible to do this accidentally (for non-admins/RMs) now that python/core-workflow#460 is implemented on cpython (and devguide, peps, this repo, etc.), so you should be safe there even without doing anything special on your end.