@AlexTugarev Could this be related to the token change? 🤔 https://github.com/gitpod-io/gitpod/pull/7837
I've updated the issue with a link to a repo that can be cloned.
I have a similar issue with a private GitHub repo and the following error:

```
batch response: Bad credentials
error: failed to fetch some objects from 'repo'
```

EDIT: I can confirm that private vs. public changes the behavior.
Yep, reproduced on GitHub. It looks like git-lfs (v2.9.2 in both cases) behaves differently against GitHub than against GitLab despite identical config: the GitHub run does a `git credential fill` before the first request, while the GitLab run waits for the first 401 before filling credentials. The GitLab run then retries repeatedly until it is 403'd (the server banning the IP); GitHub's servers seem to pre-empt this loop by returning a 403 instead of a 401 in the first place.
Debugging commands used:

```
GIT_TRACE=1 GIT_CURL_VERBOSE=1 git lfs fetch
git config -l | grep lfs | sort
```

The config output is the same on both (as expected).
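To make the two flows concrete, here is an illustrative TypeScript sketch of the behavior described above. git-lfs itself is written in Go, and the names `fetchEagerly`/`fetchLazily` and the credential stub are mine, not git-lfs code:

```ts
import { Buffer } from "node:buffer";

type Creds = { username: string; password: string };

// Stand-in for `git credential fill`; in reality git asks the configured
// credential helper(s) for a username/password for this URL.
async function credentialFill(url: string): Promise<Creds> {
    return { username: "user", password: "token" };
}

function authHeader(c: Creds): Record<string, string> {
    const basic = Buffer.from(`${c.username}:${c.password}`).toString("base64");
    return { Authorization: `Basic ${basic}` };
}

// Flow observed against GitHub: credentials are filled BEFORE the first request,
// so a bad token produces a single failed, authenticated request.
async function fetchEagerly(url: string): Promise<Response> {
    const creds = await credentialFill(url);
    return fetch(url, { headers: authHeader(creds) });
}

// Flow observed against GitLab: the first request is anonymous, and only a 401
// triggers a credential fill. With bad credentials every retry gets another 401,
// so the loop continues until the server bans the IP and starts returning 403,
// which is treated as terminal and exits the loop.
async function fetchLazily(url: string): Promise<Response> {
    let res = await fetch(url);
    while (res.status === 401) {
        const creds = await credentialFill(url);
        res = await fetch(url, { headers: authHeader(creds) });
    }
    return res;
}
```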
> @AlexTugarev Could this be related to the token change? 🤔 #7837
It's worth noting in that MR that the (effectively new) line

```ts
return tokenEntries.sort((a, b) => `${a.token.updateDate}`.localeCompare(`${b.token.updateDate}`))[0]?.token;
```

will prefer the oldest updateDate, which is presumably the wrong token, unless something else is making sure all tokens are valid.
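To illustrate the ordering concern, here is a self-contained sketch; the `TokenEntry` shape is assumed for illustration and is not the actual Gitpod type:

```ts
// Assumed minimal shape, for illustration only.
interface TokenEntry {
    token: { value: string; updateDate: string };
}

const tokenEntries: TokenEntry[] = [
    { token: { value: "older", updateDate: "2021-06-01T00:00:00.000Z" } },
    { token: { value: "newer", updateDate: "2022-01-27T00:00:00.000Z" } },
];

// localeCompare returns a negative number when `a` sorts before `b`, so this
// sorts ascending by updateDate, and [0] picks the OLDEST token.
const picked = tokenEntries
    .sort((a, b) => `${a.token.updateDate}`.localeCompare(`${b.token.updateDate}`))[0]?.token;
console.log(picked?.value); // "older"

// Swapping the operands sorts descending, so [0] would pick the NEWEST token.
const newest = tokenEntries
    .sort((a, b) => `${b.token.updateDate}`.localeCompare(`${a.token.updateDate}`))[0]?.token;
console.log(newest?.value); // "newer"
```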
This bug is no longer an issue:

- confirmed no longer reproducing on private repos in both GitLab and GitHub

As no work seems to have been done to directly address this issue, I looked into the commits since the report to determine the cause and the fix. The following is the most likely place:

Changes to `components/gitpod-cli/cmd/credential-helper.go`

- incidentally broken by: https://github.com/gitpod-io/gitpod/commit/9a7411f0e92e239142943a8aa07f4198736419eb
- incidentally fixed by: https://github.com/gitpod-io/gitpod/commit/f042238d861826f8f55f5e2741690dc8292dc980

Noteworthy:

```
gp credential-helper get https://gitlab.com/<my repo>
```

fails to exit. This breaks scripts that diagnose git issues; I'm unsure whether it may also break git.
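For context on why a hanging `get` matters: git's credential helper protocol has git invoke the helper with an action argument, write `key=value` attributes to its stdin terminated by a blank line, and then block until the helper prints any credentials and exits. Here is a minimal conforming helper sketched in TypeScript (illustrative only; Gitpod's actual helper is the Go program in components/gitpod-cli/cmd/credential-helper.go):

```ts
#!/usr/bin/env node
// Minimal git credential helper sketch (illustrative; not Gitpod's implementation).
// git invokes it as `<helper> get` and writes attributes on stdin.
import * as readline from "node:readline";

async function main(): Promise<void> {
    const action = process.argv[2]; // "get", "store", or "erase"
    const attrs: Record<string, string> = {};

    const rl = readline.createInterface({ input: process.stdin });
    for await (const line of rl) {
        if (line === "") break; // a blank line terminates the attribute list
        const eq = line.indexOf("=");
        if (eq > 0) attrs[line.slice(0, eq)] = line.slice(eq + 1);
    }
    rl.close();

    if (action === "get") {
        // A real helper would look up a token for attrs.host here;
        // these values are placeholders.
        process.stdout.write("username=oauth2\n");
        process.stdout.write("password=<token>\n");
    }
    // Crucially, the process must now exit; if it blocks here (as the reported
    // `gp credential-helper get` hang did), git and any diagnostic script
    // invoking the helper will wait forever.
    process.exit(0);
}

main();
```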
Thank you for the detailed feedback, @NormalGaussian. This issue should have appeared in our inbox for triage and been handled with much more active communication. We will do what we can to improve that going forward.

If that's related to multiple tokens (which is in itself something to be solved, potentially by simplifying the multi-meta setup), I just created https://github.com/gitpod-io/gitpod/pull/8093, which addresses the issue from @NormalGaussian's comment.
@NormalGaussian I am very sorry for the lack of attention. The fix https://github.com/gitpod-io/gitpod/pull/8093 seems to address this and is scheduled for deployment on Thursday at 9am CET. It would be great if you could confirm then that it works.
> It would be great if you could confirm then that it works.
It already works. From this comment:

> This bug is no longer an issue:
>
> - confirmed no longer reproducing on private repos in both GitLab and GitHub
>
> As no work seems to have been done to directly address this issue, I looked into the commits since the report to determine the cause and the fix. The following is the most likely place:
>
> Changes to `components/gitpod-cli/cmd/credential-helper.go`
>
> - incidentally broken by: https://github.com/gitpod-io/gitpod/commit/9a7411f0e92e239142943a8aa07f4198736419eb
> - incidentally fixed by: https://github.com/gitpod-io/gitpod/commit/f042238d861826f8f55f5e2741690dc8292dc980
I have no way to verify https://github.com/gitpod-io/gitpod/pull/8093, as it doesn't seem to have been the source of my issue. I would also emphasise that my comments on that code were an outsider's passive observation of the linked MR. I don't want to intrude too much, but my spidey sense says it's worth @AlexTugarev confirming that he reversed the order after checking that the tokens were incorrectly ordered, and not just as a reaction to the observation (the lack of a unit test or an affirming investigation is what triggers me here). Without knowing the code, getting this wrong seems to be either a non-issue (both tokens are always valid) or a major one (one token is meant to replace the other).
Either way, I'm happy this is resolved and won't object to you all closing this (I'm only leaving it open because I don't know GitHub well enough to know whether @AlexTugarev will get a notification if I close it).
Thanks for confirming that and for the additional pointers, @NormalGaussian. cc: @AlexTugarev
### Bug description

On the morning of Jan 27th, GitLab repos using git-lfs started failing with 403 errors.
### Steps to reproduce

Start a workspace from a private GitLab repo that uses git-lfs; fetching the LFS objects will fail with a 403.

Notes:
### Workspace affected

Any workspace based on a private GitLab repo using git-lfs.
### Expected behavior

git-lfs commands should complete without the 403.
### Example repository

Clone https://gitlab.com/Normal_Gaussian/gitpod-lfs-test.git and make your clone private.
### Anything else?

No response