niemeyer / gopkg

Source code for the gopkg.in service.

"cannot talk to GitHub" #63

Closed shivamMg closed 6 years ago

shivamMg commented 6 years ago

For some reason gopkg.in cannot talk to GitHub; the requests time out. GitHub itself is up, so it must be something between your server and GitHub.

screenshot_20180911_141120

niemeyer commented 6 years ago

Right, GitHub is not accepting connections from gopkg.in right now. I'm in touch with support trying to figure out why and get it sorted.

niemeyer commented 6 years ago

As a side note, if you have a good contact at GitHub for that kind of issue, please raise attention so we can get it back up ASAP. If they take too long to respond I may end up switching IP addresses for a while so they can take their time to evaluate the issue, but I'd prefer to sort it the proper way instead.

sysradium commented 6 years ago

Unfortunately no :( Maybe switching IPs is the way to go for now...

niemeyer commented 6 years ago

We have a hot spare at https://p3.gopkg.in/yaml.v2 ready to switch. Let's give them another moment to evaluate the issue with the domain pointing to the right location, and if it takes too long we switch.

niemeyer commented 6 years ago

From GitHub:

I'm speaking with our infrastructure teams and we've just updated our status while we work on this. I'll follow up with you as soon as we have an update.

niemeyer commented 6 years ago

Reflected in https://status.github.com/

niemeyer commented 6 years ago

Update from support:

We think we found the problem machine and are looking for a root cause.

Seems to be working now. Might be fixed, but given that feedback we might still see a few bumps.

niemeyer commented 6 years ago

And it's sorted:

We've just deployed some changes to resolve this and I can see that https://gopkg.in/mgo.v2 is now loading again. We've updated our status back to green.

Sorry for the trouble this caused you and your users. Don't hesitate to follow up if you're still seeing any problems or if you have any questions.

sysradium commented 6 years ago

@niemeyer unfortunately the problem is back

niemeyer commented 6 years ago

There we go again... just got in touch with them again.

Ads20000 commented 6 years ago

Problems have been reflected on GitHub System Status

andrewslotin commented 6 years ago

It looks like the same issue is happening again. I'm currently experiencing the same behaviour while trying to install gometalinter.v2:

$ curl https://gopkg.in/alecthomas/gometalinter.v2
Cannot obtain refs from GitHub: cannot talk to GitHub: Get https://github.com/alecthomas/gometalinter.git/info/refs?service=git-upload-pack: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
niemeyer commented 6 years ago

@andrewslotin Please see the last few messages above.

It also looks like it's almost sorted, from their status page:

(screenshot of GitHub's status page)

baha-ai commented 6 years ago

The GitHub issue is still persisting. https://github.com/golang/go/issues/27622 was closed, but my build is still failing.

Thanks!

mdittmer commented 6 years ago

Is this a special case of this issue?

go get -u gopkg.in/src-d/go-git.v4/...                     
# cd [...]/src/gopkg.in/src-d/go-git.v4; git pull --ff-only                           
remote: Cannot obtain refs from GitHub: cannot talk to GitHub: Get https://github.com/src-d/go-git.git/info/refs?service=git-upload-pack: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
fatal: unable to access 'https://gopkg.in/src-d/go-git.v4/': The requested URL returned error: 502
package gopkg.in/src-d/go-git.v4: exit status 1
package gopkg.in/check.v1: unrecognized import path "gopkg.in/check.v1" (parse https://gopkg.in/check.v1?go-get=1: no go-import meta tags ())
package gopkg.in/src-d/go-git-fixtures.v3: unrecognized import path "gopkg.in/src-d/go-git-fixtures.v3" (parse https://gopkg.in/src-d/go-git-fixtures.v3?go-get=1: no go-import meta tags ())
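For context on those last two errors: when gopkg.in is healthy, the ?go-get=1 endpoint returns an HTML page with a go-import meta tag that the go tool parses, and the 502 error page has none. A quick sanity check (the output shown is roughly what a healthy response contains):

$ curl -s 'https://gopkg.in/check.v1?go-get=1' | grep go-import
<meta name="go-import" content="gopkg.in/check.v1 git https://gopkg.in/check.v1">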
baha-ai commented 6 years ago

Looks like it; it's a general GitHub HTTP 502 error.

esetnik commented 6 years ago

@niemeyer can you fail over to hot spare https://p3.gopkg.in/yaml.v2 please?

niemeyer commented 6 years ago

@esetnik Since this is a more general issue on GitHub and the issue comes and goes, we have no guarantees that shifting won't present the same problem. Also, GitHub is using our service to debug their outage, so I'm keen on not masking the problem for them so it can be fixed faster and for good.

Per their support:

It looks like https://gopkg.in/mgo.v2 is accessible at the moment and our team are making use of this URL as they work to resolve the issue. I'll follow up again when I have another update for you.

We also had an update moments ago:

(screenshot of the status update)

adria0 commented 6 years ago

Is there any known workaround for the Go module system? Like adding/modifying something in go.mod?

esetnik commented 6 years ago

I'm not sure what's different about your connection, but I definitely cannot resolve https://gopkg.in/mgo.v2. I think it's a bit unreasonable for the GitHub team to assume third parties won't alter their own infrastructure to do everything they can to mitigate the upstream effects of the outage for their users.

brandoncole commented 6 years ago

We haven't been able to build our packages in CircleCI for 8+ hours because of this GitHub issue. We might be able to override some IPs in the build containers, but this is a really crappy situation to be in. I'm experiencing all the issues pointed out above while this occurs on the GitHub side.

glb commented 6 years ago

Issue appears to be resolved from here at least! 🎉 My build just passed.

(screenshot of the passing build)
baha-ai commented 6 years ago

same here... but hopefully it's not intermittent.

sublee commented 6 years ago

The GitHub status page says everything has been operating normally for the last 6 hours ("한국 표준시" in the screenshot means KST, UTC+9):

(screenshot of the GitHub status page)

When I visit http://gopkg.in/yaml.v2 in a Web browser, it works as well. But my build in Travis CI is still failing (gin-gonic depends on go-yaml in gopkg.in):

$ dep ensure -update github.com/hangulize/hangulize
Solving failure:
    (1) failed to list versions for https://gopkg.in/yaml.v2: remote: Not Found
fatal: repository 'https://github.com/go-yaml/yaml/' not found
: exit status 128
    (2) failed to list versions for http://gopkg.in/yaml.v2: remote: Not Found
fatal: repository 'http://github.com/go-yaml/yaml/' not found
: exit status 128

Does anyone have an idea about what's causing this?

jney commented 6 years ago

Maybe they have a cache. I just opened an issue with Travis.

niemeyer commented 6 years ago

Apparently they've solved the issue for good, so I'm closing this one as well.

Sorry for the trouble everyone.

sublee commented 6 years ago

My problem https://github.com/niemeyer/gopkg/issues/63#issuecomment-420475234 was not due to this service issue. The cause was the GIT_HTTP_USER_AGENT customized by Travis CI's deploy stage: Travis CI sets GIT_HTTP_USER_AGENT=travis/0.1.0 dpl/1.10.0 git/2.15.1 only in the deploy stage.

GitHub's git HTTPS URL without .git, like https://github.com/go-yaml/yaml rather than https://github.com/go-yaml/yaml.git, fails when the request carries an unexpected User-Agent header.

$ GIT_HTTP_USER_AGENT='' go get gopkg.in/yaml.v2
# cd .; git clone https://gopkg.in/yaml.v2 .../go/src/gopkg.in/yaml.v2
Cloning into '.../go/src/gopkg.in/yaml.v2'...
error: RPC failed; HTTP 422 curl 22 The requested URL returned error: 422 Unprocessable Entity
fatal: The remote end hung up unexpectedly
package gopkg.in/yaml.v2: exit status 128
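A minimal sketch of the kind of workaround this points at, assuming your deploy stage lets you clear the variable so git falls back to its default User-Agent (the package here is only an example):

$ unset GIT_HTTP_USER_AGENT
$ go get gopkg.in/yaml.v2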
acloudiator commented 5 years ago

GitHub status seems all good today, but I encountered this issue again.

package gopkg.in/inf.v0: unrecognized import path "gopkg.in/inf.v0" (https fetch: Get https://gopkg.in/inf.v0?go-get=1: dial tcp 35.196.143.184:443: i/o timeout)
package gopkg.in/mgo.v2: unrecognized import path "gopkg.in/mgo.v2" (https fetch: Get https://gopkg.in/mgo.v2?go-get=1: dial tcp 35.196.143.184:443: i/o timeout)
package gopkg.in/mgo.v2/bson: unrecognized import path "gopkg.in/mgo.v2/bson" (https fetch: Get https://gopkg.in/mgo.v2/bson?go-get=1: dial tcp 35.196.143.184:443: i/o timeout)
adityaalifn commented 2 years ago

@niemeyer it is happening again intermittently

Screen Shot 2021-11-15 at 14 43 07
aagamdoshi commented 2 years ago
Facing the error below:

remote: Cannot obtain refs from GitHub: cannot talk to GitHub: Get https://github.com/go-yaml/yaml.git/info/refs?service=git-upload-pack: net/http: request canceled (Client.Timeout exceeded while awaiting headers)

@niemeyer 
jivot commented 2 years ago

+1, our CI/CD jobs are failing with something similar

go: github.com/rubenv/sql-migrate@v0.0.0-20200616145509-8d140a17f351 requires
    gopkg.in/gorp.v1@v1.7.2: unrecognized import path "gopkg.in/gorp.v1": reading https://gopkg.in/gorp.v1?go-get=1: 502 Bad Gateway
    server response: Cannot obtain refs from GitHub: cannot talk to GitHub: Get https://github.com/go-gorp/gorp.git/info/refs?service=git-upload-pack: net/http: request canceled (Client.Timeout exceeded while awaiting headers)
niemeyer commented 2 years ago

I'm looking into it. Given the multiple reports, it's of course a real issue, but it's not easy to debug on my end, as this looks like GitHub failing to respond to the request in time. If I can't find anything here, I'll reach out to GitHub and see if someone can help me.

kulkarnisamr commented 2 years ago

looks like it's working now.

update: spoke too soon, it's down again.

SamuliVirtapohja commented 2 years ago

Our CI/CD pipeline seems to pass currently.

Has there been any update or post about this somewhere?

niemeyer commented 2 years ago

We couldn't find any relevant information on our end that could explain timeouts with GitHub. I've filed a bug with GitHub, which remains silent. Their status page is also quiet about any issues yesterday, so I'm somewhat empty handed.

Would anyone here happen to have a precise timestamp for when the problem was observed, from your CIs?

SamuliVirtapohja commented 2 years ago

Success 15.11.2021 at 12:21 PM GMT+2
Success 15.11.2021 at 12:36 PM GMT+2 <- incidents happen from here on out
Failed 15.11.2021 at 12:40 PM GMT+2
Failed 15.11.2021 at 12:41 PM GMT+2
Failed 15.11.2021 at 12:47 PM GMT+2
Failed 15.11.2021 at 1:02 PM GMT+2
Failed 15.11.2021 at 1:49 PM GMT+2
Success 15.11.2021 at 2:26 PM GMT+2
Failed 15.11.2021 at 3:06 PM GMT+2
Success 15.11.2021 at 3:22 PM GMT+2
Success 15.11.2021 at 4:38 PM GMT+2
Failed 15.11.2021 at 5:00 PM GMT+2
Failed 15.11.2021 at 5:59 PM GMT+2 <- This is the last failed run that our logs show
Success 16.11.2021 at 8:48 AM GMT+2 <- Success from here on out

There were also some issues with our provider's pipelines at that time, but this is the closest thing I could find: https://status.dev.azure.com/_event/272370766

(also, sorry for the poor formatting)

kulkarnisamr commented 2 years ago

The earliest run that failed for me was around Nov 15, 2021 2:38 AM (UTC-8:00)

niemeyer commented 2 years ago

Thanks for the data, folks. Still zero feedback from GitHub. My best guess at the moment is that there was some intermittent hiccup there and it's been addressed. For future cases, please note that any such hiccup in GitHub ends up showing up as a failure from gopkg.in itself. In such cases the log is clear about what particular operation is failing and how.

Feel free to report here anyway, though, especially if you check https://www.githubstatus.com/ and nothing is being reported there. It's useful to get some perspective on how it's failing and where.

Finally, just for some comfort: Canonical has 24x7 monitoring on this service, and a team that takes good care of it. So actual issues in gopkg.in itself are going to be looked after.

niemeyer commented 2 years ago

Some feedback from GitHub:

(screenshot of GitHub's reply)

niemeyer commented 2 years ago

Over the last few days we've also been doing some analysis with help from the IS team at Canonical, and we found that a significant percentage of requests, amounting to over one million requests a day, comes from one small for-profit organization. We'll probably introduce a quota to encourage cases like this to be designed more carefully.

LKay commented 2 years ago

This is happening again... Any chance it will be resolved and gopkg.in becomes stable? It's been happening randomly for over 2 years; as a result, packages hosted here are not reliable in a production environment.

lgosse commented 2 years ago

Same here for about two hours

SamuliVirtapohja commented 2 years ago

Yup, just checking in here. Happened to us too.

kaushiknag90 commented 2 years ago

This is happening to us as well and causing a lot of failures in our CI pipeline while downloading packages to build our binaries. Could you please help us with a workaround or a permanent fix?

co60ca commented 2 years ago

Would replacing the problematic source with a GitHub link instead, using replace in go.mod, help?
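Something along these lines, for example (the module paths and version are illustrative, and whether it works depends on the go.mod declared by the GitHub copy; some gopkg.in packages declare the gopkg.in path as their module path, and the go tool rejects that mismatch):

replace gopkg.in/gorp.v1 => github.com/go-gorp/gorp v1.7.2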

SamuliVirtapohja commented 2 years ago

Would replacing the problematic source with a GitHub link instead, using replace in go.mod, help?

I mean, it would help, but the amount of work required for something that isn't a long-term solution is not realistic.

niemeyer commented 2 years ago

Could you please help us with a workaround or a permanent fix?

I cannot fix GitHub permanently. I cannot even fix it temporarily. The service provided by gopkg.in depends entirely on GitHub itself being able to serve requests. I have no visibility on how GitHub dispatches requests internally, but what I'm doing right now is adding a more strict quota on organizations that are clearly abusing the system so that we can try to make GitHub happier about the requests.
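Purely to illustrate what "a more strict quota" means in practice (this is not the actual gopkg.in code; the limits, the keying by client address, and all names below are assumptions), a per-client token bucket in front of an HTTP handler looks roughly like this:

package main

import (
	"log"
	"net/http"
	"sync"

	"golang.org/x/time/rate"
)

// quota keeps one token bucket per client key (here, the remote address).
type quota struct {
	mu      sync.Mutex
	buckets map[string]*rate.Limiter
}

func newQuota() *quota {
	return &quota{buckets: make(map[string]*rate.Limiter)}
}

// limiter returns the bucket for key, creating it on first use:
// roughly 5 requests per second with a burst of 20 (made-up numbers).
func (q *quota) limiter(key string) *rate.Limiter {
	q.mu.Lock()
	defer q.mu.Unlock()
	l, ok := q.buckets[key]
	if !ok {
		l = rate.NewLimiter(5, 20)
		q.buckets[key] = l
	}
	return l
}

// middleware rejects over-quota requests with 429 before any work
// (such as talking to GitHub) is done on their behalf.
func (q *quota) middleware(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if !q.limiter(r.RemoteAddr).Allow() {
			http.Error(w, "quota exceeded", http.StatusTooManyRequests)
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	q := newQuota()
	mux := http.NewServeMux()
	mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("ok\n"))
	})
	log.Fatal(http.ListenAndServe(":8080", q.middleware(mux)))
}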

niemeyer commented 2 years ago

Just to confirm that this is indeed the same issue, would anyone have the error message and timestamp at hand?

SamuliVirtapohja commented 2 years ago

Can provide them tomorrow.

kaushiknag90 commented 2 years ago

Some of the same error messages and times are below. The timestamps are in IST.

[Container] 2021/11/25 16:01:00 Step 14/27 : RUN go mod download
 ---> Running in c5d727371cc6
go: gopkg.in/yaml.v2@v2.3.0 requires
    gopkg.in/check.v1@v0.0.0-20161208181325-20d25e280405: invalid version: git fetch -f origin refs/heads/*:refs/heads/* refs/tags/*:refs/tags/* in /go/pkg/mod/cache/vcs/9241c28341fcedca6a799ab7a465dd6924dc5d94044cbfabb75778817250adfc: exit status 128:
    remote: Cannot obtain refs from GitHub: cannot talk to GitHub: Get https://github.com/go-check/check.git/info/refs?service=git-upload-pack: net/http: request canceled (Client.Timeout exceeded while awaiting headers)
fatal: unable to access 'https://gopkg.in/check.v1/': The requested URL returned error: 502
The command '/bin/sh -c go mod download' returned a non-zero code: 1

[Container] 2021/11/25 20:45:33 Running command make unit-test
go test -count=1 -covermode=atomic -coverprofile=coverage-unit.out ./...
go: bitbucket.org/swigy/xp-go-client@v0.0.0-20210728131314-2270d5ec76f6 requires
    gopkg.in/sourcemap.v1@v1.0.5: reading gopkg.in/sourcemap.v1/go.mod at revision v1.0.5: unknown revision v1.0.5
Makefile:21: recipe for target 'unit-test' failed
make: *** [unit-test] Error 1