uber / tchannel-go

Go implementation of a multiplexing and framing protocol for RPC calls
http://uber.github.io/tchannel/
MIT License

Golang Dep Issue with yarpc-go & tchannel-go #684

Closed · kevinguard closed this issue 6 years ago

kevinguard commented 6 years ago

I'm using yarpc in an open source project that uses golang/dep for dependency management. When I run dep ensure -v, the fetch freezes and the solve eventually fails with the following output:

(20)  ? attempt go.uber.org/yarpc with 3 pkgs; 102 versions to try
(20)      try go.uber.org/yarpc@v1.27.2
(20)  ✓ select go.uber.org/yarpc@v1.27.2 w/21 pkgs
(21)  ? attempt gopkg.in/validator.v2 with 1 pkgs; at least 1 versions to try
(21)      try gopkg.in/validator.v2@v2
(21)  ✓ select gopkg.in/validator.v2@v2 w/1 pkgs
(22)  ? attempt gopkg.in/yaml.v2 with 1 pkgs; at least 1 versions to try
(22)      try gopkg.in/yaml.v2@v2
(22)  ✓ select gopkg.in/yaml.v2@v2 w/1 pkgs
(23)  ? attempt gopkg.in/mgo.v2 with 1 pkgs; at least 1 versions to try
(23)      try gopkg.in/mgo.v2@v2
(23)  ✓ select gopkg.in/mgo.v2@v2 w/2 pkgs
(24)  ? revisit golang.org/x/sys to add 1 pkgs
(24)    ✓ include 1 more pkgs from golang.org/x/sys@master
(24)  ? attempt go.uber.org/multierr with 1 pkgs; at least 1 versions to try
(25)      try go.uber.org/multierr@v1.1.0
(25)  ✓ select go.uber.org/multierr@v1.1.0 w/1 pkgs
(25)  ? attempt github.com/uber-go/tally with 1 pkgs; 17 versions to try
(26)      try github.com/uber-go/tally@v3.3.2
(26)  ✓ select github.com/uber-go/tally@v3.3.2 w/1 pkgs

Signal received: waiting for 1 ops to complete...
(27)    ← no more versions of github.com/uber/tchannel-go to try; begin backtrack
  ✗ solving failed
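
For reference, one way projects typically unblock this kind of backtracking in dep, assuming the conflict really is on tchannel-go, is an override in the consuming project's Gopkg.toml. This is only a sketch; the version below is illustrative, not a recommendation:

    # Sketch of an override in the consuming project's Gopkg.toml (illustrative).
    # Pinning tchannel-go to a single released version keeps dep's solver from
    # backtracking across revisions with conflicting constraints.
    [[override]]
      name = "github.com/uber/tchannel-go"
      version = "=1.10.0"  # illustrative version; use whichever release your build needs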

Disclaimer: I am an Uber employee (@kaveh)

prashantv commented 6 years ago

I don't think there's anything in TChannel that needs to change: the glide.yaml has reasonable version constraints for all dependencies, with no branch or tag pins.
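
For context, a glide.yaml entry with plain semver constraints and no branch or tag pins looks roughly like the sketch below; the packages and ranges are illustrative, not TChannel's actual manifest:

    # Illustrative glide.yaml fragment (not the real tchannel-go manifest):
    package: github.com/uber/tchannel-go
    import:
      - package: github.com/uber-go/atomic
        version: ^1.0.0            # semver range, no branch or tag pin
      - package: github.com/opentracing/opentracing-go
        version: ^1.0.0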

Is there a concrete action item for TChannel here?

kevinguard commented 6 years ago

@prashantv We discussed this in an internal thread and have identified the root cause. We can safely close this. Thanks!