NuGet / Home

Repo for NuGet Client issues

Allow users to determine package resolution strategy during package restore - direct or transitive #5553

Open Wil73 opened 7 years ago

Wil73 commented 7 years ago

I am not going to debate if highest or lowest should be the default: leave the actual behavior up to the consumer.

Why on earth should Microsoft FORCE users to adopt the "safest is best" strategy in the first place?

Is there any reason why we cannot add a switch to nuget.exe and NPM that allows the user to decide their own package restore strategy?

This way consumers of nuget can align nuget behavior to their own business strategies, be they conservative or aggressive.

What I don't understand is why Microsoft has to decide this for consumers in the first place.

To me this is a straight up consumer decision based on private business strategy and policy.

It makes absolutely no sense why a tool vendor should allow themselves to unilaterally decide business strategy for all nuget consumers on the planet without an option to override that.

It's been two years and there's no apparent movement on this issue.

Leave lowest version as the default strategy if you want Microsoft, but for goodness sakes offer a switch to allow consumers to make their own business decisions.

davidfowl commented 7 years ago

I am not going to debate if highest or lowest should be the default: leave the actual behavior up to the consumer.

Lowest should be the default, but that's just my opinion. It's the best way to get a "working set of packages". Everyone in the graph gets what they compiled against and things only roll forward minimally if sibling packages need higher versions.

Why on earth should Microsoft FORCE users to adopt the "safest is best" strategy in the first place?

It's more of a "safest is the default" strategy, because people want to get a working set of packages by default. In a system where you need binary compatibility between components in order to make things function, things have a better chance of working if you stay close to what components build and test against.
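The "minimal roll-forward" behavior described above can be sketched in a few lines (a toy illustration of the idea, not NuGet's actual resolver; package names and versions are made up):

```python
# Toy sketch of "lowest applicable version" resolution -- an illustration
# of the idea, not NuGet's real algorithm.

def resolve_lowest(lower_bounds, available):
    """lower_bounds: {package: [minimum version tuple from each dependent]}
    available: {package: list of published version tuples}"""
    resolved = {}
    for pkg, mins in lower_bounds.items():
        floor = max(mins)  # must satisfy the strictest lower bound
        # lowest published version that still satisfies it
        resolved[pkg] = min(v for v in available[pkg] if v >= floor)
    return resolved

available = {"Newtonsoft.Json": [(6, 0, 8), (9, 0, 1), (10, 0, 1)]}
# Package A compiled against >=6.0.8, sibling B needs >=9.0.1:
bounds = {"Newtonsoft.Json": [(6, 0, 8), (9, 0, 1)]}
print(resolve_lowest(bounds, available))  # rolls forward minimally to (9, 0, 1)
```

The version only moves past 6.0.8 because a sibling demands 9.0.1; 10.0.1 is never chosen even though it is published.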

Is there any reason why we cannot add a switch to nuget.exe and NPM that allows the user to decide their own package restore strategy?

This is already possible when using packages.config based projects by picking the dependency behavior on install:

[screenshot: the dependency behavior dropdown in the Visual Studio package install options]

We don't own NPM, that's the node package manager, not sure if you mean something else.

To properly do this with PackageReference, a lock file needs to be introduced (which has been discussed anyways).
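For reference, the packages.config-era behavior shown in the dialog can also be set outside Visual Studio: `nuget.exe install` accepts a `-DependencyVersion` switch (Lowest, HighestPatch, HighestMinor, Highest), and the same value can be pinned repo-wide in nuget.config. A sketch (note this applies to packages.config projects only, not PackageReference):

```xml
<!-- nuget.config: sets the dependency resolution strategy for
     packages.config installs/restores in this repo -->
<configuration>
  <config>
    <!-- allowed values: Lowest (default), HighestPatch, HighestMinor, Highest -->
    <add key="dependencyVersion" value="Highest" />
  </config>
</configuration>
```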

Wil73 commented 7 years ago

I am not going to debate what the default ought to be. I really am not concerned over that. Technically this is NOT a default when you can't configure it otherwise. It's only a default if you can change it. We can't.

My company is frustrated that Microsoft will not allow businesses to DECIDE Nuget resolution behavior for themselves.

It makes absolutely no sense why Microsoft should have a monopoly on the business policies of the private sector: this is in effect what Microsoft does here.

Make package resolution configurable via a simple switch.

By NPM I mean the NuGet Package Manager in Visual Studio.

I don't see why any lock file is required. Just add an additional switch in nuget.exe and Nuget Package Manager that indicates if lowest, highest etc should be the expected behavior.

Packages.config is for NuGet 1 and 2. That's well deprecated by now.

This is clearly a business decision that Microsoft currently has constrained to Microsoft to decide.

There are many switches in Nuget already. I don't understand why we cannot simply add one more.

I fail to understand why it takes over two years for Microsoft to resolve this situation. The current workaround is to explicitly declare EVERY dependency in package references (for NuGet 4) and in project.json (for NuGet 3). This is a clumsy and labor-intensive workaround.

Just add a darn switch to indicate Nuget's expected resolution behavior so users can manage it themselves.

It makes no sense why Microsoft must control this in the first place.

Just add a switch.

davidfowl commented 7 years ago

I don't see why any lock file is required. Just add an additional switch in nuget.exe and Nuget Package Manager that indicates if lowest, highest etc should be the expected behavior.

It's required with PackageReference because there's no way to make restore deterministic without it. If there's no asset you can commit to source control to lock dependencies, then restoring the latest versions of all dependencies could give dramatically different results build to build. That might not be an issue if you tightly control the feeds, but it is in the general case, so just adding a switch isn't sufficient for the masses. This feature needs to be paired with a lock file to make it sane to use for the majority of customers.
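The determinism argument reduces to something very small (a deliberately trivial sketch; version tuples are made up): an unlocked "latest" restore is a function of feed state, so its answer changes whenever the feed does, while a lock file pins the earlier answer.

```python
# Sketch: floating to "latest" makes restore a function of the feed's
# current state; a lock file pins a previously resolved answer.

def restore(feed, lock=None):
    return lock if lock is not None else max(feed)

feed_monday = [(1, 0, 0), (1, 1, 0)]
feed_tuesday = feed_monday + [(2, 0, 0)]  # someone published overnight

monday = restore(feed_monday)                        # resolves (1, 1, 0)
tuesday = restore(feed_tuesday)                      # resolves (2, 0, 0): build changed
tuesday_locked = restore(feed_tuesday, lock=monday)  # still (1, 1, 0)
```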

Wil73 commented 7 years ago

I think having the flexibility to choose resolution strategies is just as important as having a deterministic build. I do agree a deterministic build is important, but so is flexibility, which has been in short supply for long enough.

There is so much debate as to what the restore default should be (highest or lowest), but the point is irrelevant: higher or lower should be a choice the business makes, not Microsoft.

If I want a predictable build I can always just stick with project.json, check in my project.lock.json file with lock set to true, or just use fixed version numbers (the latter is the most predictable way, but the least flexible and most labor intensive). A package can get de-listed at any moment, so defaulting to lowest version does not guarantee a predictable build output either.

Microsoft needs to add switches to allow business to define their own resolution defaults. Microsoft has no business making these decisions for businesses in the private sector.

davidfowl commented 7 years ago

There is so much debate as to what the restore default should be (highest or lowest), but the point is irrelevant: higher or lower should be a choice the business makes, not Microsoft.

Sure. Like I said, it's possible with packages.config and that functionality should be ported to PackageReference (project.json is pretty much deprecated at this point) along with a way to lock the graph after resolution.

I think having the flexibility to choose resolution strategies is just as important as having a deterministic build. I do agree a deterministic build is important, but so is flexibility, which has been in short supply for long enough.

I think you can have both though. IMO it wouldn't be great if we had a feature that most people couldn't use because we didn't think through the end to end.

If I want a predictable build I can always just stick with project.json, check in my project.lock.json file with lock set to true,

That doesn't work. Locked: true isn't supported anymore (hasn't been for a while).

or just use fixed version numbers (the latter is the most predictable way, but the least flexible and most labor intensive).

Fixed versions of the entire dependency graph would work, but it's extremely labor intensive.

A package can get de-listed at any moment, so defaulting to lowest version does not guarantee a predictable build output either.

I believe this isn't respected when resolving dependencies in project.json so it is deterministic.

Wil73 commented 7 years ago

project.json will resolve to the lowest version it can find. If the version the build uses normally gets de-listed your deterministic build is out the window.

If you are going to use dependencies from nuget.org or any other third party site, you cannot get a deterministic build every time, even with nuget restore working as it does now.

When you depend on a third party for some of your libraries, you cannot determine what your build will look like every time, no.

The alternative is to host any libraries you use locally so you have complete control over their list status.

I think we both agree packages.config with fixed version numbers is far too labor intensive.

Project.json is far more flexible than packages.config. Packages.config is even more deprecated and out of date than project.json.

We still use project.json where I work along with Nuget 3.5 since Nuget 4 does not offer any more flexibility with regards to package resolution options.

When Microsoft smartens up and lets go of its monopoly on package resolution strategy logic we will adopt a later version.

As it stands now Nuget 4 does not offer anything that Nuget 3.5 doesn't already do for us. Locked: true does work for us with Nuget 3.5. Though we don't want fixed version numbers. We can update a library quite regularly. We need flexibility here more than we need predictability.

We use TFS automated builds with a fully featured Artifact Server here as well. If we need any build of any configuration we can pull that from TFS in just seconds. Flexible builds are a more pressing need for my company than deterministic builds.

The lower version is no guarantee of being safer. If that were the case we would all still be running Windows 95.

davidfowl commented 7 years ago

project.json will resolve to the lowest version it can find. If the version the build uses normally gets de-listed your deterministic build is out the window.

  1. It's super uncommon to de-list things. That said, I'm not even sure restore respects this.
  2. If you delete things, your build is broken anyways.

If you are going to use dependencies from nuget.org or any other third party site, you cannot get a deterministic build every time, even with nuget restore working as it does now.

You can 98% of the time (I made that up), because de-listing on nuget.org is rare and deleting is impossible. Now, if you're mirroring your dependencies to a myget feed or controlled feed, it's even better since you'd rarely delete or de-list anything that was being depended on.

However, even though de-listing/deleting is rare, if we support resolving the latest dependency version without a sane story to lock it, we're turning the 2% case into the 98% case.

When you depend on a third party for some of your libraries, you cannot determine what your build will look like every time, no.

Not following this argument.

Project.json is far more flexible than packages.config. Packages.config is even more deprecated and out of date than project.json.

We still use project.json where I work along with Nuget 3.5 since Nuget 4 does not offer any more flexibility with regards to package resolution options.

Hopefully you'll switch to PackageReference since that's where all of our energy is at the moment. Good to know locked: true still works in the version of the client that you're using.

All of that said I agree with you, I just think when we do that feature it'll be accompanied with a lock file so that things remain deterministic. That shouldn't be too hard to accomplish regardless.

Wil73 commented 7 years ago

I would have to double check for Nuget 4, but you can get a deterministic build using highest version with a version number limit. I don't think Nuget 4 will allow this scenario out of the box either. It's not 100% predictable, but the point is we cannot even configure that.

If you want to lock, then use VS 2015 and Nuget 3.5. You just have to check in your project.lock.json. That works just fine. I haven't investigated locking for Nuget 4 and VS 2017, but it sounds like Microsoft dropped the ball yet again.

I like the high level design of NuGet, but the implementation is dreadful on many levels. MS has had over five years now to get NuGet to a sensible level and it's still not there.

If locks are out the window with Nuget 4, and Nuget 4 STILL won't offer a switch for businesses to configure their own business policies how is Nuget 4 any better?

Moving references to the project file accomplishes what? That work effort could have been put into offering what businesses need: configurable restore policies and restore locks. I don't recall seeing a single post of anyone complaining that the project.json file format urgently needs a renovation.

There are hundreds (I haven't counted, but I stopped a long time ago) of people complaining about the inflexible lowest is best logic hard coded into Nuget right now.

On the face of it, it looks like MS wants everyone to use the latest version of their software, but the oldest version of everyone else's software. I find it ironic how aggressively Microsoft endorses adoption of new releases of their products, but Nuget logic has a "lowest is safer" constraint hard coded into the binary.

If lowest was safer we would all still be running Windows 95.

davidfowl commented 7 years ago

I would have to double check for Nuget 4, but you can get a deterministic build using highest version with a version number limit. I don't think Nuget 4 will allow this scenario out of the box either. It's not 100% predictable, but the point is we cannot even configure that.

Not only is it not predictable, it's probably more broken (especially for dependencies out of your control).

If you want to lock, then use VS 2015 and Nuget 3.5. You just have to check in your project.lock.json. That works just fine.

VS 2017 with PackageReference is the future of nuget and it needs a proper lock file story.

Wil73 commented 7 years ago

How is getting the latest version "out of control?"

Our approach at our company is to use a fixed version number for third party dependencies and use the highest version for in house packages. We don't update third party dependencies nearly as often as we update internal packages. We have a very large in house class library framework that relies on nuget to be sensible: right now it's not. We can make nuget sensible, but we have to hack project.json to make that happen.

This is not out of control. Our dependencies are quite stable. We have a solid change management process and automated testing. Adopting a new version when that version gets released does not make a process out of control. It just makes your company agile and aggressive in adopting the new.

This goes back to my original point: this is a controversial debate left to the business to decide, not a tool vendor.

I cannot say for any given company if lower or higher version resolution strategy is better. I think it depends on the company and how aggressively they want to adopt new tools. Again this is such a controversial topic it is absurd to have a tool vendor make these decisions for everyone and hard code those policies into their binaries.

That being said, just offer a darn switch. That ends all debate on the matter. lol

davidfowl commented 7 years ago

How is getting the latest version "out of control?"

You misread, I said "dependencies that are out of your control".

We don't update third party dependencies nearly as often as we update internal packages.

Sure. How do you restrict the resolution strategy to only update certain dependencies?

This goes back to my original point: this is a controversial debate left to the business to decide, not a tool vendor.

I think I agreed with you each time you made this comment but my position hasn't changed when it comes to needing a lock file with this feature.

That being said, just offer a darn switch. That ends all debate on the matter. lol

I have little say in what the nuget team does here so I can't say how or when this would ever happen. I'm guessing it won't work on VS 2015 with project.json though (I could be wrong). I'm also guessing this mode needs to be source controllable (probably in nuget.config) since you'd want command line restore and VS restore to do the same thing.

Wil73 commented 7 years ago

You can use project.json syntax to restrict to specific versions or simply de-list other packages so you only have one version in your package line. It's a hack, but it works.

Yes, you would want a switch or option for both NPM and the command line restore of course.

anangaur commented 7 years ago

That's a long thread over the weekend. :)
From what I have observed (here - this thread as well as talking to other folks), the requirements seem to be the following:

  1. Allow users to decide the dependency resolution strategy - direct as well as transitive
  2. Allow users to lock down on the dependency versions in the transitive world - to enable repeatable build

Both #1 and #2 seem to be lacking with PackageReference. #2 already seems to work with the "lowest is best" strategy, but will falter once #1 comes into the fore. So I think NuGet will have to solve both together, at least for PackageReference.

giggio commented 7 years ago

What is the timeline to resolve this? This is very much needed. Requests for this go back to 2015 (see #4789 and https://github.com/aspnet/dnx/issues/2657).

Wil73 commented 7 years ago

I think they just closed it and ignored me. And yes, MANY people have raised this as a SERIOUS flaw in the Nuget system. It's been over five years and it still behaves the same way with no flexibility.

I have seen MANY tickets opened for this and they just close them and walk away.

davidfowl commented 7 years ago

I think they just closed it and ignored me.

Closed what? The issue is still open. Like I said before, a lock file is needed before any realistic progress can be made here.

And yes, MANY people have raised this as a SERIOUS flaw in the Nuget system. It's been over five years and it still behaves the same way with no flexibility.

It has been raised many times but honestly when you talk through the implications, people ignore the consequences that don't affect their immediate needs. We need to think about the larger ecosystem impact when we design features like this.

As a super basic simple example, any package that transitively references Newtonsoft.Json say 6.0.8, will now start pulling in 10.0.1. Not only that, but you end up with a completely jagged untested dependency graph by default. Here's an example of that going bad:

https://orientman.wordpress.com/2017/08/22/how-_not_-to-upgrade-to-asp-net-core-2-0-just-yet-with-paket/

I also think @anangaur wrote a spec for a proposed solution to the lock file but I can't find it on the wiki.

anangaur commented 7 years ago

@Wil73 We closed many related issues that were essentially the same ask in this issue. This issue is still open :)

Here is the "Enable repeatable builds via lock file" spec.

Wil73 commented 7 years ago

I have yet to grasp what this has to do with adding a switch of some sort to nuget RESTORE to ensure that packages resolve by default to the HIGHEST version available?

Wil73 commented 7 years ago

I fail to see how on earth a lock file has any relevance at all here either.

anangaur commented 7 years ago

We already discussed this in the earlier comments. We would need to handle both scenarios together.

giggio commented 7 years ago

Just use a default that maintains the current behavior, and allow for different expressions of version ranges. Npm does this brilliantly with the ^ and ~ characters, as I pointed out on #4789. This way people that don't want to change the current behavior just keep doing what they do, and people who do want it use the new characters. Don't get me wrong, I love the idea of a lockfile, but these are orthogonal needs and can be worked in parallel.
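For readers who don't know the npm operators referred to above: in a package.json, `^1.2.3` means >=1.2.3 <2.0.0 (float minor and patch) and `~1.2.3` means >=1.2.3 <1.3.0 (float patch only), so each project opts into its own amount of drift. Package names below are just examples:

```json
{
  "dependencies": {
    "express": "^4.16.3",
    "lodash": "~4.17.10"
  }
}
```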

Wil73 commented 7 years ago

Sorry, but that's just silly. Every time you come up with a new version you then have to edit all the packages that use the new version to increase the darn range.

We have a * as an option right now, but that only works for direct dependencies... child dependencies still resolve to LOWEST version despite the * in the grandparent library. Ranges will not resolve this problem; ranges will just create a maintenance nightmare.

anangaur commented 7 years ago

@giggio Enabling an option for users to choose the resolution strategy is good but aggravates the "repeatable build" scenario - something we do not want to do. Hence we would like to solve both together (rather, the 'lock file' solution followed by 'user-defined resolution strategy' <- this issue).

Wil73 commented 7 years ago

Yes, I understand you want a build that's 100% predictable. I do get it. I just don't think that's as urgent as a flexible business strategy. This problem has been in the air for years. The workarounds are an administration nightmare.

If they can be resolved separately and quickly please explore that as an option first.

davidfowl commented 7 years ago

Yes, I understand you want a build that's 100% predictable.

Not 100% - not any percent, really. If you float every dependency every time restore is run, it's extremely unpredictable unless you control all of the feeds.

We have a * as an option right now, but that only works for direct dependencies... child dependencies still resolve to LOWEST version despite the * in the grandparent library. Ranges will not resolve this problem; ranges will just create a maintenance nightmare.

@emgarten do we support * in nuspec dependencies? I thought that wasn't supported today. @Wil73 did you manually enter * in your nuspec and run nuget pack?

Wil73 commented 7 years ago

We use the * in our project.json files and that works fine. BUT we have to include EVERY dependency in the entire tree to ensure we get the most recent version. That is the most common work around people use right now. It's ugly and a maintenance nightmare, particularly for complex dependency trees.
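The workaround described above looks roughly like this (package names are invented placeholders): every package in the tree, transitive or not, is re-declared at top level with a floating version so nothing silently resolves to the oldest release.

```json
{
  "dependencies": {
    "Contoso.Core": "3.1.*",
    "Contoso.Data": "3.1.*",
    "Contoso.Logging": "3.1.*"
  }
}
```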

We do not include any version in our nuspec files. It doesn't matter so long as the application exe builds with a * for every nupkg it's after.

It's VERY easy to miss one since the application will build without it (silently pulling in OLDEST instead).

It's ugly and prone to error, but it works... for now. Our program manager hates it.

emgarten commented 7 years ago

@emgarten do we support * in nuspec dependencies? I thought that wasn't supported today.

Floating versions in nuspecs are ignored.

davidfowl commented 7 years ago

@emgarten good 😄 .

It's VERY easy to miss one since the application will build without it (silently pulling in OLDEST instead).

That is the most common work around people use right now. It's ugly and a maintenance nightmare, particularly for complex dependency trees.

I really honestly don't understand why people would want the latest version of every dependency without the ability to build with those resolved versions more than once. The only reason this worked in packages.config is because it resolved dependencies once and then gave you a set of fixed assemblies and dependencies that were checked into source control. Without that, I don't see how this feature is usable.

Wil73 commented 7 years ago

Follow me on this David.. it does NOT matter. It does not matter if you can see how this would be usable. This is not the province of engineers to see that. They build what business asks for, that's it. This is a stability vs featureset throughput decision. Businesses make these decisions, not engineers.

This is CLEARLY a business need. I have read literally over a HUNDRED posts of people asking for this. So you build it. You don't debate the sense of building to latest version. Some businesses want that aggressive strategy, some want build to oldest. Nuget should let the business decide for themselves.

Allow your users to decide for themselves how aggressively they want to adopt new versions. You build in the flexibility for the business to define this. Engineers have no business making these sorts of decisions unilaterally for every one of their consumers. This IS a business decision.

I will however leave you with this thought: if the oldest version were always the best choice, we would all still be running Windows 95...

giggio commented 7 years ago

I think you are being paternalistic. You should not decide how I build software. It is my decision, not yours. What you are saying is "this is how I do it, and we should all do it this way". Please don't do that. I do agree that in most projects people do not want that, as it may break things. But in a cutting-edge project maybe I want to be on the latest of everything, or the latest minor, or patch. Maybe that is ok, because the project is just starting. Or maybe I want the latest patch version because it fixes bugs. I don't know. But please don't limit me. I have used ^ in node without a lockfile, and very rarely did the build break. When the project reached a more stable version we added a lockfile. For some dependencies we removed the ^. It was not a nightmare. And the node ecosystem changes much more rapidly than the .NET ecosystem. Packages are released much more often.

giggio commented 7 years ago

Also, it would be very important to have an option to exclude prerelease versions.

Wil73 commented 7 years ago

I am not saying that at all. Are you guys not reading what I write? From day 1 I have argued that Nuget should be flexible enough to support each individual business strategy. I am most certainly not being paternalistic here. The problem is Nuget assumes that resolving to oldest version is best.

This is certainly not the case. Nor is resolving to latest version always best. What I am and have been saying is far more subtle than this.

We have David above arguing why oldest is best. I really think this is a moot point:

Each business must decide how conservative or aggressive their package adoption strategy ought to be.

Unfortunately Nuget does not offer a flexible resolution strategy here. Under the covers Nuget assumes oldest is best for child dependencies. This is a big problem for some companies who do not want oldest is best.

I am most certainly not saying we should do things any one way. I am the one here arguing for a more flexible package adoption strategy. A strategy that offers a package resolution strategy switch/toggle that the business/engineers configure for each project themselves. Rather than have all this AI built under the covers that decides unilaterally how aggressive or conservative a business's package adoption strategy is we should be able to configure this ourselves. How on earth can anyone unilaterally know how aggressive or conservative any given company's package adoption strategy ought to be?

Add a switch of some sort that allows Nuget to restore and resolve packages the way each business needs. A customizable and configurable Nuget AI would be a huge gain. Right now Nuget Resolution strategy is a black box and not configurable. Nuget decides and we just have to live with it.

This has been a problem for six years now. It's time to fix it.

gyrolc commented 6 years ago

Is there a timeline for this resolution? We have the exact same business need to always get the highest available version of our internal packages within a specified range.

Wil73 commented 6 years ago

My biggest concern is that this will get dealt with, but the implementation will be inadequate. I don't understand at all why this takes so long. Just add a switch to nuget.exe to pass into the business logic. How can this be that complicated?

I must be missing something.

anangaur commented 6 years ago

@Wil73 We typically publish the spec and announce it at the NuGet/Announcements repo for review. We respond to feedback and once reviewed, we begin the implementation. So please rest assured that we will take in your feedback :) Please subscribe to NuGet/Announcements to not miss out on the notifications.

@gyrolc This is on our backlog and we plan to work on this once we implement the features outlined on our Roadmap blog post. You can also keep track of the upcoming features by subscribing to the NuGet/Announcements repo.

davidfowl commented 6 years ago

My biggest concern is that this will get dealt with, but the implementation will be inadequate. I don't understand at all why this takes so long. Just add a switch to nuget.exe to pass into the business logic. How can this be that complicated?

Why not send a pull request with the change then?

Wil73 commented 6 years ago

That's a great idea David. Why don't you send a pull request to implement the changes you are asking for? I look forward to using them. Thank you!

springy76 commented 6 years ago

Regarding "old is safe":

Everyone in the graph gets what they compiled against

Sorry this is not true: https://github.com/domaindrivendev/Swashbuckle.AspNetCore/issues/438#issuecomment-360066292

In that case it might be nice that Swashbuckle got its 1.0.4 StaticFiles it had been compiled against. But that StaticFiles gets NOTHING it was compiled against, since 99.999% of the project was already at aspnetcore 2.0 at the time Swashbuckle got added and transitively pulled in StaticFiles.

davidfowl commented 6 years ago

@springy76 the assessment of what caused that issue is incorrect. I clarified in the comment.

springy76 commented 6 years ago

The biggest problem with this: No one tells you anything is wrong. You get a MissingMethodException sometime at runtime, maybe many hours or days after starting the app.

While transitive references may not be the direct cause of the problem, the missing tooling for showing the availability of updates of transitive references is. With packages.config this would not have happened.

Or may I remind you of 2018-03-13: some smart guy decided to delist all 2.0.0/2.0.1 aspnetcore NuGet packages due to CVE-2018-0787, which just blew up any package restore. Our build server could not build the exact same sources it had built successfully only some hours ago, without anything having changed (but the NuGet servers did). https://github.com/aspnet/Home/issues/2954#issuecomment-373064006

markusschaber commented 6 years ago

@springy76 That's precisely why one should never ever let his build servers build against a public repo.

Use a local repository for the CI systems, and restrict the build servers from accessing anything non-local. Apart from reproducible builds, this also helps with license compliance and security assessments, as you only have "known" packages with "known" licenses.

abatishchev commented 6 years ago

you only have "known" packages with "known" licenses.

How so? Manual control of each package? You can do that either way. And to automate it you need tooling.

one should never ever let his build servers build against a public repo.

Oh, please. This is a terrible burden on developers. Plus you again need tooling to sync local with public.

The only worse advice is to check in packages under source control for the same, illusory reasons.

markusschaber commented 6 years ago

We need local-only builds anyway - OSS license compliance is only one reason; reproducible builds and security assessments (mentioned above) are others.

Also, for OSS license compliance, every package has to be audited as to whether the license is clearly declared (surprisingly many packages come without a license, or with contradicting or otherwise unclear legalese), and whether it's allowed in the context the package is used in.

But some (exotic) licenses which restrict usage are just not acceptable for us or our customers, others like GPL are only acceptable in specific contexts, and others need specific caution, for example the "advertising clause" in the OpenSSL license results in some guidelines for our documentation and marketing teams.

Yes, that whole process is a high burden for developers in .NET core and even more in angular/npm land, but it's the only way to prevent ticking bombs for us and our customers. And, yes, we're using lots of tooling to assist developers.

NightWatchman commented 6 years ago

Is there a timeline for this resolution? We have the exact same business need to always get the highest available version of our internal packages within a specified range.

I agree with this and hope to see a resolution soon for making the DependencyVersion property, which already works for packages.config projects, work for projects that use the PackageReference method as well.

I see the value of "lowest is safest" when the Version is specified with something like 4.3.2 which, by default, resolves to >=4.3.2, but that argument goes right out the window if the user is specifying the version with something like [3.0.22,3.0.24], or [1.5.2,3). Clearly anybody who's bothering to go through the effort of specifying a version range that specific did so because they believe that the project works with any of the specified versions. Since it works with any of the specified versions, restricting it to the lowest possible version in the specification without any way of modifying that behavior is just really annoying.
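For reference, NuGet's interval notation in a PackageReference project works as quoted above (square brackets inclusive, parentheses exclusive; package names here are placeholders):

```xml
<ItemGroup>
  <!-- plain version means >= 4.3.2; lowest applicable wins -->
  <PackageReference Include="ExampleA" Version="4.3.2" />
  <!-- inclusive range: 3.0.22 up to and including 3.0.24 -->
  <PackageReference Include="ExampleB" Version="[3.0.22,3.0.24]" />
  <!-- 1.5.2 inclusive up to, but excluding, 3.0 -->
  <PackageReference Include="ExampleC" Version="[1.5.2,3)" />
</ItemGroup>
```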

giggio commented 6 years ago

Another 6 months went by and we got no response. This is a priority:1 issue that has been on the backlog for 3 years. Shouldn't it at least be planned for a release?

mikenunan commented 6 years ago

I'd like to add my voice to this, having just landed here after some frustrating time spent trying to specify a "latest" dependency in a nuspec. There absolutely are sane use cases for this. I can't quite believe it's not possible to simply use * to pull in a floating latest version, or 1.* or 1.3.* etc. to restrict, or to use ranges somehow.

In my case I just need "latest" because the dep contains build-time tooling (that does some validation work) that I precisely do want moving forward as new versions are released. Now this means I'm going to have to update the dependent components every time that tooling component changes.

michaelTBF commented 6 years ago

We want latest - if I specify 1.2.* I will always want the updates made to 1.2.* - nothing else makes sense.

Microsoft pushes Pipelines and CI/CD really hard - but the basics behind the scenes are still way old.

I know you can make this better.

GirishVilli commented 5 years ago

Hi, any update on this issue? We need a solution to control dependency resolution behavior for transitive dependencies.

TysonMN commented 5 years ago

There was much talk above about first needing support for lock files when specifying packages (via PackageReference in project files). This feature was released on November 13, 2018 with Visual Studio 2017 version 15.9.0. Here is a direct link to the feature documentation.
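Per the documentation linked above, opting in is a one-line project property; the generated packages.lock.json is committed, and CI restores with `dotnet restore --locked-mode` (or `nuget.exe restore -LockedMode`) so the build fails instead of silently drifting:

```xml
<PropertyGroup>
  <RestorePackagesWithLockFile>true</RestorePackagesWithLockFile>
</PropertyGroup>
```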

anangaur commented 5 years ago

This feature is on our backlog but not one of the top items right now, and yes, it's unblocked now since we implemented the lock file feature.

@bender2k14, @GirishVilli can you elaborate a bit on your scenario, even if it's a reiteration of the reasons discussed above? Feel free to state it here or reach out to me via email.