
[PackageReference] Centrally manage NuGet package versions for a solution or a repo #6764

Closed anangaur closed 2 years ago

anangaur commented 6 years ago

PackageReference enhancements are summarized as part of this Epic issue: https://github.com/NuGet/Home/issues/6763

jonathanperis commented 3 years ago

This is a nice feature. Please move forward with this idea. It will fit well with the new grouping conventions in .NET 6 (C# 10), like global usings in a single file.

I found out about it through an internet post that mentioned this way of doing it, and I thought it was already a feature in production. It's a very smart and simple way to manage the dependencies of a project.

guylevy commented 3 years ago

This feature has been in development for quite a long time, and it now looks like no work is happening on it, so the question is: are there any plans for releasing it? Is this approved, or could it never leave the preview stage?

@rconard any update please on the release plan / date?

vivainio commented 3 years ago

This is a nice feature. Please move forward with this idea. It will fit well with the new grouping conventions in .NET 6 (C# 10), like global usings in a single file.

I found out about it through an internet post that mentioned this way of doing it, and I thought it was already a feature in production. It's a very smart and simple way to manage the dependencies of a project.

It's already in production, but not documented as such. So it should be marked as shipped

zivkan commented 3 years ago

It's already in production, but not documented as such. So it should be marked as shipped

Centrally managed package versions is not in production.

The prototype/preview implementation required changes to core NuGet assemblies that we could not ship out-of-band, hence the preview feature shipped in production tooling. But the feature is still preview. Unfortunately, we also had other urgent work come up, so progress on this feature was deprioritized.

Also, there was a bug where our "this feature is preview" message was hidden from default verbosity in both Visual Studio and the dotnet cli. I added the preview warning to this wiki page, which many people seem to find, and it's still there on purpose.

I'm sorry we weren't clearer about this feature being preview, and still being in preview, but it is still in preview. Many features (like Visual Studio install/upgrade) have not been implemented, and I can't even guarantee that the file syntax won't change before it is production ready. Hopefully not, but I can't promise.

ghost commented 3 years ago

We already saw cases where lockfiles failed to detect downloading of the wrong package version, only because a closer dependency ended up resolving a closer package version. Packages which are restored but not resolved don't end up in the log file

@aaronla-ms I'd love to learn more about that in a separate issue.

Apologies @nkolev92 , I'm not sure I got back to you on this.

It's pretty easy to repro. Set up the following dependency graph, where N/1.0 and N/2.0 are two different versions of the same NuGet package, clean your local NuGet cache, and restore. You'll observe that both N/1.0 and N/2.0 are populated into the package cache, but N/1.0 doesn't appear in the lock file at all. This causes false positives in Component Governance for us, as N/1.0 could be some unpatched package version. X, Y, and N are all NuGet packages.

A.csproj  --> X/1.0 --> N/1.0
        \---> Y/1.0 --> N/2.0

The fix we've been needing to employ is adding a direct reference from A.csproj to N/2.0. That resolves the issue, and N/1.0 is no longer downloaded into the package cache.
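For reference, a minimal sketch of that workaround, using the placeholder package names X, Y, and N from the graph above (the target framework is only illustrative):

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netcoreapp3.1</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="X" Version="1.0" />
    <PackageReference Include="Y" Version="1.0" />
    <!-- Direct reference to the transitive package, so only N/2.0 is resolved and downloaded -->
    <PackageReference Include="N" Version="2.0" />
  </ItemGroup>
</Project>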

vivainio commented 3 years ago

@zivkan consider releasing it without VS integration, it may be less important than you think. People use VSCode, Rider etc. People that need central package versioning were probably unable to use VS nuget support anyway (we are currently using Paket).

Hence, the "fundamentals" can be considered shipped and how VS deals with it is VS teams problem.

zivkan commented 3 years ago

My point is that even the "fundamentals" are not finalized. I really hope we won't change the MSBuild item names or file names we've used so far, but that depends on further customer development and customer interviews to understand why the customers we want to use this feature are not yet using or evaluating it.

When the first previews shipped, we only had feedback from two open-source library authors, who had very small solution files, and they didn't like transitive package pinning, especially how this affected their own package's dependencies (NuGet being conservative about package dependencies & compatibility). At the other extreme, when we talked to certain "internal customers", people whose full time jobs were managing the CI and build systems for large products where there are hundreds if not thousands of developers doing product changes (and the product developers do very little "engineering system" work), they did not use or even evaluate the feature. I wasn't involved in those discussions, so I don't know why they wouldn't start using the feature. Maybe their concern was just preview quality feature, and "fundamentals" would be acceptable as-is. Maybe they just didn't have time. Or maybe the solution we designed will not solve the problems they are experiencing.

But until we're confident that the feature will work for a large variety of customers, not just the "simple use cases", there is a chance that we will need to break compatibility with the preview functionality that already shipped. I really hope not, but until more customer development is done we can't be confident that the preview that is already shipped is what we're going to support long term.

robmen commented 3 years ago

@zivkan If you want a more complex use case, feel free to reach out to me to discuss our use of central NuGet package version management in the WiX Toolset. Even in preview, central package version management is integral to maintaining sanity across all of our projects.

jimmylewis commented 3 years ago

When the first previews shipped, we only had feedback from two open-source library authors,...

I think this is the vocal feedback of a few people who didn't like the behavior... and at the time CPVM (which is highly desired by all, I think that's agreeable) was integrally coupled with transitive pinning (which obviously is contentious). I'm not sure how many people give feedback to say "yes, this works the way I want". Also, it was only available (AFAIK) for a relatively short window; I know in my case, I had only found out about it and adopted it about two months after the PR to remove the feature was submitted (what luck 😅 the very behavior that I wanted disappeared a few weeks later). Then when the feature was removed, a bunch of people asked to have it back as an optional feature, which had also been repeatedly proposed before, as pinning has always been a sensitive topic (I guess this is the other vocal cohort, who were previously happy and silent).

Or maybe the solution we designed will not solve the problems they are experiencing.

At this point, is it possibly worth relaxing the conservative stance and experimenting? If the feature is opt in and explicitly experimental (feel free to call it "EnableTransitivePinningWhichMayBreakNextWeek"), then at least we a) have an option for people who do or don't want it, and b) can get more experience with it to give better feedback about what works well or doesn't. This feels like something that is in high demand, but the impact of the feature is nuanced enough that I don't think a perfect solution will be drafted in committee.

zivkan commented 3 years ago

At this time, our limitation is resources (people to work on it) and prioritization against other work, not a lack of desire to complete this feature.

mikamins commented 3 years ago

At this point, is it possibly worth relaxing the conservative stance and experimenting? If the feature is opt in and explicitly experimental (feel free to call it "EnableTransitivePinningWhichMayBreakNextWeek"), then at least we a) have an option for people who do or don't want it, and b) can get more experience with it to give better feedback about what works well or doesn't. This feels like something that is in high demand, but the impact of the feature is nuanced enough that I don't think a perfect solution will be drafted in committee.

I completely agree with this sentiment. Opt-in pinning would be very useful for the Azure codebase, where we have to rely on ugly workarounds to avoid packages with known security vulnerabilities.

tebeco commented 3 years ago

Opt-in / opt-out flags are generally a very good thing, as they allow consumers to change that behavior on demand without breaking changes from the tooling.

This applies to almost the entire ecosystem of NuGet tooling: lock files, resolution, pinning, and a few others.

I'm not saying "just do it", because it's never "just". I'm saying that instead of taking a position that would have to be supported for the next 10 years to avoid breaking changes, some of these decisions ought to have a default but remain open to the consumer.

swythan commented 3 years ago

BTW: The functionality to make the interaction between CPVM & transitive dependencies optional is waiting for a review here: https://github.com/NuGet/NuGet.Client/pull/4025

The author of that PR @marcin-krystianc is in my company's OSS group (I believe) and could probably help with feedback about how we use it in our large solutions. I'm not personally involved, but I know we've built a lot of infrastructure around trying to keep package usages consistent across projects and repos.

kant2002 commented 3 years ago

At this time, our limitation is resources (people to work on it) and prioritization against other work, not a lack of desire to complete this feature.

Is it possible for the NuGet team to do only the high-level planning of the issue (what path should be taken) and delegate the implementation to the community? This would still require a commitment to review PRs in that area, but given that this is a highly anticipated feature, it could be a good way to increase the number of active contributors to the project.

Even if nobody steps in, it would highlight to the community the exact hard problems that are preventing you from "simply implementing" CPVM.

vivainio commented 3 years ago

@zivkan Looks like you have been asking the wrong people to test the feature, and hence this is stuck in limbo.

The fundamentals I consider to be:

Just keep that working, and announce publicly that it's shipped. There is no further work to do, and you can change details around that, e.g. transitive dependency pinning. People are going to assume that looking up the version number from the centralized file works exactly the same as specifying it in their PackageReference. There will be lots of actionable community feedback on further features once this is done.

markphillips100 commented 3 years ago

Is it possible to specify a property reference for the Version property on a PackageVersion? I'm specifically thinking of where I might want to supply a build property to conditionally change all referenced versions of a particular SDK's numerous packages, i.e. apply a condition in one place.

Directory.Build.props:

<Project>
  <PropertyGroup>
    <MySdkVersion Condition="'$(MySdkVersion)' == ''">1.0.0</MySdkVersion>
  </PropertyGroup>
</Project>

Directory.Packages.props:

...
    <PackageVersion Include="MySdk.MyThing" Version="$(MySdkVersion)" />
...
martincostello commented 3 years ago

Yes it is. It even works with dependabot.

markphillips100 commented 3 years ago

Hmm I must be doing something wrong. Getting many, many NU1604 restore errors. Need to recheck.

martincostello commented 3 years ago

Here's an example of my usage, if it helps you: https://github.com/martincostello/alexa-london-travel-site/blob/4b3ec8442307e1f5f6eef0f9d910908b780916f2/Directory.Packages.props#L3-L11

markphillips100 commented 3 years ago

Thanks for that @martincostello . Helped to confirm I was on the right track.

I figured out my issue albeit I'm a little confused by the outcome. I'll explain just in case anyone else has the problem.

I have 2 folders, src and test, each having their own Directory.Build.props so they can add their specific PackageReference includes. I have a single Directory.Packages.props file in the parent folder.

I got lots of NU1604 errors when I added a third Directory.Build.props file at the root, beside the Directory.Packages.props file. In that file I had added the ManagePackageVersionsCentrally property and removed it from its original place in each of the subfolder Directory.Build.props files. This had the net effect of disabling central version management, giving all my package references null versions, hence the lower-bound (NU1604) errors.

So it seems that if I use nested Directory.Build.props files to include package references, I must specify the central-management property in the same file too. Not quite what I was expecting.

martincostello commented 3 years ago

I think that's because Directory.Build.props files override, rather than being additive, IIRC.

KalleOlaviNiemitalo commented 3 years ago

You can make the Directory.Build.props file in the inner directory import the file in the outer directory. The [MSBuild]::GetPathOfFileAbove property function is handy for this.
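For anyone hitting the same issue, here's a minimal sketch of that import pattern for an inner-directory Directory.Build.props (this follows the approach documented for MSBuild; adapt the paths to your own layout):

<Project>
  <!-- Chain to the Directory.Build.props in the nearest parent directory, if there is one -->
  <Import Project="$([MSBuild]::GetPathOfFileAbove('Directory.Build.props', '$(MSBuildThisFileDirectory)../'))"
          Condition="'$([MSBuild]::GetPathOfFileAbove('Directory.Build.props', '$(MSBuildThisFileDirectory)../'))' != ''" />

  <ItemGroup>
    <!-- Folder-specific PackageReference includes go here; ManagePackageVersionsCentrally can stay in the root file -->
  </ItemGroup>
</Project>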

tg73 commented 3 years ago

My understanding (being quite experienced with customizing MSBuild) is as follows:

Directory.Packages.props is just another regular MSBuild targets file that gets imported (similar to C/C++ #include) at a specific point by some other targets file. In this case, it's imported by NuGet.props, which on my machine lives here: C:\Program Files (x86)\Microsoft Visual Studio\2019\Professional\Common7\IDE\CommonExtensions\Microsoft\NuGet\NuGet.props. The actual importing line is:

<Import Project="$(DirectoryPackagesPropsPath)" Condition="'$(ImportDirectoryPackagesProps)' == 'true' and '$(DirectoryPackagesPropsPath)' != '' and Exists('$(DirectoryPackagesPropsPath)')"/>

My point is to show that it's just an imported props file; it's not subject to any special treatment or parsing. You can put any valid MSBuild 'code' in there. It follows all the usual rules for MSBuild items and properties, which are well documented here.

As regards using multiple Directory.Build.props/targets in a directory hierarchy, the docs explain how to do this.

If you want to better understand the order of imports, what gets imported from where and why, and basically shine a light on almost everything to do with how a build is working, the MSBuild binlog viewer is your friend. And consider yourself lucky that you didn't have to do any serious MSBuild development before the binlog format and viewer existed - it was like programming without a debugger.

jimmylewis commented 3 years ago

Just a fun observation:

Directory.Packages.props is just another regular MSBuild targets file that gets imported... You can put any valid MSBuild 'code' in there.

This also means the properties for opting in to CPVM (i.e. <ManagePackageVersionsCentrally>) can be put in this file as well, instead of splitting the opt-in across your normal Directory.Build.props and the CPVM PackageVersion entries in Directory.Packages.props.

And, because it's all MSBuild, you can override the PackageVersions specified in that file as long as the override is specified after that import. So if you have a single project that needs a different version, you can just put the <PackageVersion> entry in the .csproj, and it will override the central one (obviously, this is not a best practice to do regularly... but sometimes you need to).
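As a rough sketch of both points (the package names, versions, and target framework are just examples, and the project-local override is expressed here as an Update item, relying on the evaluation order described above):

Directory.Packages.props:

<Project>
  <PropertyGroup>
    <!-- Opt in to central package version management from the same file that holds the versions -->
    <ManagePackageVersionsCentrally>true</ManagePackageVersionsCentrally>
  </PropertyGroup>
  <ItemGroup>
    <PackageVersion Include="Newtonsoft.Json" Version="13.0.1" />
  </ItemGroup>
</Project>

A single project that needs a different version:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netcoreapp3.1</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Newtonsoft.Json" />
    <!-- Declared after Directory.Packages.props has been imported, so it changes the central version for this project only -->
    <PackageVersion Update="Newtonsoft.Json" Version="12.0.3" />
  </ItemGroup>
</Project>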

stevenbrix commented 3 years ago

Is there a reason that the spec for this uses <PackageVersion Include="..." /> instead of <PackageReference Update="..." />? Version 2 of the CentralPackageVersions SDK moved away from the former to the latter.

nkolev92 commented 3 years ago

Decoupling package versions and explicit references was necessary to satisfy features such as transitive pinning.

avivanoff commented 3 years ago

The issue was opened more than three years ago, and it is still in preview. Any chance it will be released in the foreseeable future?

ghost commented 3 years ago

I hope so. We're getting flagged left and right for vulnerable transitive dependencies, and it's hard to manage our forced upgrades of them. Our current solution is a bit of a mess:

  <ItemGroup Condition="
       '@(PackageReference->AnyHaveMetadataValue('Identity', 'Microsoft.Data.Services.Client'))' == 'True'
    or '@(PackageReference->AnyHaveMetadataValue('Identity', 'Microsoft.Azure.DocumentDB.Core'))' == 'True'
    or $(TargetFramework.Contains('netcore'))
    ">
    <!--
      Microsoft.Data.Services.Client transitively brings in System.Net.Http/4.1.0, which needs upgrading

      Microsoft.Azure.DocumentDB.Core transitively brings in System.Net.Http/4.1.0, which needs upgrading

      Re: netcore in condition above: I haven't determined the exact cause, but it appears that 
      netcoreapp2.1 and netcoreapp3.1 projects often pull down System.Net.Http when restoring if one of 
      their transitive project references had a transitive reference to it, even if said project had a closer 
      reference that should have overridden the transitive one. More investigation is needed to determine the cause, 
      but it is outside of our current level of expertise. 
    -->
    <PackageReference Include="System.Net.Http" Version="4.3.4" />
  </ItemGroup>

We've got 5 entries like this and growing.

Stefan75 commented 3 years ago

Hello,

my team is very interested in this whole topic. I have been searching around this topic and I am now confused.

In a prototype I used the following solution: CentralPackageVersions

Which looks like:

Packages.props

<?xml version="1.0" encoding="utf-8"?>
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <PackageReference Update="AdaptiveCards" Version="1.2.4" />
    <PackageReference Update="DryIoc.dll" Version="4.2.3" />
    <PackageReference Update="Google.Protobuf" Version="3.12.3" />
    <PackageReference Update="Grpc" Version="2.30.0" />
    <PackageReference Update="Grpc.Core" Version="2.30.0" />
    <PackageReference Update="Grpc.Net.Client" Version="2.28.0" />
    <PackageReference Update="Grpc.Tools" Version="2.28.1">  
      <PrivateAssets>all</PrivateAssets>
      <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>    </PackageReference>

...

and a csproj file like

<Project Sdk="Microsoft.NET.Sdk">
  <Sdk Name="Microsoft.Build.CentralPackageVersions" Version="2.0.79" />

  <ItemGroup>
    <PackageReference Include="AdaptiveCards" />
    <PackageReference Include="DryIoc.dll" Version="4.2.3" />
...

The solution described in this issue looks different. (e.g. Directory.Packages.props)

There is also a wiki page: Centrally managing NuGet package versions. It seems that this wiki page is more aligned with this issue.

Therefore my questions:

Thanks for your support and with best regards, stefan

Stefan75 commented 3 years ago

Hello,

is there already a way to ensure that all packages are inserted into the Directory.Packages.props file? (also the transitive dependencies)

Our company has strict rules that we must follow for external dependencies. Every dependency must be clarified and well documented in every software release.

It is very important to know all used dependencies and their version.

With best regards, stefan

bddckr commented 3 years ago

@Stefan75 Microsoft.Build.CentralPackageVersions linked above offers enforcement: https://github.com/microsoft/MSBuildSdks/tree/main/src/CentralPackageVersions#enforcement

Not sure on the NuGet feature discussed here.

Stefan75 commented 3 years ago

@Stefan75 Microsoft.Build.CentralPackageVersions linked above offers enforcement: https://github.com/microsoft/MSBuildSdks/tree/main/src/CentralPackageVersions#enforcement

Not sure on the NuGet feature discussed here.

Both solutions provide enforcement in some way. My understanding is that both solutions share the idea that the centrally managed version cannot be changed locally in a project.

The question for me now is: Is it possible to enforce that the versions of all dependencies are defined centrally? (especially the transitive dependencies that are not part of any project file).

I found the following link; I'm not sure, but maybe a lock file is a solution to lock the versions that are not defined in the central file: enable-repeatable-package-restores-using-a-lock-file

Any experiences with it?

bddckr commented 3 years ago

Transitive package version management was a feature in NuGet, but has since been removed, see #10389. The MSBuild SDK doesn't offer this feature, it depends on NuGet itself: https://github.com/microsoft/MSBuildSdks/issues/283

So right now, as far as I know, there is no way to define transitive package dependency versions without manually adding them as direct versions (and then somehow checking whether you missed some via the lock file perhaps). That's obviously not a solution that works for anyone.

But with the lock file you're on the right track IMO. My team has successfully been using the following for over a year without issues:

  1. The MSBuild SDK with enforcement turned on, forcing us to define direct dependencies without versions in the .*proj files and the version numbers in a Packages.props.
  2. RestorePackagesWithLockFile set to true.
    • We only have this set to true for app projects, library projects don't have it turned on as we only care about the "executables" - they are the only thing we deploy. This cuts down the amount of lock files we need to care about.
  3. RestoreLockedMode set to true, but only in our CI.
    • With this we get to enforce that the right packages (by hash) but also the correct versions of all packages are used.
    • Since the lock file includes transitive references, this might be the missing piece for you.

With this in place the local dev experience just works: Everyone is updating the lock files as they add or change packages automatically. When it comes to PRs we get to see and ensure the package lock files and the Packages.props changes as we expect.

Note: Since there's no UI for this currently, we've been adding packages to the .*proj files and setting the version in Packages.props manually. On nuget.org we just switch the copy helper at the top to PackageReference and manually add it to the projects. Thanks to SDK-style projects being a thing nowadays, the IDE normally automatically reloads and restores the packages as needed once we save the .*proj file.
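As a rough sketch of items 2 and 3 in the list above (the property names are NuGet's, but the condition used to detect CI is just one possible approach and depends on your build environment setting ContinuousIntegrationBuild), the relevant part of an app project or Directory.Build.props could look like:

<Project>
  <PropertyGroup>
    <!-- Generate and use packages.lock.json for this project -->
    <RestorePackagesWithLockFile>true</RestorePackagesWithLockFile>
    <!-- Fail restore when the lock file is out of date, but only on CI machines -->
    <RestoreLockedMode Condition="'$(ContinuousIntegrationBuild)' == 'true'">true</RestoreLockedMode>
  </PropertyGroup>
</Project>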

ghost commented 3 years ago

@bddckr be aware, even with lock files, the msbuild sdk will download additional versions of some nuget packages without telling you and without appearing in the lock file. It's possible to consume such assets from your build without being flagged by the lock file, e.g. if you <Reference> the path directly. This can happen for indirectly referenced packages, if more than one version is indirectly referenced (e.g. if you refer to A/1.0 and B/2.0 in your project, and A/1.0 -> C/1.0, but B/2.0 -> C/1.1 -- both C/1.0 and C/1.1 will be downloaded in some cases)

To avoid this, you need to direct reference every transitive package.

If you really need to be sure about your packages, I would recommend splitting apart package download and package restore. Our own internal build system used to do this: first it would manually restore all packages using nuget.exe, which ignores transitive packages. Then the main build would run msbuild /restore using an empty feed list; that way, if a package wasn't already in the cache, the restore would fail. That would be my personal recommendation for now.
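One way to express the "empty feed list" for that second restore step is a dedicated nuget.config with all sources cleared (a sketch only; how you point the build at this config depends on your setup):

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- No feeds: restore can only succeed with packages already present in the local cache -->
    <clear />
  </packageSources>
</configuration>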

Stefan75 commented 3 years ago

@aaronla-ms is NuGet responsible for downloading additional versions of some NuGet packages with/for the MSBuild SDK?

hm.

Stimulated by the last posts, I ask myself the following questions:

  1. How can we create a reliable and trustful list of all consumed packages? (e.g. for an audit where we have to ensure that only legal packages are used)

  2. Is it possible to derive all licenses from this list to ensure that we meet all licensing obligations? (e.g. for an audit too)

  3. Do I have to treat some references differently from others (e.g. related to licensing)? For example, some packages are somehow "part" of .NET (and just distributed via NuGet) rather than being independent NuGet packages.

PS: Sorry, if I extend with this questions the content of this issue too much. But at least for me the central management of NuGet packages also has such implications (e.g. licensing)

tg73 commented 3 years ago

@Stefan75 The problem domain of package licensing and compliance can also be addressed at the nuget feed level. For example, Inedo ProGet has baked-in support for filtering external feeds by license. Managing your own feeds would also allow you to control precisely which packages may be consumed by your projects.

I'm not affiliated with ProGet in any way, it's just a product I use. There may well be other similar products.

Stefan75 commented 3 years ago

@tg73 Thanks for this comment. It seems that we are also planning to create an internal feed (AFAIK artifactory based) :)

By using our own feed and controlling its content, we can ensure that only desired licenses are used (globally). But as long as we use one feed across all versions of our product, we still have to create project-version-specific reports.

Example: Project V1 consumes ExtA V1 with some dependencies; Project V2 consumes ExtA V2 with different dependencies (and different licenses).

ghost commented 3 years ago

@Stefan75 I want to second @tg73 as well. Solving it in nuget/msbuild is just untenable at this point, but I forgot about solving it with a custom feed. If you configure nuget to use your custom feed and only your custom feed (and you disable the nuget fallback folder), then you're certain to only being using packages that have been published to your custom feed.

Just make sure your lab build cleans the nuget cache before each official build (nuget locals all -clear), so you don't have any stale packages from previous versions of the repo and/or feed.

tg73 commented 3 years ago

Further to @aaronla-ms - you can also perform project restore with arguments that disable cache use and specify the source to be used (see here for msbuild restore target, dotnet cli, nuget.exe). This works well unless you use custom MSBuild project SDKs which are resolved using the nuget resolver ('custom SDK' means your own nuget SDK packages referenced by <Sdk ...> elements in .csproj files, for example). In this scenario, the properties passed to nuget restore are not visible to the SDK resolver - it can only see ambient nuget.config files (https://github.com/NuGet/Home/issues/7855). I've worked around this by emitting an ambient nuget.config as part of the CI build process.
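For example, the ambient nuget.config emitted by the CI build could look roughly like this (the feed URL is a placeholder, not a real endpoint):

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- Only the internal feed is visible, to both restore and the MSBuild project SDK resolver -->
    <clear />
    <add key="internal" value="https://pkgs.example.com/nuget/v3/index.json" />
  </packageSources>
</configuration>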

handerss-spotfire commented 3 years ago

Is there any update from the NuGet team with regards to transitive dependencies? As this comment mentions, the current documentation is misleading when it says that this also resolves transitive dependencies. Considering there is an unmerged PR which solves this issue, and that the .NET 6 LTS release is imminent, it would be good to have it merged before then. This issue must surely disrupt many large projects?

jimmylewis commented 3 years ago

@handerss-tibco Take a look at https://github.com/NuGet/Home/issues/10389#issuecomment-939191384 where the team has started reaching out. This won't make it for .NET 6, but hopefully we'll see it in .NET 7.

batzen commented 3 years ago

@jimmylewis Why won't this make it for .NET 6? The feature was removed in a minor version of .NET 5, so I see no reason why it couldn't be added to .NET 6 in a minor version.

HeikeHofmann commented 3 years ago

The feature was removed with Visual Studio 2019 v16.9.0. That means, for example, that my company must stay on 16.8.7 and can't update to newer VS versions because of problems with transitive packages. .NET 7 will be far away then ...

vivainio commented 3 years ago

@HeikeHofmann what feature was removed? I think the feature being discussed here is still in the newest versions of Visual Studio.

batzen commented 3 years ago

@vivainio We are talking about transitive version pinning and that got removed.

HeikeHofmann commented 3 years ago

@handerss-tibco asked about this topic 23 hours ago. The feature: transitive package version pinning with Directory.Packages.props. If I define versions, they are now ignored for transitive packages, and we can't build stable environments anymore. (nuget and dotnet restore will download multiple versions of packages instead of the defined version; in most cases the defined version and the minimum version.) The support for transitive packages was changed with a new version of NuGet, which was then included with Visual Studio 2019 v16.9.0. And if I read this open issue correctly, the problem still exists. I tested the newest version, v16.11.5, and the problem still exists.

One example: this line is part of my Directory.Packages.props: <PackageVersion Include="NETStandard.Library" Version="2.0.3" /> But in my restored package folder, I can find versions 2.0.0 and 2.0.3. And this is not acceptable.

ds1709 commented 3 years ago

I solved the problem of transitive dependencies on my own. I just wrote a util that detects any package version conflicts in the solution. It's not a perfect solution, because I still need to handle all dependencies manually, but it's better than nothing.

gdoron commented 2 years ago

@anangaur This feature has been out for ~2 years, is used by many companies, and is supported by things like Dependabot, but it is still in incubation/preview mode. I'm wondering what's missing in order to make it a first-class citizen, and when we can expect that to happen?

Best, Doron

anangaur commented 2 years ago

We have started to actively work on it. @JonDouglas is driving this effort

gdoron commented 2 years ago

Great to hear! It's such an important feature. Do we have an ETA? @JonDouglas