NuGet / Home

Repo for NuGet Client issues

Cannot install, uninstall, or update ANY packages if one NuGet source is unreachable #12950

Open levicki opened 9 months ago

levicki commented 9 months ago

NuGet Product(s) Affected

Visual Studio Package Management UI, Visual Studio Package Manager Console

Current Behavior

Consider this scenario:

NuGet sources configured in Visual Studio:

  1. nuget.org (https://api.nuget.org/v3/index.json)
  2. GitHub (personal repo https://nuget.pkg.github.com/l33tc0d3r/index.json, needs authentication)
  3. CONTOSODEV (corporate server \\CONTOSODEV\NuGet, needs VPN connection)

Source mappings configured in Visual Studio:

| Pattern | Package Source |
| --- | --- |
| com.contoso.* | CONTOSODEV |
| l33tc0d3r.* | GitHub |
| * | nuget.org |
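For reference, a configuration like the one above would look roughly like this in nuget.config (the source names and URLs are taken from the scenario; the exact file contents are a sketch, not the reporter's actual file):

```xml
<configuration>
  <packageSources>
    <clear />
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
    <add key="GitHub" value="https://nuget.pkg.github.com/l33tc0d3r/index.json" />
    <add key="CONTOSODEV" value="\\CONTOSODEV\NuGet" />
  </packageSources>
  <packageSourceMapping>
    <packageSource key="CONTOSODEV">
      <package pattern="com.contoso.*" />
    </packageSource>
    <packageSource key="GitHub">
      <package pattern="l33tc0d3r.*" />
    </packageSource>
    <packageSource key="nuget.org">
      <package pattern="*" />
    </packageSource>
  </packageSourceMapping>
</configuration>
```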

Packages installed in the open project: (screenshot)

Now when you want to update Newtonsoft.Json you can't if:

  1. You aren't connected to the corporate VPN
  2. You don't provide credentials for private GitHub packages repo (VS does not remember password across sessions*)

You cannot even remove the existing packages or install any new ones from nuget.org which should be possible even without having explicit mappings.

I do understand the compatibility and security concerns if you were to fall back to a different package repository, but this is beyond ridiculous and user-hostile behavior. If a package was installed from nuget.org, its maintenance should not be affected by other sources being temporarily inaccessible.

Desired Behavior

  1. If a user tries to update or remove a package installed from a specific source and that source is accessible, the operation should succeed — it shouldn't be all or nothing.
  2. If there is a mapping that says download com.contoso.* from \\CONTOSODEV\NuGet and everything else from nuget.org, and a user tries to install Rick.Mapperly, that should succeed even if \\CONTOSODEV\NuGet is not accessible; otherwise, what's the point of having source mappings in the first place?

Additional Context

zivkan commented 9 months ago

edit: fixed the link to the duplicate issue 🤦

I'm closing this as a duplicate of https://github.com/NuGet/Home/issues/6373

It's not ideal, but a mitigation (a workaround, not a "solution") is to use disabledPackageSources in nuget.config to temporarily disable the sources when you're not on the relevant network. dotnet nuget list source and dotnet nuget disable source <name> are a quick way to do it.
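The same thing can be done by hand in nuget.config; a sketch, where the source names match the scenario above and are assumptions:

```xml
<configuration>
  <disabledPackageSources>
    <!-- Disable the sources that are unreachable off-VPN / without credentials -->
    <add key="CONTOSODEV" value="true" />
    <add key="GitHub" value="true" />
  </disabledPackageSources>
</configuration>
```

Removing the entries (or `dotnet nuget enable source <name>`) re-enables the sources once they are reachable again.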

There's also dotnet restore --ignore-failed-sources. If you use floating versions, this might not work though.

Also, in the last year or two, Package Source Mapping was added. If you have it configured so that the package you're installing/upgrading, AND ALL ITS DEPENDENCIES, are mapped to nuget.org (or other package sources that are available), then I think it should work, but again, only if you don't use floating versions.

Also note that VS 17.8 and the .NET 8 previews/RCs have a new feature, NuGetAudit, where NuGet tries to download known-vulnerable-package lists from the package sources, so NuGet will retry talking to the package sources every time the package graph changes (or --force is used), even if all the packages are already in the global packages folder. Use dotnet restore -p:NuGetAudit=false to mitigate this, if this is the problem.
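The same opt-out can be made persistent in the project file; a sketch (NuGetAudit is the documented MSBuild property, the placement is illustrative):

```xml
<PropertyGroup>
  <!-- Skip downloading known-vulnerabilities data during restore -->
  <NuGetAudit>false</NuGetAudit>
</PropertyGroup>
```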

> You cannot even remove the existing packages or install any new ones from nuget.org which should be possible even without having explicit mappings.

If all the packages that are required are already in your global packages directory, then removing a package should not need communication with any package source (if NuGetAudit is also disabled). However, the dependency resolution algorithm can sometimes work in unexpected ways. NuGet does, on the other hand, download and extract (some) packages from the graph that are not used by the project (higher/different versions were selected for the project), specifically to maximise the chance that offline scenarios work.

Generally, if NuGet is trying to contact package sources, it means that at least one package in the graph is not available in the global packages folder (or floating versions are used, or NuGetAudit is enabled). For better or worse, --ignore-failed-sources is not the default, which makes sense for floating versions, but I feel like it should theoretically be safe when floating versions are not used and all exact package versions are found.

All this to say that I would normally expect removing a <PackageReference> from a project to work offline.

Anyway, I'm closing this as a duplicate. As per our triage policy, we prioritize effort based on a number of factors, one of which is the number of upvotes an issue has, so please consider adding a 👍 reaction to that issue.

levicki commented 9 months ago

@zivkan

> I'm closing this as a duplicate of https://github.com/NuGet/Home/issues/6373

I am not sure if that's really an exact duplicate -- that issue is just about updating and restoring. Mine is also about installing new packages and about source mapping not behaving as (I) expected.

> It's not ideal, but...

That seems neither practical nor quick to me at all and I don't think it would work.

Let's say I disable the personal GitHub repo in Visual Studio. I still won't be able to use Manage NuGet Packages GUI to update Newtonsoft.Json in the project because I can't tell Visual Studio GUI to ignore net.levicki.* packages whose source I just disabled.

> Also, in the last year or two...

I have these sources:

(screenshot of configured package sources)

And these mappings:

(screenshot of configured source mappings)

I must admit I am ignorant about those floating versions you keep bringing up, but I would expect that if GitHub is unreachable here (because VS doesn't remember credentials which is another bug), then say Newtonsoft.Json package should be possible to install/update/remove. From my testing it's not possible, but then again maybe my expectations on how this should work are unreasonable?

> If all the packages that are required are already in your global packages directory, then removing a package should not need communication with any package source

That's the theory, but have you guys actually tested that with packages with and without dependencies?

> All this to say that I would normally expect removing a <PackageReference> from a project to work offline.

Yeah, I would too. It kind of does work "offline", but only if you redefine "offline" to mean "closed Visual Studio and removed <PackageReference> from the .csproj file using Notepad++".

I kindly thank you for the time you put into writing such a detailed answer and for the workarounds offered, but what you wrote sadly still reads like a laundry list of excuses for a product behavior that doesn't follow the principle of least astonishment.

zivkan commented 4 weeks ago

I don't know how I missed that the original issue already talked about package source mapping. 🤦

HotSeat (or maybe I'll find time soon-ish) to confirm whether, with all packages already in the global packages folder, trying to install a package that matches a nuget.org source-mapping pattern will fail due to sources that shouldn't be needed being unavailable.

It might just be NuGetAudit trying to download the vulnerabilities database, although I feel like that also should not break restore/install.

zivkan commented 4 weeks ago

I did a super quick test of a slightly different scenario than the one described above. A nuget.config with two sources, nuget.org with package mapping * and nuget.contoso.test with package mapping contoso.*. Using a project with no package references at all, installing a nuget.org package from VS worked for me, and I got a NU1900 warning about being unable to load the service index for nuget.contoso.test (although it didn't say WHY it couldn't load the service index). So far this is by design, although it would cause package installation to fail if the project opts into "treat warnings as errors". I think that's a very non-intuitive customer experience, but it's been an issue ever since PackageReference was added around 2016.
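If the NU1900 warning is what trips "treat warnings as errors", one possible mitigation is to exempt that code while keeping other warnings fatal. This is a sketch; whether your NuGet/SDK version honours NU codes in WarningsNotAsErrors should be verified:

```xml
<PropertyGroup>
  <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  <!-- Keep "unable to load the service index" a warning, not an error -->
  <WarningsNotAsErrors>NU1900</WarningsNotAsErrors>
</PropertyGroup>
```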

However, dotnet add package does not work, because NuGet.CommandLine.XPlat.Utility.AddPackageCommandUtility.GetLatestVersionFromSourcesAsync does not respect package source mapping.

What still needs to be tested is having a contoso.whatever package installed, so that when installing a new package from nuget.org, we can see whether the preview restore still tries to access the contoso package source unnecessarily. Whoever tests, make sure you also run dotnet nuget locals http-cache --clear, otherwise if any http-cache files are less than 30 minutes old, you're going to get different behaviour than after 30 minutes.

nkolev92 commented 3 weeks ago

> What still needs to be tested is having a contoso.whatever package installed, so that when installing a new package from nuget.org, we can see whether the preview restore still tries to access the contoso package source unnecessarily. Whoever tests, make sure you also run dotnet nuget locals http-cache --clear, otherwise if any http-cache files are less than 30 minutes old, you're going to get different behaviour than after 30 minutes.

It didn't try to access it with PackageReference projects in my test.

levicki commented 3 weeks ago

What version of Visual Studio and NuGet are you testing? For me it is definitely more relevant what happens in Visual Studio when managing solution packages compared to command line.

The test scenario involves having two mapped package sources which aren't always accessible — one because it needs VPN, and the other because it needs credentials (which VS wasn't remembering).

nkolev92 commented 3 weeks ago

17.11 Preview 2

zivkan commented 3 weeks ago

I just tested with the GA version of VS (17.10.2), and here's my sample repro: gh12950.zip

I added nuget.* to contoso's patterns, and referenced NuGet.Protocol in the project. You'll need to comment out the nuget.* pattern in nuget.config for the first restore, to get into the test's initial state, then uncomment it.

In VS, I can't install NuGet.Commands, for example, as expected. What I didn't expect is that preview restore seemed to work, and it was only after accepting the changes that it finally failed.

However, I was able to install Microsoft.Extensions.DependencyInjection.

In both cases, the experience is pretty crappy, because it takes about 60 seconds for the preview install window to open, because NuGet was trying to download the vulnerabilities database for the unavailable source. I forgot to test if disabling NuGetAudit makes the 60 second delay go away. I expect it should, which would make testing much quicker.

levicki commented 3 weeks ago

> In both cases, the experience is pretty crappy

On this one we totally agree.

I updated to 17.10.3 today. I can test again with my solution tomorrow, but I do remember not being able to install, update, or even remove existing packages.

Have you by any chance tried removing a package? I'd check both removing a package whose mapped source is inaccessible and one whose source is accessible.

nkolev92 commented 3 weeks ago

A removal leads to a different graph, including potentially new packages, so that may hit the network as well.

When there is a mapping to an inaccessible source, I'd say it's normal to expect failures.

levicki commented 3 weeks ago

> When there is a mapping to an inaccessible source, I'd say it's normal to expect failures.

What I expect is that it shouldn't fail if you are trying to add or remove from an accessible source.

For example, if I have the Newtonsoft.Json package installed from nuget.org, which is accessible, removing it shouldn't fail regardless of the status of other packages from other mapped but currently inaccessible sources.

zivkan commented 3 weeks ago

A concrete example would be very helpful, though I recognise that it could be very difficult to detail.

One way to try is when you are on your work's network, and therefore have access to all package sources, you can take a copy of your project's obj/project.assets.json file before uninstalling the package, then uninstall, and diff the project.assets.json before and after. Unfortunately, the assets file doesn't show the full graph, only the packages after versions have been selected, and inaccessible parts of the graph have been culled, but the assets file does detail each package's dependencies, so most of the package graph can be understood.

Since each PackageReference change is a single-line change in an MSBuild file, another way to get useful information is to close the solution in VS, hand-edit the PackageReference, then do a command line restore, preferably capturing a binlog (with the -bl CLI argument) or using normal or detailed verbosity. This will show all of NuGet's HTTP requests and responses (that don't fail). This will tell us precisely which packages NuGet was trying to access from the unavailable source, and therefore we can start trying to understand why.

For some background information, our docs on the dependency resolution algorithm, particularly the Cousin Dependencies section, show how NuGet chooses package versions when two different parts of the graph request different versions of the same package. However, what they don't describe is that NuGet will download all versions of the package into the global packages folder, so when you uninstall one of the direct packages, NuGet will already have the lower version of the transitive package and won't need to download it. NuGet does do early culling under the "direct dependency wins" rule, but this gets very difficult to talk about in the abstract, so evidence that this is happening would be more productive than trying to have a theoretical discussion about it.

So, excluding floating versions and NuGetAudit, I can't think of any scenarios where uninstalling a package should require communication with any package source, available or otherwise. Installing or upgrading a package should only require communication with the unavailable package source if the package being changed has dependencies on packages from the unavailable package source. But in this case NuGet needs to talk to the unavailable package source to avoid an objectively wrong package graph. There has been no discussion earlier in this issue about ignoring package dependencies from the unavailable source, or using the wrong version that is locally available, so I assume that's not what is being requested here. The discussion is only about packages from the unavailable source already being local, so the only package changes are from the available package source. My sample zip didn't demonstrate any problems in VS, only in dotnet add package.

If there's still a scenario, I think we need a reproduction to debug (or at least an actual package graph to discuss).

dotnet-policy-service[bot] commented 1 week ago

This issue has been automatically marked as stale because we have not received a response in 14 days. It will be closed if no further activity occurs within another 14 days of this comment.