NuGet / Home

Repo for NuGet Client issues

No-Op scenario hole with floating/non-exact versions #5445

Open nkolev92 opened 7 years ago

nkolev92 commented 7 years ago

Floating/Open ended version requested - Scenario breaking change

Since the No-Op evaluation needs to be minimal, we avoid checking the sources. So in cases where a new version of the requested package is uploaded, or a version of the package is unlisted, the package resolution behavior might change.

Example:

<ItemGroup>
        <!-- ... -->
        <PackageReference Include="My.Awesome.Package" Version="3.6.*" />
    <!-- ... -->
</ItemGroup>

At the moment of the first restore, version 3.6.0 was the latest version on the package source. Sometime between that restore and the next restore, version 3.6.1 was uploaded to the source. Running a full restore would resolve 3.6.1 and download that version of the package. However, since the restore inputs have not changed, restore will think no action needs to be taken.

This might cause issues for scenarios like the following: a project (ProjX) depends on a local feed that contains the latest bits of the desired projects (ProjA, ProjB, etc.). ProjA is built and its nupkgs are dropped into the local feed for ProjX. ProjX is restored and packed. The expectation is that the nupkg generated by ProjX contains the latest ProjA nupkg that was built. However, since the restore parameters have not changed, the restore of ProjX might no-op. The workaround is to simply use the force option for scenarios like this.
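To make the gap concrete, here is a rough Python sketch (not NuGet's actual resolver) of how a trailing-wildcard floating version like `3.6.*` picks the highest matching version on a feed, and why a no-op restore that never re-queries the feed keeps the stale answer:

```python
# Rough sketch of floating-version resolution; not NuGet's real implementation.
def resolve_floating(requested: str, available: list[str]) -> str:
    """Return the highest available version matching a trailing-wildcard pattern."""
    prefix = requested.rstrip("*")  # "3.6.*" -> "3.6."
    matches = [v for v in available if v.startswith(prefix)]
    if not matches:
        raise LookupError(f"no version on the feed matches {requested}")
    # Compare numerically part by part, not lexically ("3.6.10" > "3.6.9").
    return max(matches, key=lambda v: [int(p) for p in v.split(".")])

# First restore: only 3.6.0 exists on the source.
assert resolve_floating("3.6.*", ["3.5.2", "3.6.0"]) == "3.6.0"
# Later, 3.6.1 is uploaded. A *full* restore would now resolve it,
# but a no-op restore never re-asks the feed, so it keeps 3.6.0.
assert resolve_floating("3.6.*", ["3.5.2", "3.6.0", "3.6.1"]) == "3.6.1"
```

The second assertion shows what a full restore would return; the no-op path simply never re-runs this resolution because its inputs (the `3.6.*` string itself) did not change.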

Potential improvements
Workarounds

- Use -force selectively when you need to update the floating version (dotnet.exe/nuget.exe).
- In Visual Studio, do a rebuild with restore on build enabled. That restore on build is equivalent to force.

//cc @rohit21agrawal, @emgarten

mvput commented 6 years ago

Currently running into this issue. We are using pre-release packages during development, so project B has a dependency on project A with a floating version: 1.0.0-*. When there is a new version of A on the feed, Visual Studio doesn't always download the latest package. We are using dotnet restore --no-cache to resolve this issue.

Perhaps a solution is to create an option inside Visual Studio to ignore the cache, like --no-cache, when building projects?

emgarten commented 6 years ago

Perhaps a solution is to create an option to ignore the cache like the --no-cache inside visual studio when building projects?

An option in VS would be useful to explicitly force a restore, this could match the --no-cache and --force behavior in dotnet.exe.

Floating versions are useful for keeping packages in sync when you are working across different solutions and repositories. If external projects are updated they get put on a feed, and automatically pulled in.

Unless you just made a change that you need and published the external projects, you probably don't want to update. For example, if you are developing in VS and debugging a unit test, you don't want restore to quietly pull in new packages between builds/test runs if someone else on your team updates something. It would be both slow and confusing.

Command line scenarios are similar in my opinion, and even on a CI machine it seems dangerous to have packages change between the start and end of a build, which could happen through implicit restore.

mvput commented 6 years ago

Yes, that can be dangerous. What you want to achieve is: while using floating versions, you can update packages to the latest version but retain the floating version.

Another option might be to use the package manager to update packages while retaining the floating version in the csproj file. An update today would change the csproj file to the exact version.

nkolev92 commented 6 years ago

@mvput I just updated the issue for future reference, but in VS there's already a way to "force" restore. If you do a rebuild with restore on build enabled, it will update the floating versions. There's no "--no-cache" option however, so it might take a while if the source of the updated version is over HTTP.

gallivantor commented 5 years ago

This issue causes some real workflow problems when iterating rapidly over a prerelease package.

Would it not be reasonable for the no-op scenario to always check the feed for a newer version for any floating version references? While this would take a bit longer, it would at least produce the correct result, which is better than a fast process that regularly produces the wrong one.

timotei commented 5 years ago

@nkolev92 How are you testing that rebuild scenario? For example, if I do rebuild, NuGet still caches the results from our NuGet server:

  CACHE http://artifactory.lan/artifactory/api/nuget/nuget/FindPackagesById()?id='tools-sample'&semVerLevel=2.0.0
  CACHE http://artifactory.ullink.lan/artifactory/api/nuget/nuget/FindPackagesById()?$skip=80&semVerLevel=2.0.0&id='tools-sample'

Even if I delete the bin/obj it still does that, and I don't see any other way to stop it from caching between rebuilds

PS: This is a .NET Framework 4.7 project, using the PackageReference format to specify a wildcard version (i.e., `<PackageReference ...><Version>1.0.0.1-*</Version></PackageReference>`).

Wouldn't it be possible to automatically skip the caches at all on rebuilds?

nkolev92 commented 5 years ago

@timotei

This is about the no-op cache. The no-op cache is a collection of info that tells NuGet which dependencies this project last requested. It allows NuGet to do quick up-to-date checks to avoid hitting the network unnecessarily.

What you are seeing is the HTTP cache. That's a cache that expires every 30 minutes.
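The `CACHE ...` log lines above are this time-based cache at work. A minimal sketch of the behavior (not NuGet's implementation; the 30-minute TTL is the only detail taken from this thread):

```python
# Illustrative TTL cache: reuse a response until it is older than the TTL.
import time

TTL_SECONDS = 30 * 60  # NuGet's HTTP cache expiry per this thread

def get_cached(cache: dict, url: str, fetch, now=time.time):
    entry = cache.get(url)
    if entry is not None and now() - entry["at"] < TTL_SECONDS:
        return entry["body"]          # served from cache ("CACHE ..." lines)
    body = fetch(url)                 # actually hit the server
    cache[url] = {"body": body, "at": now()}
    return body

cache, calls = {}, []
fetch = lambda url: calls.append(url) or f"response for {url}"
get_cached(cache, "http://feed/FindPackagesById()?id='tools-sample'", fetch)
get_cached(cache, "http://feed/FindPackagesById()?id='tools-sample'", fetch)
assert len(calls) == 1  # second call within the TTL never reaches the server
```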

Wouldn't it be possible to automatically skip the caches at all on rebuilds?

Personally, I don't think that's a good idea. I don't even think that rebuild should always be a full restore evaluation; that would lead to a much slower and non-repeatable build. The fact that rebuild does a full restore (rather than a quick up-to-date check) is very much hidden right now. As floating versions are not used by the majority of users, eliminating the HTTP caches completely during rebuild is likely to introduce more noise.

Related: https://github.com/NuGet/Home/issues/3116 https://github.com/NuGet/Home/issues/3389 https://github.com/NuGet/Home/issues/7198 https://github.com/NuGet/Home/issues/6987

chipplyman commented 1 year ago

So what are the expected steps to support a workflow like:
a) specify a floating version in a PackageReference
b) publish a new package that matches that version (to a local filesystem feed)
c) build in VS and consume the updated package

Rebuilding is a terribly inefficient UX for this workflow, especially when iterating on a leafy project in a big solution with lots of dependencies.

I think at minimum the no-op cache should not apply to a local filesystem feed.

nkolev92 commented 1 year ago

@chipplyman

You might be able to run /p:RestoreForce="true" or --force from the commandline as the cheapest option right now.

I agree, rebuilding is not great for that purpose.

I think at minimum the no-op cache should not apply to a local filesystem feed.

An interesting tidbit here is that local feeds can often be quite a bit slower than HTTP feeds, especially ones like nuget.org.

chipplyman commented 1 year ago

The use case is building within Visual Studio - any of several solutions across any number of branches, consuming packages published by a single common sln. A command line parameter is simply not an option for this use case.

Our local feed never has more than a dozen packages, each with only a small number of versions for local iteration. Enumerating a couple dozen directories is faster than a socket handshake with nuget.org, let alone query processing and transfer time.

chipplyman commented 1 year ago

Our ultimate solution is going to be getting rid of wildcard versions, but we can't do that until we convert our legacy projects to sdk style due to the VS bug where the legacy project import cache does not get invalidated when the (.props/.targets) file on disk changes.

nkolev92 commented 1 year ago

nuget.org let alone query processing and transfer time.

The V3 protocol is static. For most servers, there's no compute in downloading packages.

I'm guessing you're familiar with it, but people have found upgrade assistant helpful when moving to .NET SDK based projects.

chipplyman commented 1 year ago

Upgrade Assistant is for upgrading from older to newer versions of .NET. We need to first upgrade from legacy to SDK-style projects while remaining on .NET Framework, then we need to have our common projects multi-target .NET Framework and .NET.

We might be able to use Upgrade Assistant for the hundreds of projects in several different ecosystems that all consume our common NuGet packages, but we're prioritizing the SDK conversion while remaining on .NET Framework for those as well, because the project import cache bug has such a heavy impact on our NuGet consumption workflow (since we need to centralize NuGet package version management in a .props file).

Kazpers commented 1 year ago

This problem is making wildcard packages more or less useless. A good reason no one is using them is that they generate unreliable workflows.

I'd rather have a slow workflow that is reliable than a fast workflow that is unreliable (hence useless).

This NEEDS a way to force a no-cache restore in the VS UI, at the very least for those devs that don't like using the dotnet command line.

Ideally, it would always check versions for any dependency using a wildcard as we've explicitly chosen that those are fast changing, or at the very least allow us to update them using the UI without destroying the wildcard in the csproj file.

As it is, we are very close to the same conclusion as chipplyman: wildcards are useless and the solution is to stop using them. This is horrible because the experience of having to manually update internal fast-moving packages is ALSO horrendous. At least it works, though.

chipplyman commented 1 year ago

Right now I'm working on implementing a feature in our add-in using a FileSystemWatcher on the local feed to manually invalidate the cache when the feed updates.

I'm currently weighing silent deletion of project.assets.json for each project vs prompting the user to perform a force restore.
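The deletion approach can be sketched as follows. This is a Python sketch of the idea only (the comment above refers to C#'s FileSystemWatcher); polling directory mtimes stands in for the watcher, and the `obj/project.assets.json` path is the standard SDK-style restore output:

```python
# Sketch: detect changes in a local .nupkg feed and invalidate restore
# outputs so the next restore re-resolves floating versions.
from pathlib import Path

def snapshot(feed: Path) -> dict[str, float]:
    """Record the mtime of every package in the local feed directory."""
    return {p.name: p.stat().st_mtime for p in feed.glob("*.nupkg")}

def invalidate_if_changed(feed: Path, projects: list[Path], last: dict) -> dict:
    """If any package was added, removed, or rebuilt, delete each project's
    assets file, forcing a full restore instead of a no-op next time."""
    current = snapshot(feed)
    if current != last:
        for proj in projects:
            assets = proj / "obj" / "project.assets.json"
            if assets.exists():
                assets.unlink()
    return current
```

Silent deletion (as here) risks surprising the user with a slow restore; prompting instead trades that for an extra click, which is exactly the weighing described above.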