dotnet / fsharp

The F# compiler, F# core library, F# language service, and F# tooling integration for Visual Studio
https://dotnet.microsoft.com/languages/fsharp
MIT License

Updating the dotnet cli or dotnet SDK - Automagically selects a higher version of FSharp.Core #4298

Closed: KevinRansom closed this issue 6 years ago

KevinRansom commented 6 years ago

#4263 highlights an interesting, and perhaps undesirable, build-time assembly resolution behavior in .NET SDK projects.

Back in the day, the VS 2013 - VS 2017 F# desktop project templates specified the version of FSharp.Core.dll to use in the build, by setting the property:

    <TargetFSharpCoreVersion>4.4.1.0</TargetFSharpCoreVersion>

This was great because projects could be round-tripped and would behave the same across a variety of Visual Studio versions.

.NET SDK projects in VS do not have the same protection and will select a higher version of FSharp.Core depending on the VS version.

The F# .NET SDK has a property that performs a similar function (FSharpCoreImplicitPackageVersion, shown below). I propose that we update the .NET SDK templates in the dotnet CLI and Visual Studio to include this setting.

The template would use semantic versioning to pick FSharp.Core, i.e. the highest version locatable in the range.

So a new F# class library project would look like:

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <FSharpCoreImplicitPackageVersion>4.3.*</FSharpCoreImplicitPackageVersion>
  </PropertyGroup>

  <ItemGroup>
    <Compile Include="Library.fs" />
  </ItemGroup>
</Project>

//cc @dsyme, @cartermp @nguerrera @brettfo @TIHan

cartermp commented 6 years ago

I'm not opposed to this, though I wonder if we should just do away with the implicit FSharp.Core package reference and reference it explicitly if there isn't another way around this problem.

KevinRansom commented 6 years ago

The question is “What do we want to achieve”?

Back in the day, we wanted developers working on teams with mixed versions of VS (e.g. 2013 and 2015) to be able to collaborate on projects and solutions and produce the same results independent of the version of VS. At that time developers were, in general, much slower to adopt the bleeding-edge release; I recall in the VS 2013 timeframe hearing about organisations still using VS 2010. That is probably an old-fashioned view of the world. Today we are finding that developers are updating to the latest version of VS more quickly than ever before, which would indicate that teams with widely variant mixes of VS will be less common than they used to be. The dotnet CLI and VS update on different cadences, which adds yet another variable into the mix.

The current behavior of the dotnet F# compiler is more in tune with the modern view of "let's get everyone onto the latest version of everything as quickly as possible", and the careful and deliberate approach of times past is, well ... kind of your grandfather's process ... clearly I am really, really young.

The issue you saw with your solution arises because there is a tension between the desktop templates' approach to FSharp.Core versioning and the .NET SDK approach: one floats up while the other pins down, alongside a somewhat arbitrary choice in how automatically generated binding redirects work. If your app had been a .NET SDK project and the library a legacy project, it would have worked fine.

So … before we worry about which implementation we prefer, we need to understand what approach we would like to take. We can carry forward the historic approach of the F# templates, where the template specifies the actual FSharp.Core version, or we can move to the approach where the build tools pick the version unless the developer has overridden it.

Personally, I think both have merit, and at the end of the day, preference of one over the other is pretty much a judgement call.

Kevin

cartermp commented 6 years ago

I think I'd be in favor of the build tools picking the version with an optional developer override.

KevinRansom commented 6 years ago

That is the current position of the netsdk F# projects. Now ... what should we do with the desktop project templates? Should they have that same modern feel, or should they remain with the current approach of picking a version on template instantiation?

Consistency would argue that we make them the same as netsdk. However, that would be a behaviour change for developers to learn about.

dsyme commented 6 years ago

Why would we auto upgrade Fsharp.Core and nothing else? If it's implicit won't that always cause problems for any consumer who makes it explicit?

We've had a lot of feedback that being implicit about Fsharp.Core or treating it differently to other libraries causes bugs and confusion....

Why not just make it explicit like any other library and avoid issues like this?

cartermp commented 6 years ago

@KevinRansom

Now ... what should we do with the desktop project templates?

Sorry if I wasn't clear. I'm in favor of:

KevinRansom commented 6 years ago

Perhaps our terms are mixed up. For the purposes of this discussion, template creation is not really "build tools".

The scenario I have in mind is what should happen if I create a project on dotnet CLI 2.0 and then build and deploy with dotnet CLI 2.5 (assuming such a thing is ever released).

I believe that you prefer that the template chooses 4.4.1.0 as the specific FSharp.Core version to be used, until the project is manually edited by the developer to pick 4.5.1.0 of FSharp.Core.

That is, retain the existing behavior of the desktop templates and use it also for .NET SDK projects.

Did I get that right?

Thanks

Kevin

KevinRansom commented 6 years ago

@dsyme,

Issues like what?

The specific reported issue is that the two project files have different approaches to FSharp.Core selection: one picks a specific version and the other doesn't. Effectively, the issue is caused precisely because one of the projects is specific in its FSharp.Core selection. If both said "use the highest available" there would be no problem, and if both said "use FSharp.Core 4.4.1.0" there would also be no problem. However, if the exe project said "use 4.4.1.0" and the library said "use 4.4.3.0" there would still be a problem.

It is certainly a valid approach to say that the developer must always specify the exact version of any assembly she needs to reference. It is just as valid for a developer to say ... yeah, I trust these folks to get the compatibility right, float me to the highest available. The one that makes the most sense to me is actually: get me at least version X, because I need it; otherwise let the build and packaging pick what will work.
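For illustration, here is roughly how those three policies map onto NuGet PackageReference version syntax (only one such reference would appear in a real project file; the versions shown are placeholders, not a recommendation):

  <ItemGroup>
    <!-- Exact pin: only this package version satisfies the reference. -->
    <PackageReference Include="FSharp.Core" Version="[4.3.3]" />

    <!-- Minimum version: 4.3.3 or anything higher resolves. -->
    <PackageReference Include="FSharp.Core" Version="4.3.3" />

    <!-- Floating: the highest available 4.3.x patch is picked at restore time. -->
    <PackageReference Include="FSharp.Core" Version="4.3.*" />
  </ItemGroup>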


The reason that we might consider FSharp.Core different from third-party libraries is that it is the compiler's runtime library, and it provides support to the compiler for specific language features.

For example: as a developer I may upgrade my compiler from F# 4.0 to F# 4.1 in order to make use of struct tuples. I would now have to specifically update FSharp.Core to 4.4.1.0 or later for every project I have, as well as use later build tools, in order to achieve that, even if I explicitly reference System.ValueTuple.dll.

C# developers do not have that burden; they grab the build tools that match their needs and party on. Well ... not quite, they do need to ensure they are running on net47 or reference System.ValueTuple.dll. However, that is insufficient for F# because of FSharp.Core.


Now, as a developer of long-lived, slowly evolving systems, I may well want to pin my library versions to specific tested versions, including pinning FSharp.Core to a specific version lower than the latest release. Is it the right thing to require a manual update of each existing project on each new toolset release, or to enable pinning for those developers who prefer that approach?


By way of historical reference, the desktop project approach was specifically introduced to enable project round-tripping in development environments that had not standardized on the latest version of Visual Studio. That was a specific business goal aimed at improving collaboration within project teams by eliminating the need for synchronized VS updates.

Please note: I am discussing options rather than validating a specific approach.

cartermp commented 6 years ago

@KevinRansom No, you've got it backward. I think the .NET Core SDK way of doing things should be done by desktop templates. I also feel that we should just do away with the implicit package reference, and just make the <PackageReference Include="FSharp.Core" Version="4.X.*" />, where X is the most recently-released minor version of the FSharp.Core package, part of the .NET Core SDK templates.

KevinRansom commented 6 years ago

@cartermp.

  1. The current way the F# .NET SDK tooling works is that the referenced version of FSharp.Core is determined by the build tools. (That is, the project file does not say anything about the version of FSharp.Core unless the developer specifically expresses a preference by specifying a version.)

  2. Having the template write the version value, even "4.2.*" or "4.3.*", has a different outcome from the current implicit reference.

  3. Having desktop templates and .NET SDK templates operate the same way seems like a good way to go; after all, developers won't have to learn distinct behaviours. Indeed, having them use NuGet packages rather than VS-deployed assemblies is also good. It won't eliminate our need to deploy assemblies, unfortunately, without some project-upgrade wizardry, but hey ... can't have everything.

Here is a scenario for you; what should happen?

  1. A developer installs VS 2017.65, which uses the FSharp.Core 4.5.6 NuGet package.
  2. She creates a new command-line app, adds its logic, and runs, tests, and debugs.
  3. She gets a yellow-flag update notification and installs VS 2017.66, which uses the FSharp.Core 4.6.0 NuGet package because we added APIs.
  4. She updates her command-line app logic to add features and runs, tests, and debugs.
  5. She decides to add a new project to the solution which contains an F# class library.
  6. She adds in her new logic and wires it up to her existing app.

Questions:

  1. Should the solution continue to build?
  2. Should the application continue to run, allowing her to successfully access the new logic from the class library?
cartermp commented 6 years ago

FYI - @KevinRansom and I chatted about this at length in the office. We're aligned on having the build tools pick the assembly version for you, and the fixes that allow this to all work for the bug demonstrated in #4263 and this VSFeedback issue are something we can deliver for VS 15.7.

Due to template maintenance hell, and to ensure a good experience for developers who aren't using Paket to control FSharp.Core manually across large solutions, we also agree that the implicit package reference is still the best route.

There is still an open question as to whether this implicit reference uses a splat string (e.g., 4.3.*) or is pinned to a specific version. A splat string allows us to only update templates when we do a minor version bump of FSharp.Core, and will pick up any patch-level updates. However, it will slow down project creation, as NuGet will be forced to do a check.

I currently favor pinning the implicit FSharp.Core reference to a shipped version.
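Concretely, the difference comes down to what value the implicit-reference property carries; a sketch using the FSharpCoreImplicitPackageVersion property shown earlier (the concrete versions are placeholders):

  <PropertyGroup>
    <!-- Splat: restore floats to the highest 4.3.x patch it can find. -->
    <FSharpCoreImplicitPackageVersion>4.3.*</FSharpCoreImplicitPackageVersion>

    <!-- Pinned: restore always resolves exactly this shipped version. -->
    <!-- <FSharpCoreImplicitPackageVersion>4.3.4</FSharpCoreImplicitPackageVersion> -->
  </PropertyGroup>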

KevinRansom commented 6 years ago

So there are a number of issues that are related:

The changes detailed here are targeted at Visual Studio 2017.7.

Todo:

1. Move to explicit selection of the FSharp.Core NuGet package. In the current .NET SDK we use a wildcard selection of the FSharp.Core NuGet package. This has a huge impact on dotnet restore timing, requiring a network hop to ensure that the current highest local package is still the highest package.

2. Add the FSharp.Core NuGet package to the .NET SDK offline cache. This ensures that creating a new project on a plane actually works.

3. Amend the desktop project templates to use NuGet packages, using the same mechanism as the .NET SDK to pick the NuGet package version.

4. Ensure that VS deploys the correct FSharp.Core NuGet package for the F# desktop templates, to enable offline VS operation.

5. The Manage NuGet Packages tool built into VS does not work for FSharp.Core in .NET SDK projects or in the desktop templates. Ensure that the tool works correctly for FSharp.Core in both, and ensure that existing projects continue to work as they do today.

How to Manage FSharp.Core versions in your solutions.

Ideally, let it float ... in general, for applications, use the latest shipped version. If there are compatibility problems, they will be bugs and we will strive to ensure they are fixed quickly.

For class libraries, generally use the latest shipped version; however, if a library is targeting a tool that ships with an earlier version of FSharp.Core, then use the Manage NuGet Packages feature to select the specific version targeted for that library.

Having used Manage NuGet Packages, the project file will look similar to this:

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Library</OutputType>
    <TargetFramework>netcoreapp2.0</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
   <!-- List of source files -->
  </ItemGroup>

  <ItemGroup>
    <PackageReference Update="FSharp.Core" Version="4.2.2" />
  </ItemGroup>

</Project>

A free floating project file (the default) will look similar to this:

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Library</OutputType>
    <TargetFramework>netcoreapp2.0</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
   <!-- List of source files -->
  </ItemGroup>

</Project>
dsyme commented 6 years ago

By long habit the first thing I do to any F# class library project file is make sure the FSharp.Core reference is explicit or managed by paket :)

If the SDKs are to decide on an implicit version, then won't we have situations where different SDKs give different interpretations of the same project file?

So I just don't like it when upgrading an SDK or Mono implicitly changes the result of compiling a library.

This would be the case even if the F# compiler is otherwise functionally equivalent for the code you're compiling, which is normally the case.

Today I developed on 4 machines - two CI systems and one VS4Mac and one Windows. How would I know what FSharp.Core dependency the library I was making had? Only by making it explicit...

This matters more for libraries than applications. A library binary becomes unusable in certain contexts if its dependency is magically too high. The case for implicit dependencies is more compelling for applications. Getting dependencies right is crucial for libraries.

If you go with an implicit reference for libraries, perhaps leave commented-out text in the project file showing how to set an explicit version. We don't want programmers to have to use click-click IDE NuGet tooling just to learn how to go explicit.
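For instance, something along these lines (a sketch of what such a commented-out hint might look like; the version shown is a placeholder):

  <ItemGroup>
    <!-- Uncomment to pin FSharp.Core to an explicit version instead of the implicit SDK-chosen one. -->
    <!-- <PackageReference Update="FSharp.Core" Version="4.3.3" /> -->
  </ItemGroup>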

forki commented 6 years ago

I never understood that desire for an implicit FSharp.Core. It has brought so much pain to the ecosystem over the years; every newcomer learns it the hard way. Let's make it explicit. But it feels like I've been repeating that for nearly a decade now. Gave up.

KevinRansom commented 6 years ago

@dsyme ,

The mechanism to specify a particular FSharp.Core will look like:

  <ItemGroup>
    <PackageReference Update="FSharp.Core" Version="4.3.3" />
  </ItemGroup>

They could type this in, or set it using the manage package reference tool:

The F#-specific variables will still work, but will no longer be necessary.

[screenshot]

Hopefully they will find the manage package reference tool sufficient to their needs.

Or they could use paket if they are so inclined. They will have plenty of choices ...

dsyme commented 6 years ago

Let's look at this from the perspective of the Microsoft Xamarin library templates. For those, there is no way we would use an implicit FSharp.Core version, because we couldn't afford to re-QA the templates on every update to tooling (VSforMac, .NET SDK, VS, Mono, CI systems). Updates potentially affecting FSharp.Core version numbers would become risky.

But if Xamarin library templates wouldn't use implicit versioning, then why should user library template packs such as those used in SAFE stack or Fable? And if user templates don't use it, then why should .NET SDK templates?

With regard to CI: systems such as AppVeyor are often used to build final NuGet packages, but the tooling version is only specified by a simple name such as "Visual Studio 2017". With implicit references for libraries, this means the FSharp.Core chosen for your library's dependencies could change from day to day according to AppVeyor's versioning. That doesn't seem like the default behaviour we should be encouraging for library writers.

Hopefully they will find the manage package reference tool sufficient to their needs.

We can't assume that people use VS to do F# development (e.g. VSforMac, Rider, VSCode, vim, ...), and the F# .NET SDK templates are non-VS components.

isaacabraham commented 6 years ago

I hope I've misunderstood this... please put an explicit version of FSharp.Core into all projects created. I have lost count of the hours lost with stuff like this, and I thought that we were past this - go with an explicit reference.

@cartermp template maintenance hell is less important IMHO than the impact on developers using those templates. I would rather that the templates pin to a version of FSharp.Core which is "known good" to work with whatever else.

dsyme commented 6 years ago

I just checked and an implicit reference to FSharp.Core in a new-style .NET SDK 2.0 project can't be upgraded/added using the VSforMac nuget dialogs. You'd have to edit the project file by hand in that case. It looks like this and the only option is "refresh"

[screenshot]

dsyme commented 6 years ago

I presume upgrading VSforMac would magically change that to 4.3.3

cartermp commented 6 years ago

@isaacabraham I think you're misunderstanding my position here.

template maintenance hell is less important IMHO than the impact on developers using those templates. I would rather that the templates fix to a version of FSharp Core which is "known good" to work with whatever else.

I'm proposing that the templates are fixed to a version of FSharp.Core and do not use a floating version number. This is because of a performance penalty with a floating version number and NuGet. What I'm also proposing is that the reference itself is implicit and handled by the build tools for desktop templates that ship in VS. This accomplishes two things:

  1. For larger solutions in an environment (VS, CI server, etc.), the source of truth is the version of the .NET Core SDK you build against.
  2. For us, we update a version in our targets files rather than versions across 3 separate repos (2 of which we don't own ... a consequence of being first-class for .NET Core).

The difference between this and what we do today with .NET Core SDK-based templates is that the version is pinned here, but today it's not.

@dsyme

Updates potentially affecting FSharp.Core version numbers would become risky.

Isn't that the case either way? That is, when a new .NET Core SDK is revved and Xamarin infrastructure chooses to take a dependency on it, any previous FSharp.Core version is not guaranteed to have been tested against that SDK version. Wouldn't re-QA be needed in either case?

We can't assume that people use VS to do F# development (e.g. VSforMac, Rider, VSCode, vim, ...), and the F# .NET SDK templates are non-VS components.

I don't think Kevin is saying that. There are already facilities to disable or override the implicit reference in the .NET Core SDK today. Nothing about that would change.
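For reference, a sketch of both mechanisms, assuming the DisableImplicitFSharpCoreReference opt-out property exposed by the F# SDK targets (the version shown is a placeholder):

  <PropertyGroup>
    <!-- Opt out of the SDK-provided implicit FSharp.Core reference entirely. -->
    <DisableImplicitFSharpCoreReference>true</DisableImplicitFSharpCoreReference>
  </PropertyGroup>

  <ItemGroup>
    <!-- With the implicit reference disabled, the project states its FSharp.Core version explicitly. -->
    <PackageReference Include="FSharp.Core" Version="4.3.3" />
  </ItemGroup>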

0x53A commented 6 years ago

For larger solutions in an environment (VS, CI server, etc.), the source of truth is the version of the .NET Core SDK you build against.

I am 100% with @forki on this - the version should be explicit in the project file because that is the only way to keep it stable and sane.

Why do you think it is actually a good thing, that a simple update of the SDK also changes the referenced version automatically?

IMO that is one of the worst things that can happen - I probably have a later version of the SDK locally on my computer than on the build server, so it will work locally but fail when compiled on the server.

Even the NuGet team has realised that floating / unspecified versions are bad and has started to plan for a lockfile; that mistake shouldn't be repeated by each sub-project of the .NET SDK.

Using a "normal" reference would also fix all issues with the nuget UI and other (maybe third-party) tools.

KevinRansom commented 6 years ago

But Lukas, no one is stopping you from adding a reference to a specific version when you want it; there is even tooling to help you. And the way you would add the specific version looks just the same as it would for any other package reference.

When NuGet adds their lock file, I am guessing tooling will be provided and we will work well with that. I also wonder how enjoyable manually updating each project in a 10-project solution would be if we revved the FSharp.Core version number more frequently than we do now, let's say monthly. And Paket users will continue to use the mechanisms Paket enables for managing package versions for solutions and projects.

Your scenarios are valid and indeed important, which is why we will ensure they are enabled using standard, tooled mechanisms. But they are not the only ones that matter.

Kevin

0x53A commented 6 years ago

Yes, I can solve my issues because I know about them. But I think the defaults as currently implemented, and as proposed here are only suitable for toy projects and actively dangerous for larger solutions.

Or to frame this another way:

If you need to do N steps to move a project from toy to real (whatever your definition of real is), why not make that the default instead?

At this point I'm just reiterating points already made by forki and dsyme, so I will shut up and let you decide whatever you decide ;-)

cartermp commented 6 years ago

@0x53A This is an interesting point:

Why do you think it is actually a good thing, that a simple update of the SDK also changes the referenced version automatically?

This is a good thing because the version that you're using is the version that the newer SDK was tested against. There is no guarantee that a previous version was tested against the new SDK, nor that it will work, nor that a fix will be backported. There is, however, a guarantee that the newer FSharp.Core version works with the newer SDK version, and that a fix will be submitted if an issue is found.

The scenario you describe as potentially problematic (using a higher version of the SDK locally and a lower version of the SDK on a build server) doesn't appear to be any worse than managing your FSharp.Core versions yourself in your app. Both are susceptible to the general problem of, "I don't have the right version of things for what I need". If the answer to that is, "Use Paket", then we've automatically excluded the majority of F# developers in the world. We cannot take that position here. Is one scenario worse than the other? I honestly don't know.

As an aside, I don't feel too strongly about my proposed position here - take my statements here not as a rebuttal, but an acknowledgment that your feedback here matters. We haven't committed to a decision on this yet.

KevinRansom commented 6 years ago

@0x53A

This is why I shouldn't be allowed to work on Open Source projects.

If I understand your scenario correctly, you are trying to evaluate two different risks.

  1. The risk that a new release of FSharp.Core will break your project. or
  2. The risk that an updated compiler will break your project.

From your comments it seems that you have evaluated the risk of updating FSharp.Core as significantly higher than the risk of updating the F# compiler, so much so that you accept having different versions of the compiler on your personal computer and on your CI.

I accept that, up until now, the most likely reason for an F# project to fail to build or run successfully has more often been an FSharp.Core misconfiguration than a breaking change in the compiler. However, I would assert that it usually happens because the projects are misconfigured, or they are using type providers (which, at least in the past, introduced plenty of FSharp.Core reference errors), or they are apps based on FCS or FSI where the project or environment doesn't accurately describe its dependencies.

Back in the day there was also a compiler bug that caused type providers to make the compiler generate two different references to FSharp.Core, but that has been fixed for a long time now.

If you look at old-style MSBuild project files, references are accomplished by qualified assembly names and hint paths that are absolute or relative to the project file, generated by a tool (i.e. NuGet, Paket, or template instantiation) or by third-party add-in tooling. Then there are binding redirects, again manual, computed and ... blah ... blah ... blah. I doubt it is even possible to manually manage a project using those tools. I suppose we currently do just that in the FSharp repo, and it is hard to get right, and is very ... very fragile.
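For readers who haven't had the pleasure, a sketch of what such an old-style reference and its binding redirect look like (the paths and versions are illustrative only):

  <!-- Old-style .fsproj: fully qualified assembly name plus a package-relative hint path. -->
  <Reference Include="FSharp.Core, Version=4.4.1.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a">
    <HintPath>..\packages\FSharp.Core.4.2.3\lib\net45\FSharp.Core.dll</HintPath>
    <Private>True</Private>
  </Reference>

  <!-- app.config: binding redirect forcing any older FSharp.Core reference onto one version. -->
  <dependentAssembly>
    <assemblyIdentity name="FSharp.Core" publicKeyToken="b03f5f7f11d50a3a" culture="neutral" />
    <bindingRedirect oldVersion="0.0.0.0-4.4.1.0" newVersion="4.4.1.0" />
  </dependentAssembly>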

No wonder we have learned that updating a referenced library is the riskiest venture in the whole of software engineering. We want to get to the point where it is no riskier to update FSharp.Core than it is to update the compiler, and if we can do that synchronously then there is only one decision to make: "Should I update my tools?"

To simplify things we currently use PackageReference in the .NET SDK, and we aim to align our implementation with the existing NuGet package management tooling to ensure that the standard NuGet workflows also work for FSharp.Core.

We do agree that there are interesting scenarios where a developer will want to pick a specific version of FSharp.Core.dll regardless of the build tools, perhaps when building an add-in for an existing app that has yet to be upgraded, and we want to enable her to do that.

In the future we want to arrive at, we do not regard explicit selection of the FSharp.Core version as the distinguishing feature between a real project and a toy project. The selection of a specific version of FSharp.Core should occur as the result of a decision based on a technical need. Of course we may not achieve that, but we should want to try.

KevinRansom commented 6 years ago

@dsyme Xamarin and VS both have the same issue, which is due to our targets; it is a bug we need to fix.

0x53A commented 6 years ago

I had a really long post typed up, which debated both the technical and the philosophical points of build systems, then I realized no one would want to read it. Here is the short version:

0) I can solve my issues, so I am debating the defaults for the (perceived) benefit of other users and the ecosystem in general.

1) I fear neither updating the compiler nor FSharp.Core, but I want to be in control, not have it done implicitly behind my back. I run paket update once or twice a month to update all our dependencies, including compilers.

2) There is a big difference between updating a compiler and updating any lib (and FSharp.Core is just a lib like any other):

If I build an app, this is probably not too bad, I may end up with a newer FSharp.Core, but in the end, things will probably work.

If this is a lib then I force this same change on all my consumers.

So updating compilers is probably safe, updating libs should always be a conscious decision.

enricosada commented 6 years ago

There is still an open question as to whether this implicit reference uses a splat string (e.g., 4.3.*) or is pinned to a specific version. A splat string allows us to only update templates when we do a minor version bump of FSharp.Core, and will pick up any patch-level updates.

If you pin a specific version, you cannot ship a fix with a .patch release (fast) if you find a bug; you have to wait for a new SDK drop (slow). Is that really better? You can test the new .patch version against all released SDK .patch versions; it's just a build matrix. But at least you are not stuck with a bug in an SDK that you cannot fix (this happened in the past when things were pinned at the .patch level or bundled).

However, it will slow down project creation as NuGet will be forced to do a check.

Yes and no. Yes, if a new .patch version is released NuGet will resolve and download it, but after that it is in the global NuGet cache, so it is fast. Also, not all projects are empty console apps or empty libs; you usually need other packages, so you need to check resolution anyway. Optimizing for a really small scenario is usually not the best.

So, we have .minor and .patch, and they can be used for exactly this: a floating version up to .patch (4.3.x) allows that.

FSharp.Core is a package. While embedding the F# compiler makes sense, pinning the FSharp.Core lib (a thing that is really easy to change) doesn't make a lot of sense in a NuGet packages world (FSharp.Core is just one of the packages).

If you really, really want to support that always-offline scenario, add another property, FSharpCoreImplicitBundledVersion, so people can override it (if needed) like this:

<FSharpCoreImplicitPackageVersion>$(FSharpCoreImplicitBundledVersion)</FSharpCoreImplicitPackageVersion>

and FSharpCoreImplicitBundledVersion would be a property inside the SDK, pinned to the exact version in the offline cache.

enricosada commented 6 years ago

About implicit vs explicit PackageReference in templates: IMHO implicit is better for templates. While I don't like it a lot, it helps.

So, FSharp.Core is going to float anyway and is resolved to a specific version; let's accept that, and leverage major.minor.patch for this.

Maintaining parity between the fsc version and the FSharp.Core version is really hard to achieve in real projects, because we specify ranges for packages, so we are going to end up using a different version (minus unsupported features like structs), and the compiler should happily accept a different version of FSharp.Core.

dsyme commented 6 years ago

@cartermp Re this:

This is a good thing because the version that you're using is the version that the newer SDK was tested against. There is no guarantee that a previous version was tested against the new SDK, nor that it will work, nor that a fix will be backported. There is, however, a guarantee that the newer FSharp.Core version works with the newer SDK version, and that a fix will be submitted if an issue is found.

Personally I think we need to CI-test the F# compiler that ships with the .NET SDK against a matrix of back-versions of FSharp.Core. That could simply be running and testing a small selection of projects that use an explicit reference to the FSharp.Core NuGet package versions 4.0.0.1, 4.1.17, 4.2.3, 4.3.3, etc.

I would consider it a bug in the F# compiler if it failed to reference and compile against these DLLs/packages.
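A minimal sketch of one cell of such a matrix, i.e. a test project pinned to one of those back-versions (the exact-pin syntax and the chosen version are illustrative):

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <Compile Include="Library.fs" />
  </ItemGroup>

  <ItemGroup>
    <!-- Exact pin to a back-version of FSharp.Core, overriding the SDK's implicit reference. -->
    <PackageReference Update="FSharp.Core" Version="[4.2.3]" />
  </ItemGroup>

</Project>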

dsyme commented 6 years ago

What version range to use? 4.*? 4.3.*? [4.3, 5)? For a console app/lib it doesn't really matter.

@enricosada I believe the proposal is to make the reference in the default "classlib" and other templates implicit, but pinned (for build performance reasons), see e.g. https://github.com/Microsoft/visualfsharp/pull/4327/files#diff-2cc8401da83ac111dbadcb8fb435075eR36

dsyme commented 6 years ago

@isaacabraham @0x53A Note that this discussion is really about

  1. whether an implicit reference to FSharp.Core is supported at all (Decision: implicit references to FSharp.Core are supported)
  2. if it is, does it float or is it pinned (Decision: it is pinned, for build performance reasons)
  3. if it is, is it the default in the .NET SDK F# library templates (Decision: it is)
  4. if it is, is it the default in .NET SDK F# application and testing templates (Decision: it is)
  5. if it is, does the visual nuget tooling work in VS and VS for Mac (Decision: it is being made to work)

I respect the decision to support them and understand it. At least the project files look nicer.

However, it is not compulsory, nor even normal, to use implicit references. I would personally advocate that the FSharp.Core reference should always be made explicit in the following situations:

  1. in libraries published to nuget or libraries built with CI systems
  2. in templates that are not part of the .NET SDK
  3. for any projects using Paket
enricosada commented 6 years ago

@enricosada I believe the proposal is to make the reference in the default "classlib" and other templates implicit, but pinned (for build performance reasons), see e.g. https://github.com/Microsoft/visualfsharp/pull/4327/files#diff-2cc8401da83ac111dbadcb8fb435075eR36

While pinned is OK, does performance really change that much? Only the package resolution changes, and again, it is not like FSharp.Core is going to be the only package used, so we need to resolve others anyway in normal use cases (hello-world samples aside). And after the first restore, the newer FSharp.Core package is in the NuGet global cache anyway, so it is not going to be downloaded another time.

If the bundled version is 4.3.4, and that is the latest 4.3.* version, good, that one will be used. Otherwise, if a 4.3.5 exists, it is better to use that (it is a .patch version for a reason).

Summarized:

What if an SDK with a pinned 4.3.4 is released, and later we find a bug in 4.3.4?

cartermp commented 6 years ago

@enricosada regarding the floating range, there are two reasons why we'll have to pin it:

  1. A NuGet hop upon project creation in VS slows things down. We already have slow solution load times in VS and cannot slow that down further.
  2. Getting implicitly upgraded just by opening a project created in the past (since that will kick off a restore) likely drives some people mad. Although we feel that FSharp.Core is unique from other libraries in that it's a "Standard Library for F#", we do not feel that this standard library should be upgraded on your behalf without asking you. To @0x53A's point, someone created the project with certain capabilities in mind and can upgrade it to further capabilities when they so choose. #4327 adds support for upgrading with the NuGet UI, and if someone is using Paket then it's likely that they're already controlling their FSharp.Core reference. If someone is using neither (this population of people is bound to be tiny), then they'll have to override the reference in the project file and manually update it there - which I suspect they'll have already been doing.

As for bugs in FSharp.Core and not being able to get the fixed version immediately - this is likely the tradeoff that we'll have to take. Unfortunately, there's no good answer here.

enricosada commented 6 years ago

Unfortunately, there's no good answer here

That's for sure, I am not saying otherwise; it is a tradeoff. I just want to note some things for this discussion, for your evaluation of the tradeoff.

A NuGet hop upon project creation in VS slows things down. We already have slow solution load times in VS and cannot slow that down further.

I am just saying: isn't NuGet going to nuget.org anyway to resolve packages? That's the only hop, and in normal usage (with any PackageReference added), well, it is going there too.

At the moment it is not slow in the CLI.

I am getting just a really small slowdown between F# and C# currently:

F#

E:\temp\b>dotnet new lib -n fs3 -lang f#
The template "Class library" was created successfully.

Processing post-creation actions...
Running 'dotnet restore' on fs3\fs3.fsproj...
  Restoring packages for E:\temp\b\fs3\fs3.fsproj...
  Generating MSBuild file E:\temp\b\fs3\obj\fs3.fsproj.nuget.g.props.
  Generating MSBuild file E:\temp\b\fs3\obj\fs3.fsproj.nuget.g.targets.
  Restore completed in 877,56 ms for E:\temp\b\fs3\fs3.fsproj.

Restore succeeded.

C#

E:\temp\b>dotnet new lib -n cs3
The template "Class library" was created successfully.

Processing post-creation actions...
Running 'dotnet restore' on cs3\cs3.csproj...
  Restoring packages for E:\temp\b\cs3\cs3.csproj...
  Generating MSBuild file E:\temp\b\cs3\obj\cs3.csproj.nuget.g.props.
  Generating MSBuild file E:\temp\b\cs3\obj\cs3.csproj.nuget.g.targets.
  Restore completed in 159,05 ms for E:\temp\b\cs3\cs3.csproj.

Restore succeeded.

So it is roughly 880 ms (F#) vs 160 ms (C#).

Take into consideration that a dotnet new mvc is 2.4 s (C#) and 2.51 s (F#). The same goes for the empty ASP.NET Core template.

Adding ANY package to a C# console app, a restore (needed after adding a package) is 820 ms, and it stays about the same for F# (840 ms).

That said, it makes for a pretty big advantage to be able to get bug fixes via a .patch version (which is a really nice tradeoff to consider).

About loading an F# solution: with ANY package added (like Newtonsoft.Json), it is going to take the same time as C#, so that is good AFAIK.

Getting implicitly upgraded just by opening a project created in the past (since that will kick off a restore) likely drives some people mad

This happens normally in the NuGet world: there is no lock file, committing obj/project.assets.json is not recommended, and restore is run implicitly by lots of commands (build, etc.) which overwrite it for up-to-date checks anyway.

If you open any csproj with a different .NET Core SDK version (say 2.0.3 instead of 2.0.0), it can use a different NETStandard.Library version unless that is pinned in the project, not to mention other deps like ASP.NET MVC.

So it is more a feature than a drawback.

@cartermp sorry for the long post, just adding some data: having created a lot of projects (F# too), dotnet new is not the slow part at all.

KevinRansom commented 6 years ago

@dsyme, @cartermp

I think we can consider closing this.

  1. The tools now pick a specific FSharp.Core NuGet package instead of a wildcard.
  2. Tooling allows the developer to update the version of FSharp.Core directly, with no other properties.
  3. A developer can manually type a package reference update to a specific version into the project file.
  4. Developers can specifically disable the automagic referencing, which will then require them to add a specific FSharp.Core reference.

Kevin