
Confusion around the netstandard.dll #146

Closed kimbell closed 7 years ago

kimbell commented 7 years ago

I've been following @terrajobst's YouTube videos, particularly the one on .NET Standard.

In this video you mention netstandard.dll, and that makes things a bit confusing for me.
Let's say we are using MemoryStream from a .NET Core app targeting netstandard 2.0, running on a Windows machine with the latest .NET Framework installed.

Am I correct in the following:

  1. My custom assembly is linked to netstandard.dll
  2. Things are then type forwarded to System.IO.dll
  3. System.IO.dll then type forwards things to mscorlib

When decompiling an assembly, you can easily see what other assemblies are referenced. How will this work in a netstandard.dll world? Will all I see be the netstandard.dll?

terrajobst commented 7 years ago

Hey @kimbell,

Here is the relevant part from the video.

Yes, when you compile an assembly against .NET Standard 2.x, your resulting assembly will (mostly) have references to netstandard.dll, as opposed to mscorlib or System.Runtime. So as far as .NET Standard 2.x is concerned, MemoryStream lives in netstandard.

When your library is consumed on a specific platform, say, .NET Core, there is a corresponding netstandard.dll that type forwards all the types in .NET Standard to wherever they live on that particular .NET platform. For .NET Core this means System.IO; on .NET Framework it means mscorlib.
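
To make that concrete, here is a minimal sketch in C# of what a type forward looks like. The real netstandard.dll facades are generated by tooling rather than written by hand, but conceptually they are just a pile of assembly-level attributes like this:

using System.Runtime.CompilerServices;

// In the facade the type doesn't exist locally; this attribute redirects
// every reference to MemoryStream to the assembly that implements it.
[assembly: TypeForwardedTo(typeof(System.IO.MemoryStream))]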

When decompiling an assembly, you can easily see what other assemblies are referenced. How will this work in a netstandard.dll world? Will all I see be the netstandard.dll?

It will work the same way. It's just that when you compile against .NET Standard, the compiler will mostly only ever see netstandard.dll. You'll only see additional assemblies when you're adding references to components that are available for .NET Standard as extensions, such as the Windows registry or JSON.NET.
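
If you'd rather not fire up a decompiler, a reflection-based sketch shows the same information (MyLibrary.dll is a hypothetical path):

using System;
using System.Reflection;

class RefDump
{
    static void Main()
    {
        // List the assembly references baked into a compiled library.
        Assembly asm = Assembly.LoadFrom("MyLibrary.dll");
        foreach (AssemblyName reference in asm.GetReferencedAssemblies())
            Console.WriteLine($"{reference.Name}, Version={reference.Version}");
        // A .NET Standard 2.x library typically prints
        // "netstandard, Version=2.0.0.0" plus any extension packages.
    }
}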

kimbell commented 7 years ago

OK, let's assume we remain in the .NET Core context. In order to run the application, we need to package up all the assemblies; I'm thinking of xcopy deployment. When you add a reference to the .NET Standard 2.x NuGet package, I'm assuming you also get a reference to the System.IO NuGet package and any other relevant packages?

Can we then consider the .NET Standard 2.x package a 'meta package'? If I understand things correctly, it will bring in a bunch of assemblies you may not actually need for your application. Doesn't this go against the 'pay for play' idea of .NET Core? Or is there some other build/packaging tool that figures out the minimum set of assemblies we need?

One of the arguments for splitting things over multiple packages is that one can update them individually. Let's say MemoryStream gets a new method I'd like to use, but it hasn't made it into the .NET Standard yet. Does type forwarding ignore assembly versions? Will it automatically use the latest one? If you add a reference to the new System.IO package, how does the compiler know which assembly to use?

terrajobst commented 7 years ago

OK, let's assume we remain in the .NET Core context. In order to run the application, we need to package up all the assemblies; I'm thinking of xcopy deployment. When you add a reference to the .NET Standard 2.x NuGet package, I'm assuming you also get a reference to the System.IO NuGet package and any other relevant packages?

Sort of. With .NET Core 1.x we've learned that a super granular package graph is causing more grief than it is worth. That's why we plan on collapsing the package graph. Basically, the idea is to flatten the meta packages for .NET Core 2.0 and .NET Standard 2.0, meaning you wouldn't need a reference for the System.IO package because the System.IO binary is provided by the .NET Core meta package.

Generally speaking, for xcopy deployment you shouldn't have to worry about missing dependencies. The idea is that if you can consume the types at compile time, building the app will ensure you have the right artifacts in the output folder.

Can we then consider the .NET Standard 2.x package as a 'meta package'?

Yes. But we're planning on going even a step further and will remove the requirement from the project to include the meta package itself -- it will come from the built-in targets and is implied by the combination of the SDK and TFM. In other words, you can basically assume the platform to be "just there".

One of the arguments for splitting things over multiple packages is that one can update them individually.

Agreed, but we've learned that it doesn't quite work for the platform layer as you often have to update several packages in combination in order to get to a sane state.

Let's say MemoryStream gets a new method I'd like to use, but it hasn't made it into the .NET Standard yet.

That's a completely different problem, which has to do with how fast .NET Standard can evolve. Generally speaking, the thinking is that new APIs will likely come to .NET Core first, because that's the platform whose BCL has the biggest community momentum. From there, it will flow to other platforms (probably Mono and Xamarin first, then .NET Framework). We'll definitely not wait until all .NET platforms have the new APIs before revving the .NET Standard, but on the other hand you need at least two platforms in order to make an updated .NET Standard useful.

Does type forwarding ignore assembly versions? Will it automatically use the latest one? If you add a reference to the new System.IO package, how does the compiler know which assembly to use?

Remember that .NET Standard is just a spec. In general, creative use of type forwarding or tricks in assembly factoring doesn't allow us to add members to types that are part of a .NET platform. But to answer your question: no, type forwarding doesn't ignore assembly versions. The type forwarder uses a versioned assembly reference. These references are resolved like any other assembly references and are thus subject to unification behavior provided by the .NET runtime (such as binding redirects, host policy, the GAC, etc.).
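
You can watch that resolution happen at runtime: asking a forwarded type where it lives reports the implementation assembly, not the facade you compiled against. A small sketch:

using System;
using System.IO;

class WhereIsIt
{
    static void Main()
    {
        // Prints the identity of the assembly that actually implements
        // MemoryStream once all type forwards have been resolved.
        Console.WriteLine(typeof(MemoryStream).Assembly.FullName);
    }
}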

kimbell commented 7 years ago

In the current .NET Core project files, we can reference .NETCoreApp 1.x (EXE) or .NETStandard 1.x (DLL).

Since type forwarding is bound to a specific assembly version, selecting .NET Standard 2.x locks me to a specific set of assembly versions. If I want to leverage an API that is not part of the standard, I need to choose something else. For EXE projects, I imagine a .NETCoreApp 2.x will become available, but what about DLLs? Will there be a .NETCoreDll 2.x that we haven't seen yet?

You mention new APIs probably coming to .NET Core first, then later to .NET Framework. If you run a .NET Core app on a Windows machine with .NET Framework, aren't most of the types in the end forwarded to mscorlib? Or am I missing something? I take it I won't have a reference to netstandard.dll, but to something else. Do we then need a reference to an assembly that contains an implementation for some types and forwards the rest to mscorlib?

mellinoe commented 7 years ago

".NET Core App" (or netcoreapp) is just the NuGet moniker we've given to the platform itself, it doesn't have anything to do with the type of assembly you are building (e.g. whether or not it has an entrypoint), even though it has the word "app" in it. Projects targeting .NET Core will use the netcoreapp moniker regardless of whether they are a library or application.

kimbell commented 7 years ago

That's interesting, but still a bit confusing. Why not just use 'netcore'?

In VS2017 RC, selecting a class library project seems to be locked to .NET Standard, whilst console apps, test projects and web apps are .NET Core. For a class library I created, I cannot select .NETCoreApp 1.x as the target framework. Is this a change that is coming, but hasn't made it into the published bits yet? This class library was created in VS2017 and not upgraded from a previous version.

I'm currently running 1.0.0-preview5-004384

akoeplinger commented 7 years ago

If you run a .NET Core app on a Windows machine with .NET Framework, aren't most of the types in the end forwarded to mscorlib?

No, .NET Core is a separate product in this case and doesn't rely on .NET Framework (that's why it also works on Windows Nano Server, which doesn't have .NET Framework), so there's no forwarding from a .NET Core app to .NET Framework libraries.

I take it I won't have a reference to netstandard.dll, but to something else.

Correct. If you target netcoreapp then you won't get a netstandard.dll reference; instead you get the API surface area of netcoreapp and can use APIs that were only introduced there, at the expense of not being able to run on other .NET Standard platforms.

kimbell commented 7 years ago

Writing software for specific .NET Standard version is one thing; this allows it to work on multiple platforms. How a given platform chooses to implement the standard is a different matter.

If you decompile C:\Users....nuget\packages\System.IO\4.4.0-beta-24903-02\lib\net461\System.IO.dll (which I believe is part of .NET Core), all the types are forwarded to mscorlib.dll.

If you decompile C:\Users....nuget\packages\System.IO\4.4.0-beta-24903-02\ref\netcoreapp1.1\System.IO.dll, types are forwarded to a System.Runtime.dll reference assembly.

Am I barking up the wrong tree? Am I in the right forest?

akoeplinger commented 7 years ago

The lib\net461\System.IO.dll assembly is specifically for when you reference the System.IO nuget package in an app/library targeting .NET Framework 4.6.1, i.e. the full framework. In that case, yes, the types are forwarded to mscorlib.dll since you're (supposed to be) running on the .NET Framework.

The System.IO nuget package is not strictly related to .NET Core, it "works" on regular .NET Framework too.

Aside: the ref folder is a new concept that only newer NuGet v3 based systems understand: https://docs.nuget.org/ndocs/create-packages/project.json-and-uwp#ref. Assemblies in the ref folder are used during compilation; assemblies in the lib folder are used at runtime.
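
To make the split concrete, the package from the paths above is laid out roughly like this (simplified sketch):

System.IO/4.4.0-beta-24903-02/
  lib/net461/System.IO.dll         <- loaded at run time on .NET Framework 4.6.1+
  ref/netcoreapp1.1/System.IO.dll  <- seen only by the compiler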

kimbell commented 7 years ago

Thanks @akoeplinger for the useful information about the ref folder.

On my machine, System.IO only contains a DLL under lib\net462. This is the full framework. You mentioned that .NET Core applications should be able to run on Nano Server. What DLL will then be used? I can't find a target for Nano Server. I get that Unix and Mac aren't present, since that's a different world, but Nano is still based on Windows.

@terrajobst wrote

Basically, the idea is to flatten the meta packages for .NET Core 2.0 and .NET Standard 2.0, meaning you wouldn't need a reference for the System.IO package because the System.IO binary is provided by the .NET Core meta package.

So when I add a reference to the .NET Core meta package, I get an implicit reference to a specific version of System.IO.dll. Correct? Or is this namespace provided in a different assembly, as is the case with mscorlib.dll on the full .NET Framework?

Assume System.IO.dll will still exist as a separate DLL and NuGet package in the 2.0 time frame. For my .NET Core application, I have a reference through the meta package, but I want to use some new functionality available in a newer version. Do I add an explicit reference to the System.IO NuGet package? With two different versions, how does the compiler know which to use? Or will such a case in practice lead to a newer version of the .NET Core meta package?

terrajobst commented 7 years ago

@kimbell:

Since type forwarding is bound to a specific assembly version, selecting .NET Standard 2.x locks me to a specific set of assembly versions. If I want to leverage an API that is not part of the standard, I need to choose something else. For EXE projects, I imagine a .NETCoreApp 2.x will become available, but what about DLLs? Will there be a .NETCoreDll 2.x that we haven't seen yet?

I think I now understand the confusion. Here is how to think about this:

In the upcoming developer experience for .NET Core and .NET Standard library projects, you'll be able to change a project from targeting .NET Standard to .NET Core (and vice versa) by changing the TFM in the project file. For example, you can go from .NET Standard:

<Project Sdk="Microsoft.NET.Sdk" ToolsVersion="15.0">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
</Project>

to .NET Core:

<Project Sdk="Microsoft.NET.Sdk" ToolsVersion="15.0">
  <PropertyGroup>
    <TargetFramework>netcoreapp2.0</TargetFramework>
  </PropertyGroup>
</Project>

(the actual syntax and project file might still change, but you get the idea)

Libraries should generally target .NET Standard, as this ensures that they can be consumed by any app. There will be circumstances where you need to access .NET Core specific APIs, either because the API is new and not implemented anywhere else, or because the concept is .NET Core only. That's why I believe we should make it easy to retarget between .NET Standard and .NET Core, so that developers never have to fear being "locked in". Start with .NET Standard, retarget if necessary, and revert back once a new version of the standard is available that has all the APIs you need.

Also, we plan on making it easier to cross-compile, meaning you'll be able to compile a given project multiple times, for different TFMs. This way you can provide a mostly portable implementation and light up platform-specific features using #if.
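
As a sketch of that model (assuming the SDK convention of a plural <TargetFrameworks> property listing the TFMs, with a preprocessor symbol such as NETCOREAPP2_0 defined per target):

public static class PlatformInfo
{
    public static string Describe()
    {
#if NETCOREAPP2_0
        return ".NET Core specific implementation"; // platform light-up
#else
        return "portable .NET Standard implementation"; // fallback
#endif
    }
}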

Does this help?

kimbell commented 7 years ago

Things are starting to get clearer :)

Based on the information provided by @terrajobst

Basically, the idea is to flatten the meta packages for .NET Core 2.0 and .NET Standard 2.0, meaning you wouldn't need a reference for the System.IO package because the System.IO binary is provided by the .NET Core meta package

can I assume that you will never release an updated version of System.IO unless it's part of a new netcoreapp2x TFM? i.e. .NET Core will be spread over multiple assemblies, but released as a single package?

terrajobst commented 7 years ago

can I assume that you will never release an updated version of System.IO unless it's part of a new netcoreapp2x TFM?

Pretty much. The current plan is to no longer provide (and thus no longer update) granular packages like System.IO. We will probably still have to ship updated versions of the meta package. And, of course, we'll continue to ship "out-of-band" packages that extend the platform. These are more fine-grained than the platform, but that's OK.

kimbell commented 7 years ago

Thank you for an interesting and enlightening discussion.

terrajobst commented 7 years ago

Absolutely! Thanks for engaging!