@ctaggart I don't know if you have encountered paket generate-load-scripts?
https://fsprojects.github.io/Paket/paket-generate-load-scripts.html
Right now it doesn't leverage the new nuget package location, but maybe it will in the future; when that is the case, this could be handled by having the generated scripts load the assemblies from that location instead.
I'm not sure we should bake such functionality into the compiler / fsi yet; I think it opens many cans of worms. There have been discussions of a #nuget directive or of a way to add assembly resolution handlers.
One problem I see with #package "name-version" is that a set of scripts might end up using different versions: you need to keep the version consistent in many places, there are questions about transitive dependencies, etc. NuGet or Paket (as a tool used outside the script) help to handle those problems, I believe.
I'd be favorable to this proposal if there were a clearly defined way to register custom resolution of those #package directives.
I'd like to hear what @KevinRansom and @forki and others think about this
We need to do this in some form. I would like to keep the ability to integrate with Paket too, since I still believe it brings a lot of value. I would also allow packages to have their own load scripts as part of the package rather than just referencing the DLLs
I'm anxious not to tie ourselves too closely to aspects of packaging that might change, but I understand the huge value of being able to reference packages like this.
@ctaggart If you would like to prototype something as a fork of Microsoft/visualfsharp I'd be really glad to see it, just at least to get a feel for what the implementation would look like. It's probably not a huge amount of code to add if we're just referencing simple packages without transitive dependencies.
You will have to consider package sources besides nuget.org I suppose.
The new MSBuild format brings a lot more options to the table. Paket is already able to integrate into it. Maybe there is leverage there, @dsyme, @ctaggart.
Yes, the dotnet CLI has an interesting way of managing references, and it is essential that we integrate FSI with it if we want a decent coreclr / fsi story.
I struggle a lot with figuring out what makes a good design for this though ... especially since the dotnet CLI no longer does project.json, but instead uses PackageReferences in a .XXproj file.
Allowing #r to reference packages at run-time seems like an essential feature (it doesn't matter what the keyword looks like). However, there are a ton of issues with that ... E.g. what if you are running with .NET 2.0 and using System.Runtime 4.1.0.0 and the package you reference depends on System.Runtime 4.1.1.0?
Do you:
Given that the dotnet CLI is dependent on msbuild, do we have a project file ... If we use a project file, then I expect it would be possible to plug Paket in by munging target files; it really depends what role is desired for the tool.
If we don't want to go with a project then the plug point would need to be better developed. I have not followed Paket in the nuget 3.0 world, so I don't know how it fits; others could speak to that.
The other thing is how we want FSC to behave; the FSC command line on coreclr is very full and some of this stuff could simplify things.
I desperately want to make time to think this through ... and events keep getting in my way. Which makes me quite sad and very frustrated.
I must admit it feels we are very close to working answers for these things for Paket, it's just not integrated into FSI.EXE and the various editing tools. We already use a non-integrated manual two-phased version of it in Azure Notebooks, e.g. if you look at the samples here https://notebooks.azure.com/dsyme/libraries/fsharp-templates.
An FSI-integrated version of this would, at a logical level, keep an incremental paket.dependencies script. A complete script would look like this:
#r "paket: nuget FSharp.Data"
#r "paket: nuget XPlot.Plotly"
open XPlot.Plotly
Chart.Line [ 1 .. 10 ]
The spec would be that each line #r "paket: some-text" implicitly adds that text to a script-specific paket.dependencies (not an actual file, just implicit). Just like the existing #r DLL references, these would, at design-time, be collected and resolved (using the existing Paket.Core API, which is very simple), prior to assembly resolution. As part of this we would regenerate the paket-files/load.fsx script (that's the exact name of the script) and that would then be provided back to the compiler service to analyze for DLL references and for execution at runtime. The Paket framework setting would default to whatever we are using by default in FSI.EXE, e.g. net461, and the source setting would default to nuget.org unless otherwise specified.
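For illustration only (the exact file contents and paths below are assumptions, not a settled design), the implicit dependencies for the script above would correspond to something like:
source https://nuget.org/api/v2
framework: net461
nuget FSharp.Data
nuget XPlot.Plotly
and the regenerated load script would just be a plain F# script of #r lines pointing at the resolved assemblies, roughly:
#r "packages/FSharp.Data/lib/net40/FSharp.Data.dll"
#r "packages/XPlot.Plotly/lib/net45/XPlot.Plotly.dll"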
The only really hard things are the decisions that are up to Paket: (a) where the package cache resides, (b) whether the packages are version qualified and (c) whether the packages are collected when no longer used. For Paket the default is normally to install into the packages directory under the script without version numbers in the paths, and for Paket to manage collection of the packages when no longer needed. Paket likely has settings to adjust these things and it would just be a matter of choosing the right default policy.
I do wonder if we (the F# community) should just forge ahead and trial adding this functionality directly to the Visual F# Tools, based on some fixed version of Paket, via a PR for people to try out. If we did, I reckon we would iterate very quickly to something that is very usable for .NET Framework scripting (and after all 100% of F# scripting is .NET Fx today). We could also then iterate on .NET Core scripting.
This would, however, mean shipping a fixed version of Paket.Core.dll as part of the Visual F# Tools (really alongside FSI.EXE) for use in the F# scripting model. I don't think that would be a bad thing, but if we wanted to make that more explicit we could do this:
#r "/some/other/paket.dll"
#r "paket: nuget FSharp.Data"
#r "paket: nuget XPlot.Plotly"
open XPlot.Plotly
Chart.Line [ 1 .. 10 ]
where the first line loads the package manager and conforms to some interface. Subsequent lines would then pass off the information to the named package manager - like type providers this has to happen at design-time too, i.e. in the IDE. But the API would be fixed in stone.
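Purely as an illustration of the kind of contract being discussed (the name and members below are invented, nothing like this exists yet):
type IScriptPackageManager =
    /// The scheme this manager handles, e.g. "paket"
    abstract Name : string
    /// Given the script directory and the accumulated "<scheme>: ..." lines,
    /// return the path of a generated load script referencing the resolved DLLs.
    abstract ResolveDependencies : scriptDir: string * packageLines: string list -> string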
It's possible the nuget keyword should be implicit for the common case, so:
#r "paket: FSharp.Data"
#r "paket: XPlot.Plotly"
To be honest I'd love to see a prototype of this for the Visual F# Tools, and if it got integrated I would use it all the time. I'm sure someone would add the IDE feature to give autocomplete for package names on #r "paket: $$$" too, and we'd also see the feature integrated into FSharp.Compiler.Service and all of Ionide etc. If we started today and let the community ( @forki :) ) loose on it I reckon a full prototype would probably be done in.... well.. these guys are just so fast.....
The problem is that I feel like the dotnet version of the story (which definitely needs to happen) is somehow holding us back - the churn last year in project.json means we're waiting for that story to stabilize, in conjunction with stabilizing F# .NET Core scripting and so on. Of course, we eventually need a full #r "dotnet: ..." too and for both to sit alongside each other and be fully companionable. But I wonder if it might not be better to just forge ahead with a #r "paket: ..." feature and use it as a forcing function to work out what the #r "dotnet: ..." story looks like, while equally landing a great feature for F# and the Visual F# Tools.
The problem is that I feel like the dotnet version of the story (which definitely needs to happen) is somehow holding us back - the churn last year in project.json means we're waiting for that story to stabilize, in conjunction with stabilizing F# .NET Core scripting and so on.
@forki will confirm, but the aim (and the next paket release is there or almost there) is to have a "native-like" experience with paket on dotnetcore (integrated with dotnet restore / dotnet build). Also, for now there is no scripting story in dotnetcore: the project.json/*proj files are for compiling / packaging assemblies, but are not used in the context of scripting.
As an aside, one can also imagine #r "npm: ..." being very useful in Fable scripting (i.e. for F# scripts compiled to JavaScript using Fable).
Scripting/interactive and dotnet need to be sorted out. And it is not straightforward … although with dotnet 2.0 it may be considerably less grief-filled than the current story. Partly this is because the Windows F# Interactive experience was always so permissive, with lots of help from FSI and msbuild, that enabling a similar experience under coreclr may be quite tricky.
Especially since under dotnet, dependencies are tracked more clearly.
Anyway … it is on my mind more or less constantly, but it is terrifyingly tough to clear the decks enough to give it some substantial thought.
Kevin
@ctaggart going back to your suggestion, your main concern is relying on the shared packages folder rather than the local packages folder, which is often a waste of space.
Assuming paket generate-load-scripts would rely on the shared packages folder, would that be satisfactory, or do you believe integration in the language (as @dsyme & others are sketching) is the way to go?
I think for that to happen, it would be necessary to have a cross-platform way to resolve that shared packages location; %userprofile%\.nuget\packages seems Windows-specific, and we currently can't bake that into the generated #r directives.
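For what it's worth, a minimal sketch of such a resolution in F# (assuming the NUGET_PACKAGES environment variable override and the ~/.nuget/packages default):
open System
open System.IO

let globalPackagesFolder () =
    // NUGET_PACKAGES overrides the default shared location; otherwise use ~/.nuget/packages
    match Environment.GetEnvironmentVariable "NUGET_PACKAGES" with
    | null | "" ->
        Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.UserProfile), ".nuget", "packages")
    | custom -> custom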
There is also some discussion of using symlinks in Paket for the local packages folder, which would perhaps resolve some of the concern around copying files around. Discussion at fsprojects/paket#2141
going back to your suggestion, your main concern is relying on the shared packages folder, rather than local packages folder which often is a waste of space.
@ctaggart @smoothdeveloper I think we should treat this as orthogonal to the basic pressing need to add paket/dotnet/npm package manager support into the F# scripting model. Each of these package managers can, if they want, allow the specification of the policy w.r.t. package location, cross-script-sharing and collection. The defaults should also be up to each package manager.
There is no one "right" solution regarding sharing packages v. xcopy packages, there are just mechanisms, tradeoffs and defaults.
Seems to me like this begs for an FSI pragma plugin architecture. It could maybe be as simple as a function (a single-member interface) that takes in a string (the parameters to the pragma) and an object that lets you feed commands to FSI as strings.
Forgive me if this is difficult given the current architecture; it's been quite a while since I've played with the compiler service.
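Something like the following hypothetical single-member interface (names invented here, just to make the idea concrete):
type IFsiPragmaHandler =
    /// 'args' is the text following the pragma; 'submit' feeds F# code back into the FSI session.
    abstract Handle : args: string * submit: (string -> unit) -> unit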
I think that editor part is pretty easy. I don't know the Paket API well enough to make it work in memory (right now I depend on a physical paket.dependencies and paket.lock) but with @forki's help I'm sure we can do it.
[Obviously that's just super ugly hack]
Speaking of ugly hacks:
see https://github.com/Microsoft/visualfsharp/pull/2483 for WIP progress
Seems to me like this begs for a FSI pragma plugin architecture
@Rickasaurus maybe we just need to support URIs in #r, and add something to register additional scheme handlers. Extending could maybe be done by invoking fsi.* functions?
paket: nuget: XPlot.Plotly
nuget: XPlot.Plotly/1.0.* (resolved with nuget rules, like PackageReference)
.\prova.dll (without a scheme) can be translated as file://./prova.dll
Doing the dotnet: scheme (please call it nuget:) is not that big an issue; a csproj/fsproj can be used to just resolve <PackageReference> items as a quick hack.
@dsyme I see only two big issues:
With single packages (#r "paket: mypkg") it works OK, but all #r directives are processed one after another. There should also be something to resolve multiple packages together, because transitive dependencies between packages are not explicit (unlike DLLs), in order to resolve compatible versions. That is an issue with the nuget <PackageReference> style too: first all package references are collected, and only afterwards are the real package versions resolved.
Maybe it can be easier, for the medium term, to:
add an fsi variable, so fsi.packageManager is the paket handler
make #r "nupkg: <string>" an alias to fsi.packageManager.Resolve("<string>")
but it is also possible to do:
fsi.packageManager.Resolve
    [ "nuget: NETStandard = 1.6"
      "nuget: Newtonsoft.Json ~ 9.0"
      "nuget: Plotly" ]
OK, netstandard2.0 is going to help by providing a shared BCL API, but it is just an API specification, based on a package.
Maybe I want to use the full .NET Core API, or the full net46 API, in the script; netstandard doesn't help much there. Also some packages will be compatible only with netcore (to use specific functionality, because innovation will happen in the .NET Core runtime), and the same goes for full .NET or Mono.
The csproj/fsproj is slim because it implicitly includes the .NETStandard1.6 package, but the dependency still exists, and it can be disabled in order to include other versions. The same applies to the runtime.
Maybe we should discuss that in another github issue.
Regarding resolution of multiple packages: I already have that in the WIP PR. It will resolve on the first expression that is not #r, so go ahead and use as many packages as you want. It also references transitive dependencies via the generated load scripts, so that should work as well.
the #r "nupkg:
" alias to fsi.packageManager.Resolve(" ")
@enricosada In all this, it's very important to remember that we need to trigger package resolution, access generated package load scripts and access resolved DLLs at design time, i.e. when type checking the script in the editors, not just at runtime in FSI.EXE. This means that this stuff is static, not just dynamic, and that these #r directives are static declarations like the existing #r and #load.
I'll work on a prototype of taking @forki's dynamic FSI.EXE work and playing it instead as part of the design time logic.
(Note, because package resolution can be lengthy and involve disk space etc. we may need to add a confirmation dialog in some editing tools before we download masses of packages. TBD )
I'll work on a prototype of taking @forki's dynamic FSI.EXE work and playing it instead as part of the design time logic.
My ugly hack injects additional -r: path/to/dll/resolved/by/paket.dll arguments into FSharpProjectOptions after calling FCS' GetProjectOptionsFromScript.
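Roughly, the shape of that hack is (a sketch only, against the 2017-era FCS API; the namespace has since moved in newer FCS releases):
open Microsoft.FSharp.Compiler.SourceCodeServices

/// Append "-r:" flags for the DLLs Paket resolved to the options FCS computed for the script.
let addPaketReferences (resolvedDlls: string list) (options: FSharpProjectOptions) =
    { options with
        OtherOptions = Array.append options.OtherOptions [| for dll in resolvedDlls -> "-r:" + dll |] }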
OK, now it can just use the paket files that are already present in a repo. If it can't find those, we fall back to a temp path. Next step: locate paket.exe in such a fallback situation.
Taken from the PR and copied here...
Any reason why we just don't go for a #paket node rather than something explicitly coupled to #r? I suppose I could see #load being used in the same way (for Paket to read from GitHub etc.) but maybe putting it right at the top level could open up possibilities for adding features to F# scripts aside from just reference and load?
While I agree with most of what has been said already I'd like to throw some additional ideas around:
I made it so that it's already detecting local deps files
Back to @KevinRansom's question:
E.g. what if you are running with .net 2.0 and using System.Runtime 4.1.0.0 and the package you reference depends on System.Runtime 4.1.1.0.
Right now F# Interactive just redirects this kind of #r to the loaded version, and a MissingMethodException can happen.
If doing proper package resolution then you should surely just fail or warn, hopefully with a nice message that your script runner should run a later version. That's on the assumption that there's really no technical possibility of running two different versions of System.Runtime in the same process.
Alternatively you could have dotnet fsi (or whatever the final way of doing F# scripting is on .NET Core) do the package resolution and pass the right arguments in prior to actually starting fsi.dll. On .NET Framework you'd need to put a package resolver in front of fsi.exe, though the problem is less severe there since there are fewer DLLs and less version churn.
It's also absolutely critical that this stuff works at design time (so you don't need to run the script to get proper editing of the script against its resolved packages).
@matthid
It's not clear if this needs to be in the core. ...
I'm only interested in a language/tooling-integrated package referencing feature if
#r "paket: nuget FSharp.Data"
#r "paket: nuget XPlot.Plotly"
open XPlot.Plotly
Chart.Line [ 1 .. 10 ]
it works universally, i.e. it works on any default install of F# editing + scripting
it works the same way on both .NET Framework/Mono and .NET Core, at least to some approximation, and perhaps also Fable, though the sets of packages would differ in each case
These sorts of things are the "magic" that would really drive F# scripting a long way.
I'd like to see a set of design principles like this (in addition to the experiments :) )
@isaacabraham
Any reason why we just don't go for a #paket node rather than something explicitly coupled to #r?
We would almost certainly need to work out a "dotnet" solution (for both .NET Framework and .NET Core), and possibly also an "npm" solution (for Fable). It seems best to use one syntax to cover all future package management integrations.
I think we need something along the lines of
#r "dotnet....
#r "nuget...
and this would defer to reference.dotnet.dll or reference.nuget.dll.
Of course the same could work with
#dotnet "...
#nuget "...
I think both would work and that's a language design decision. But the "final" design should try to keep this extensible.
@dsyme I agree with all your points, especially what you responded to @isaacabraham. I considered the FAKE experiment as a way to figure out a possible syntax, but I might have looked at the wrong level.
Ultimately I feel like paket management should not be at the core of the language, but rather on top of it via some extension point. Then we could have multiple projects/builds using the core language and extending it with npm/paket/dotnet, maybe even with their own (but similar) syntax.
Yes that's the goal. I already wrote that above. But in the experimental PR we try to get paket working for the prototype to learn what's needed from technical side. And then we should introduce another layer and remove all coupling to paket.
@ctaggart @smoothdeveloper @forki @KevinRansom @baronfel I updated the issue description to try to capture relevant information in a more structured way. Please take a look. In particular please check the design principles and see if you agree with them.
https://github.com/fsharp/fslang-suggestions/issues/542#issue-209556915
It might be best to move this to a very early-stage RFC (not specifically for any particular solution, but to make sure we capture the technical issues in a coherent and complete way).
@dsyme I'm thinking just on instinct that #paket "package" feels cleaner than #r "paket: package". And perhaps #npm and #nuget and whatever else. Perhaps not only restricted to package references but other stuff as well.
@isaacabraham Yes, I understand, thanks
@isaacabraham one concern I have with top-level directives (without what @Rickasaurus was describing as a "pragma plugin architecture") is that they seem to take a significant footprint at the root "namespace".
Having the #r prefix with a protocol (such as paket or something else) makes it seem less invasive to add a custom handler, also in terms of error handling. Compare handling an unknown #npm directive:
FS0000 invalid directive #npm
with the same situation for #r "npm: package":
FS0000 unknown reference solver "npm", please check http://docs.fsharp.org/directives/r/register_reference_solver
I feel we should be conservative in how we allow directives to be extended and handled by tooling / the compiler, although I understand the appeal of a broadly established concept such as #nuget - but maybe #r is already quite good anyway?
Edits:
#r paket:"nuget Package"
#r nuget:"Package-Version"
etc.?
Hi folks,
Sorry I am late to the party, and it looks like a great one! Package loading has been in scriptcs for a while via packages.config. We've had a long-standing request to allow inline specification via a directive like #p.
Recently this was raised again, and I am keen for us to get something. Now that @forki has pointed me to this effort, I am even more keen. If possible, it would be great to have scriptcs use the same approach / mechanism as is being discussed here.
One question I have is how will semver / versioning constraints be expressed with this model?
One more question, is the design pretty locked yet (in terms of the directive syntax etc) or still in flux?
@glennblock regarding version ranges. You can use everything from https://fsprojects.github.io/Paket/dependencies-file.html
so it could look like this:
#r "paket: NUnit ~> 2.6.3"
#r "paket: DotNetZip >= 1.9"
....
open NUnit
And Paket will make sure that you get a sound resolution and all packages + transitive dependencies are loaded into the script.
is the design pretty locked yet
No, this is the language design issue. We are still in the discussion phase.
FYI there has also been an ongoing discussion about this at https://gitter.im/dotnet-scripting/general
@glennblock Very much just an experiment so far. I think this will act as a forcing function to get some alignment.
Sounds good! scriptcs is down to be another consumer.
Let's do this! https://twitter.com/gblock/status/835596312787615744
How would we want to reference files in paket-files?
@forki not sure we'd want this initially, do we? Also there is the question of referencing other scripts in nuget packages.
I'd go with a restricted subset of paket features, and try to iterate on making it stable and efficient in most cases / for most consumers (maybe we should consider a PR to IfSharp as well).
@forki And I'd like a way to reference a paket.dependencies group as well...
@forki / @matthid by the time we integrate more elaborate syntax / semantics, maybe we should depend on Paket.Core directly and not have that external process / temp folder stuff happening (although for now it lets us bring the simple story with very little code)?
@forki for paket-files, I'd see this with #load @"paket:..." more than #r?
Depending on Paket.Core would mean depending on a fixed version of Paket (unless the implementation included some crazy dynamic assembly loading story). Depending on an external .exe gives us the option to easily update Paket without being bound to the VF# release cycle.
Yeah, a second #load is fine; the problem is guessing the path.
ScriptCS has a convention for naming their scripts in nuget packages, BTW; this is something we should consider learning from.
See https://github.com/scriptcs/scriptcs/wiki/Script-Libraries where they have a structure such as:
ScriptCs.Calculator/Content/scriptcs/CalculatorMain.csx
I assume the file is always the package name followed by Main.csx.
[ edited by @dsyme to be a more comprehensive guide to this design question ]
[ Latest implementation is here: https://github.com/Microsoft/visualfsharp/pull/4042 ]
Package references in F# scripts
There has been a long standing desire to add the ability to reference nuget packages from F# scripts. Originally this was conceived as basic fully qualified nuget references like packages.config. Lately this has evolved into integrating "dotnet" or "paket" or "npm" package specifications and dependency management tools as part of the toolchain, or providing generic hooks to allow this.
Related links
#r nuget in Roslyn
Design principles
You can add a package reference to a script with a single line using a normal text editor that supports F# - no extra files (e.g. a packages.config) are needed
Package references include version constraints, and dependency resolution is performed a la nuget v3 and/or paket
it works "at design time", i.e. I can open a script containing package specifications and quickly get editing and type checking against a resolved set of packages - without needing to run any of the script or any command line tools
it works universally, i.e. it works on any default install of F# editing + scripting, whether Ionide or the Visual F# Tools or web-hosted delivery of F# scripting such as Azure Functions or Azure Notebooks.
It aligns well with how package management will be dealt with in the future of the .NET toolchain
it works "the same way" on both .NET Framework/Mono and .NET Core, at least to some approximation.
It considers the needs of F# when used as a Javascript programming language through toolchains such as Fable or WebSharper. Here NPM is a natural package manager, though there are others.
The design and implementation do not induce a dependency on any one specific package manager within the core F# toolchain (i.e. compiler/scripting/editor/Fsharp.Compiler.Service/ProjectCracker). It might be that different package managers have some specific support to make them work, but we remain open to new package managers.
The implementation does not induce bad "layering problems" in the F# toolchain implementation reminiscent of MSBuild, see below.
It works efficiently - package resolution is amortized, for example.
The default settings in editing and execution tools are sufficiently space-efficient, sharing packages between scripts if needed to achieve this.
Possibiity. "dotnet" references in scripts
The way the new dotnet core tooling loads nuget packages and their assemblies is awesome! I've been using its extensibility to build a DotnetCliTool. All the dependencies are downloaded & loaded from the single
%userprofile%\.nuget\packages
directory. I would like to be able to use this same mechanism from scripts. I would like the same types of reference to be supported, package references and project references.Instead of having to use a nuget client to download the package and then reference the assembly like:
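(illustratively, with a hypothetical package and an assumed cache path)
#r @"C:\Users\me\.nuget\packages\newtonsoft.json\9.0.1\lib\net45\Newtonsoft.Json.dll"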
I want to be able to do:
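(again only a sketch of the kind of syntax being asked for, not a settled design)
#r "nuget: Newtonsoft.Json"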
Possibility. #project "my.fsproj" references in scripts
People have suggested that project references should work the same way as package references, to work in Visual Studio 2017 with the dotnet core tooling, e.g. to be able to do:
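(a sketch only; the directive name and exact form are not settled)
#project "my.fsproj"
// or, in the #r-based style used elsewhere in this issue:
#r "project: my.fsproj"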
Possibility. #r "paket: Foo.Bar.dll" references in scripts
The experiment https://github.com/Microsoft/visualfsharp/pull/2483 contains a prototype of integrating paket package management directly into the F# programming model for .NET Framework programming. This includes design-time support. The experiment violates one of the design principles above - it "bakes in" support for Paket only. However that support could be factored into fsi.exe.
Possibility. #r "packages" references in scripts
People have suggested that a script simply referencing #r "packages" should implicitly pick up the packages from its context, e.g. the containing packages.config or solution or paket.dependencies in the toolchain. In the experiment adding Paket support to FSI this is #r "paket".
Possibility. implicit generation of load scripts
Paket has a feature to accurately generate .fsx and .csx load scripts containing the #r references suitable for use with F# and C# scripting. This feature is a natural and simple way to integrate package resolution - simply have the package manager resolve the packages, generate the scripts and load the scripts.
Possibility. sharing packages and package caches
A major question is where packages are cached. This is primarily a responsibility of the package manager, but becomes a serious issue for scripts because packages can't be duplicated for 100s of scripts, so some shared caching is needed.
Possibility. align with C#
It is possible we should approach this in a similar way that C# repl and scripting want to approach the problem for .NET Core. There are shared concerns here and it is likely we should move forward with the same model for how assemblies and assembly versions are loaded in a scripting session.
Given that C# and F# scripting are also used for Azure Functions, it seems to me that a shared behavior here would be best. However this would mean spec'ing out that behavior all-up and perhaps building out an underlying component that F# and C# could sit atop.
ScriptCS also wants to align with a compatible mechanism: https://github.com/fsharp/fslang-suggestions/issues/542#issuecomment-282497990
Possibility. allow expression of SemVer version constraints
Paket and dotnet both have ways of specifying version constraints. The ability to include these prior to package resolution is important.
https://github.com/fsharp/fslang-suggestions/issues/542#issuecomment-282498554
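For example, using the Paket dependency DSL inside #r (indicative only; the package names here are just placeholders):
#r "paket: nuget Newtonsoft.Json ~> 9.0"      // any 9.x
#r "paket: nuget FSharp.Data >= 2.3 < 3.0"    // an explicit range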
Possibility. autocomplete and search
Auto-completing package names gives a great way to search and discover package functionality.
Basic autocomplete is possible already in package specifications like packages.config and paket.dependencies, e.g. see autocomplete in Ionide.
Additionally, auto-completing on search terms such as #r "package: statistics", giving search over package description text, would be helpful.
Challenges
Some things make this tricky for F#
Challenge: Compiler architectural layering (basics)
The basic "natural" layering of the toolchain is
Here [AssemblyResolution] is a plug-in point or the mechanism used to resolve assemblies.
Note that in this architecture the F# editing tools support the F# scripting programming model via FSharp.Compiler.Service.dll. This means that, as things stand today, the implementation of F# scripting is not "a tool on top of F#" (a la scriptcs) but actually part of FSharp.Compiler.Service.dll, and thus pretty much universally supported in F# editors. This allows us to deliver scripting into a very wide range of contexts simply, efficiently and consistently, e.g. into Ionide, Azure Functions, Azure Notebooks and many online tools.
With integrated package resolution one option is that this becomes:
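(again indicatively)
FSharp.Core.dll
   FSharp.Compiler / FSharp.Compiler.Service.dll   [PackageResolution] [AssemblyResolution]
      fsc.exe, fsi.exe
      editors and other tooling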
where [PackageResolution] indicates a potential plug-in point. An alternative would be to build a layer "outside and on top" of FSI.EXE.
However, this approach doesn't seem to work particularly well with the incremental addition of package specifications in a REPL session.
Challenge: Compiling scripts and architectural layering
Traditionally the tool fsc.exe has supported the ability to compile F# scripts including their references. This has induced a violation of layering in the toolchain: the FSC.EXE tool also included the logic for assembly resolution and quite a lot of logic for processing F# files as scripts. This meant that the FSC.EXE tool and FSharp.Compiler.dll became badly dependent on the MSBuild API simply to resolve assembly references in scripts.
Worse still, this "leaked out" into the logical specification of the compiler itself. The compiler was now able to accept strange assembly specifications such as -r:System.FooBar,Version=3.2.10,.. on the command line and resolve them with MSBuild. In the original world of .NET, MSBuild shipped as part of the .NET Framework. However, when MSBuild became separated this started to cause immense pain, and even more so when .NET Core came about. This problem was poisonous to whole toolchains built on FSharp.Compiler.Service, and we only recently did the hard work to "extract the rotten tooth" and make MSBuild optional.
Challenge: .NET Core toolchain
The .NET Core toolchain changes some things about how F# compilation is surfaced. In particular it adds a layer (dotnet ...) via which all tools are accessed from the command line. This means that with .NET Core the architectural layering gains a dotnet CLI layer above the compiler, FSI and the compiler service.
The question is where [PackageResolution] happens in this toolchain. But first there are other questions that need to be resolved:
Not all variations of F# running on .NET Core go through the dotnet tool. For example, clients of FSharp.Compiler.Service.dll running on .NET Core, such as the Fable compiler, do not - they just embed F# parsing and checking directly via the compiler service DLL.
Fable assumes an installation of "dotnet". However, will it be the case that all clients of F# scripting (editors, engines etc.) can assume an installation of "dotnet"? If I embed F# scripting in an application, do I assume "dotnet" is installed? Do I have to download reference assembly packages from nuget to do simple F# script execution?
It is not clear what the specification of the F# .NET Core scripting engine will be with respect to loading "uplevel" versions of the assemblies that are used to implement the scripting tool itself. This is discussed below.
It seems that these questions need to be resolved before integrated package management can be addressed.