Closed (dcuccia closed this issue 1 year ago)
Hi David, I just pulled a clone of your fork onto Ubuntu. The key thing I would like to test is that MCCL runs on Linux. You might have already done this(?), but nothing beats going through it yourself. I tried to run GetMonoLibs.sh, and the solution is looking for libraries in the "packages" folder (e.g. Newtonsoft.Json). How did you build MCCL on Linux? Carole
Don't need shell scripts or Mono anymore (.Net Core is cross-platform). Should just be:
1) Install the .NET Core 3.0 SDK on your machine:
https://dotnet.microsoft.com/download/linux-package-manager/ubuntu18-04/sdk-3.0.100
2) cd into the project directory
3) type "dotnet build" to test that it builds
4) type "dotnet run" to run it
Aside - I did all of this on my Windows machine using WSL, so additionally, I first got that going:
https://docs.microsoft.com/en-us/windows/wsl/install-win10
Here's the distro (18.04) that I used:
https://www.microsoft.com/store/productId/9N9TNGVNDL3Q
Lmk if that works! David
(there's also "dotnet publish" with tons of packaging options which we can explore...local copy of the runtime, and even single file deployment)
Followed the install instructions on your first link. Then needed to run "dotnet build Vts.sln" to build. Built with no errors. Went to Vts.MonteCarlo.CommandLineApplication/bin/Debug/netcoreapp3.0 and there is a "mc" executable there (not mc.exe like previously). Ran "./mc geninfiles", good. Ran "./mc infile=infile_one_layer_all_detectors.txt", good. Plotted results using load_results_script.m, good.
So initial test looks good. I'm curious about how moving to 3.0 would affect our other dependencies, e.g. Matlab interop (did you run BuildTestRelease successfully?), our WPF GUI? I'd like to hear Lisa's input on this too.
So now I'm on my Windows laptop with a clone of your fork. I did a "Restore NuGet Packages" but my rebuild is failing. So I tried the following: in VS 2017, go to Tools → Options → Projects and Solutions → .NET Core and check "Use previews of the .NET Core SDK" - still failing. So then I went to dotnet.microsoft.com/download/dotnet-core/3.0, downloaded SDK 3.0, extracted it, and... oh wait, I need VS 2019 to be compatible with this download. May have to install that later. I still have 2013 in case anyone has a SL bug and have currently been using 2017.
Sorry - should have said that, yes you need the latest 2019. Or can build from command line, or VS Code.
WPF GUI will always be Windows-only (but runs on Core). Probably Matlab interop, too, though I haven't looked lately at their product and any plans to support Core.
Okay, back to Ubuntu v18.04 and VS Code v1.39.2. I like being on Linux better anyway. Got a terminal up and was able to:
cd src
dotnet build Vts.sln
cd Vts.Test
dotnet test Vts.Test.csproj
Total tests: 516, Passed: 516, Total time: 4.1350 minutes
This is great; however, everything on my desktop froze while the tests ran. I did the same for Vts.MonteCarlo.CommandLineApplication.csproj and Vts.MonteCarlo.PostProcessor.csproj, but no output came to the screen like it did for Vts.Test. Just the command-line prompt returned.
I'll keep playing with it.
Not sure about the frozen desktop - assume your Linux machine is multi-core?
Re: console out, is there a difference in NLog configuration between the two?
Aside - there's logging and IoC containers built in to .Net Core, so with some work, we could remove NLog and (maybe) Unity.
(and testing...could remove NUnit as well someday)
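To make that concrete, here's a rough sketch of the built-in stack, assuming the Microsoft.Extensions.DependencyInjection and Microsoft.Extensions.Logging.Console packages are referenced; it's just an illustration of what could stand in for NLog/Unity, not anything wired into VTS yet:

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;

class Program
{
    static void Main()
    {
        // Build a service container with console logging - roughly the roles
        // that NLog (logging) and Unity (IoC) play for us today.
        using var services = new ServiceCollection()
            .AddLogging(builder => builder.AddConsole())
            .BuildServiceProvider();

        // Resolve a typed logger the same way other services would be resolved.
        var logger = services.GetRequiredService<ILogger<Program>>();
        logger.LogInformation("Monte Carlo simulation starting...");
    }
}
```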
David,
Our VTS library is already .NET Standard 2.0 and that version is also on NuGet. We have a .NET Core test application for the Vts project (version 2.0) and when we created the .NET Standard library, I created a .NET Core version of the MCCL also (https://github.com/lmalenfant/Vts.MonteCarlo.NetCore). In order to still support a .NET Framework version, we didn't update all the projects to .NET Core.
If you think it would be beneficial, we could add another MCCL project as a .NET Core 3.0 version and create a solution file just for those projects. We have a solution file that works in Visual Studio versions prior to 2017, and we would need to do the same for users who do not have VS 2019.
Lisa
Hey Lisa,
Yeah, definitely - .NET Standard for Vts.dll means we support all of the runtimes, and developers can use it wherever/however they want. On the tools/apps side of things, we've had a huge benefit at Modulim from consolidating on the new tooling/infrastructure on Core. Unifying on Core for the runtimes/apps significantly reduces the surface area and friction for a small team delivering new releases. .NET Framework will no longer be actively developed, so access to any new capabilities is through Core anyway. It's particularly friendly for newbies in terms of using the CLI and lightweight tools to get started. Asking an existing Vts developer (a small fraction compared to users) to update to the free VS 2019 Community or cross-platform VS Code to work with Vts Tools 5.0+ seems like a very small ask. It makes us agile and drastically simplifies the codebase. Am I missing something?
David
Hi David,
I found a few problems with the MCCL unit tests. When I run "dotnet test" at the command line (not in VS Code) I get: Test Run Failed. Total tests: 13, Passed: 9, Failed: 4.
I see which tests are failing. It has to do with a fix I made for a student when he was specifying command-line parameter sweep values that were small and needed more significant digits. The folder being created is named myResults_mua1_0.019999999999999997 instead of myResults_mua1_0.02, so when the unit test checks whether it exists, it doesn't find it.
I will fix this. Carole
I will work on fixing this.
Interesting, don't think I was seeing that on my end. Is this something that you've previously fixed on the main Vts project and didn't get pulled in correctly in my merge? Thanks for offering to fix. Will you send a PR to me, or fix on the main project? Either's fine.
This was documented in Issue #20 and fixed June 18, 2019. The change was in SimulationInputExtensions line 28, from String.Format("{0:f}", value) to String.Format("{0:g}", value).
What is a PR?
"Pull request" standard way for a repo author to submit changes to another repo. So, if you made that change to my forked repo (I need to give you write access), I could submit a PR to the main repo with all my changes thus far. Alternatively, if it's a bug you can reproduce on the main VTS project, much better to fix it there, and I can just pull those latest changes.
Here is an article describing the changes to formatting in net core 3.0 and why the unit tests are failing: https://devblogs.microsoft.com/dotnet/floating-point-parsing-and-formatting-improvements-in-net-core-3-0/
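As a small sketch of the behavior change described there (the value below is illustrative; it's the double one step below 0.02, like the one in the failing folder name):

```csharp
using System;

class FormatDemo
{
    static void Main()
    {
        // A sweep value that is mathematically "0.02" but lands on a neighboring double.
        double mua1 = 0.019999999999999997;

        // .NET Framework: "G" printed at most 15 significant digits      -> "0.02"
        // .NET Core 3.0:  "G" prints the shortest round-trippable string -> "0.019999999999999997"
        Console.WriteLine(string.Format("{0:g}", mua1));

        // "G15" restores the pre-3.0 behavior on both runtimes -> "0.02"
        Console.WriteLine(string.Format("{0:G15}", mua1));
    }
}
```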
I'm using your clone to test on Linux because it has the full solution and I can edit SimulationInputExtensions.cs in the Vts.dll and rerun Vts.MonteCarlo.CommandLineApplication.Test.
The article above suggests changing G to G15 to obtain equivalent behavior. I did and I still get 3 failed tests, but no straggler folders in bin/Debug/netcoreapp3.0. Oh.... I know what it might be, did you make the files in Vts.MonteCarlo.CommandLineApplication.Test/Resource embedded resources? That would explain the 3 failed tests.
I've made the change from G to G15 in our master Vts; all unit tests pass. I plan to push this fix to the main repo.
Oh nice catch on both fronts! Yes, please modify if embedded resource is not correct. I had to overhaul the project files, so I may have made a translation error there.
I edited your Vts.MonteCarlo.CommandLineApplication.Test.csproj and added the following lines:
<EmbeddedResource Include="Resources\infile_unit_test_one_layer_ROfRho_Musp_and_Mus_inconsistent.txt" />
<EmbeddedResource Include="Resources\infile_unit_test_one_layer_ROfRho_Musp_only.txt" />
Now I get: Test Run Successful. Total tests: 13, Passed: 13, Total time: 1.6070 minutes. Yay!
I don't plan to do a PR. I'll let you make this change on your side.
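For reference, the reason embedding matters: presumably the tests load these infiles from the assembly manifest rather than from loose files on disk, so they're only found if marked as EmbeddedResource. A rough sketch of that pattern follows, with an illustrative helper and resource name rather than the actual test code:

```csharp
using System.IO;
using System.Reflection;

// Illustrative helper (not the actual VTS test code): read a file that was
// compiled into the test assembly via <EmbeddedResource Include="..." />.
public static class ResourceHelper
{
    public static string ReadEmbeddedText(string resourceName)
    {
        var assembly = Assembly.GetExecutingAssembly();
        // Manifest resource names are typically "<RootNamespace>.<Folder>.<FileName>".
        using var stream = assembly.GetManifestResourceStream(resourceName)
            ?? throw new FileNotFoundException($"Embedded resource not found: {resourceName}");
        using var reader = new StreamReader(stream);
        return reader.ReadToEnd();
    }
}

// Hypothetical usage:
// var text = ResourceHelper.ReadEmbeddedText(
//     "Vts.MonteCarlo.CommandLineApplication.Test.Resources.infile_unit_test_one_layer_ROfRho_Musp_only.txt");
```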
Not sure why the lines didn't show. Here they are:
Weird, okay how about this:
EmbeddedResource Include="Resources\infile_unit_test_one_layer_ROfRho_Mus_only.txt"
EmbeddedResource Include="Resources\infile_unit_test_one_layer_ROfRho_Musp_and_Mus_inconsistent.txt"
EmbeddedResource Include="Resources\infile_unit_test_one_layer_ROfRho_Musp_only.txt"
Each line starts with "<" and ends with "/>".
Thanks. Probably have to surround with a code block.
Carole, want to try forking my repo, applying that commit change, and then submitting a PR to my repo? Could be good practice, separate from the main project.
I did the following, all on Ubuntu: I forked a copy of your fork, made my changes, ran all unit tests (they all pass), committed, and pushed to my copy. Then I clicked "pull request" on GitHub, compared your fork with my fork, and made the pull request. Let me know what happens on your side.
Worked great! Already merged in.
Thanks. Testing the link integration: https://github.com/dcuccia/VTS/pull/1
I think I should remove my fork from my personal GitHub site now so I don't get things confused. What do you think?
Totally up to you. Some people keep many "remotes" available in their merge tools, so they can pull changesets from multiple places/branches. I have VirtualPhotonics/VTS and dcuccia/VTS both in my Visual Studio setup, for example.
If we're close to agreeing that this can be merged into the main repo via a PR from me, then there's no reason to keep it around if it's just going to be confusing. If we need to do a lot more work to get this ready, then maybe keep it for now.
I think the plan to move to this in our main repo needs some discussion. It worked okay on Linux, but most of our users are on Windows and we need to make sure they can still access our software if they have older tools (e.g. can't install VS 2019). Let's continue the discussion in November after our grant goes out.
My hope was that this thread could be how we discuss...
Re: "we need to make sure they can still access our software if they have older tools (e.g. can't install VS 2019)"
I don't think we need to support old tooling in this case, per my argument above. Anyone can use the compiled tools as-is (they just need the runtime), and there are multiple free ways for Vts developers to author new code on Windows or Mac (VS Code, VS 2019 Community Edition, or command-line tools plus any text editor). And VS 2019 runs side-by-side with older versions, so there's no reason this would disrupt other developer work.
That's assuming they are only using our library and are not integrating it with other software. I think we would like to do the due diligence to make sure we are not cutting off any resources.
I think adding in .NET Core 3.0 support is a great idea but I would definitely like to discuss the logistics. Having solutions that work in multiple versions of Visual Studio until we can be sure that the older versions are no longer needed would be good. Although the tools are free, if they are being used for commercial software, I believe you have to purchase a 2019 license.
I think the first step would be to create a solution that has the .NET Core versions of the software and the library but keep the other versions alongside for now. I would also like to check whether all the projects in the solution are still worth keeping there, or whether some of them would be better separated out.
We can keep discussing once the grant has gone out.
I went ahead and did some extra work on the .csproj structures, so that my fork now compiles...
On Windows:
Otherwise (Mac and Linux):
No need for a separate solution file, it works with one solution and build conditions at the project level.
Added build.bat/test.bat and build.sh/test.sh files, which all use one-liners to build/test the entire suite (e.g. "dotnet build src/Vts.sln"). All three paths verified building. There seems to be a type provider issue with running .Net 4.8 tests with the command-line .Net Core test runner on Windows (4.8 tests work in Visual Studio, though). All that's left to update are the Matlab package tests, which are Windows-only and should depend on the output of a Windows build of .Net 4.8 (compiled with either version of Visual Studio). The rest of the build scripts and Mono toolchain should now be able to be eliminated, and with publish profiles, the deployment and nuget packaging are simple tools to add and call directly within the Vts.dll project file (pack=true), as opposed to needing a separate script.
I also changed the way the InputProvider(s) resolve and test implementations, so that with one line we're able to a) provide, and b) test all implementations of an interface. I limited the scope of detector testing to those which implement DAW, but we could easily complement that with those that implement CAW (which weren't previously being tested). This actually exposed and helped catch a bunch of construction bugs, where default property implementations were missing and would throw NullReferenceException(s). I fixed all of these.
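A sketch of the general pattern in NUnit (the interface and names here are illustrative, not the exact committed code): reflection gathers every concrete implementation, and a single parameterized test exercises each one.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using NUnit.Framework;
using Vts.MonteCarlo;

[TestFixture]
public class DetectorInputCoverageTests
{
    // Enumerate every concrete IDetectorInput implementation in the Vts assembly
    // (illustrative - the committed tests drive this from the InputProvider(s) instead).
    public static IEnumerable<Type> AllDetectorInputTypes =>
        typeof(IDetectorInput).Assembly
            .GetTypes()
            .Where(t => typeof(IDetectorInput).IsAssignableFrom(t) && !t.IsAbstract && !t.IsInterface);

    [TestCaseSource(nameof(AllDetectorInputTypes))]
    public void default_construction_does_not_throw(Type detectorInputType)
    {
        // Catches the missing-default-property bugs mentioned above:
        // constructing with defaults should not throw.
        Assert.DoesNotThrow(() => Activator.CreateInstance(detectorInputType));
    }
}
```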
Finally, I worked a little bit on "discoverability", making the out-of-box experience more palatable for those just getting started developing with Vts.dll. A simple reference to Vts.MonteCarlo, along with using the InputProvider(s), lets someone program against the API without having to know the ins and outs of the various namespaces and capabilities. See MonteCarloSimulationTests.validate_fluent_constructed_SimulationInput_runs_simulation_without_crashing for example use. Lots more we can do here, but I think it's a good start.
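Roughly the shape of that getting-started code (illustrative, not verbatim from the committed test):

```csharp
using NUnit.Framework;
using Vts.MonteCarlo;

public class GettingStartedSketch
{
    [Test]
    public void run_default_simulation_without_crashing()
    {
        // A default SimulationInput with a small photon count, run end-to-end.
        // The fluent-constructed variant referenced above wraps this same kind
        // of construction behind more discoverable helpers.
        var input = new SimulationInput { N = 100 };
        var output = new MonteCarloSimulation(input).Run();

        Assert.IsNotNull(output);
    }
}
```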
Please let me know what I can do to get this in the trunk, so I can start developing examples against it using nuget. Lots of cool tools available now for API demonstration and teaching, such as try-dotnet and Jupyter notebooks, which will benefit this project immensely. See for example:
https://www.hanselman.com/blog/AnnouncingNETJupyterNotebooks.aspx
Here is an example that works as-is with the current Vts.dll:
https://gist.github.com/dcuccia/35478199835b019d0c764d48e55d02da
Thanks, David
Hi David,
That is a lot of changes, when I get a chance I will pull your fork and take a look.
I did just want to respond and make sure that you know about our current NuGet package: https://www.nuget.org/packages/VirtualPhotonics.Vts/
We have a .NET Standard 2.0 version, isn't this what you are using for your .NET Core 3.0 development?
Once we commit to going to .NET Core versions of our software, Carole and I would need to update all documentation across all platforms, so it will be more of an effort than just changing the source code. Also, was there a reason you decided to move the .NET Framework version to 4.8 rather than just leaving them at 4.6.1 and moving forward with the .NET Standard version?
We have a meeting this week with the VP team so we can discuss next steps, I want to make sure our current users are not adversely affected by too many changes all at once.
Thanks, Lisa
Haven't touched Vts.dll - still .Net Standard.
I'm trying to modernize the platform to take advantage of the large new ecosystem of cross-platform tools. These let us simultaneously make it simpler for new developers to get started on all platforms (literally "apt-get" the core tools and "dotnet build" on Linux; compare that to our current situation) and simplify maintenance of our existing assets with a shared toolchain throughout. Presumably, this also means lower documentation effort going forward.
4.8 gives us the best alignment with Core and access to new C# 8 language features. Advancing the client target framework by a minor version shouldn't matter.
When/where are you meeting?
D
Are you able to use the .NET Standard NuGet package?
Yes, but I don't understand. I'm talking here about future architecture, platforms, features, and testing, not about consuming the library.
Hi David,
In your message you stated that the urgency to get the new code into master was so you could start developing examples against it using NuGet. We already have this capability today.
> Please let me know what I can do to get this in the trunk, so I can start developing examples against it using nuget.
We have a plan for the library, dependent software, and code going forward, but we do not want to fast-forward the technology all at once without doing the due diligence and having the time to update the relevant documentation and software packages. We are very excited about the .NET Core 3.0 changes that will allow us to run code across all platforms, but we have a plan and do not want to make rushed decisions due to new technologies being released.
With the NuGet package available in .NET Standard you should be able to use these new technologies in your own code.
We would love to have you contribute to the library but in a more granular way so we can review the pull requests and integrate them into the main code. For example the bug fixes that you made and the corrected unit tests would be a great addition.
Thank you, Lisa
@lmalenfant I think this is worth closing now. :)
FWIW, re-reading back through this discussion, I think that, when discussing changes/enhancements, it will be critical for us all to differentiate between tools, client applications, and core libraries when communicating benefits and drawbacks. E.g. client apps on multiple platforms can "consume" a .NET Standard 2.0 project. So, (as stewards of Vts.dll that other users consume) the platform choice of that core library should necessarily be more conservative/broad than the .NET platform choice for demo applications/tools that "we" ship as a team (to utilize the latest and greatest features of the platform for GUIs/APIs/hosting/etc). In my experience, the latest released .NET features in major revisions are very stable and reliable. The "preview" features that might change/shift have to be turned on explicitly, and we could decide not to support that, for example.
Hi friends!
On my fork, main branch, I've updated projects and references to be fully .Net Core 3.0/.Net Standard 2.0 compatible (it's up to date with the main project), and I've verified "dotnet run" for the mc.exe command line app now works out of the box on a vanilla Ubuntu distro (on WSL). This means (theoretically) we shouldn't need Mono, or any Linux-specific work-arounds. Also, between VS Code and VS for Mac, I don't think we should need to support MonoDevelop anymore.
I almost submitted a PR, but held back because there are multiple deployment assets that need to be updated or deleted. Very willing to do more work on this, but wanted to run it by you guys first. I've set the nuget package version to 5.0.0-alpha01 (I don't have permissions to publish), and I've verified that a Linux command-line project referencing that alpha also works.
Let me know if you'd like to chat about it, David