nmfisher opened this issue 7 years ago
Hi nmfisher,
Thanks for your interest in SharpLearning, I am glad you find it useful.
Regarding using Math.NET matrices as a replacement for the SharpLearning matrices, it is a bit of a long story with several considerations.
Initially I created my own matrix class to avoid dependencies on other libraries. The main consideration was that, without dependencies, it is easier to change direction, for instance to support .NET Core and .NET Standard for multi-platform support.
This changed when I added the SharpLearning.Neural project, which uses Math.NET for its matrix operations in order to utilize MKL/OpenBLAS. Overall, I think Math.NET is a great library, and switching from SharpLearning's matrix implementation to Math.NET matrices is something I have considered more than once.
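As a concrete illustration of that benefit, here is a minimal sketch (not taken from the SharpLearning codebase) of how Math.NET Numerics can be pointed at a native BLAS provider such as MKL, falling back to the managed provider when the native binaries are not available:

```csharp
// Illustrative only; assumes the MathNet.Numerics (and optionally the
// MathNet.Numerics.MKL.*) NuGet packages are installed.
using System;
using MathNet.Numerics;
using MathNet.Numerics.LinearAlgebra;

class ProviderSketch
{
    static void Main()
    {
        // Try the native MKL provider; fall back to managed code
        // if the native binaries are missing.
        try { Control.UseNativeMKL(); }
        catch { Control.UseManaged(); }

        var a = Matrix<double>.Build.Random(256, 256);
        var b = Matrix<double>.Build.Random(256, 256);

        // The multiply runs through whichever provider was configured.
        var c = a.Multiply(b);
        Console.WriteLine(c.RowCount);
    }
}
```

The point is that callers keep the same `Matrix<T>` API either way; only the provider behind the operations changes.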
However, I have been a bit reluctant to take the step for a few reasons:
Personally, I would like to wait with changing the matrix implementation until a decision has been made on introducing a tensor class (yes or no), since the two changes will overlap a lot.
However, if you go ahead and integrate SharpLearning and Math.Net in a fork, there are a few things to be aware of:
I am keeping the issue open, and if you do proceed with integrating the two, I will be interested in following your progress.
Thanks for the detailed response. Definitely appreciate how much effort you've gone to.
I understand your desire to limit dependencies, particularly while the discussion about .NET Standard support in Math.NET Numerics is still ongoing.
Your point about CNTK is interesting. Personally, I am using Math.NET to implement various neural word embedding models (e.g. word2vec) in C#. I know MS is planning to expand CNTK support to .NET/C#, but I don't have the luxury of waiting to see if I can leverage this. This means I'll have to implement these models manually (though knowing my luck, CNTK .NET support will be released the exact same day I finish my own implementation).
In other words, if I could standardize on CNTK, I would. Unfortunately it seems I don't have that option yet.
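For what it's worth, the core update in that kind of implementation maps quite naturally onto Math.NET vectors. This is a hedged, illustrative sketch of a single positive-pair skip-gram (word2vec) update; the dimensions, learning rate, and names are all made up, and negative sampling is omitted:

```csharp
// Illustrative skip-gram update using Math.NET vectors; not a full
// word2vec implementation.
using System;
using MathNet.Numerics.LinearAlgebra;

class SkipGramSketch
{
    static void Main()
    {
        const int dim = 50;        // embedding dimensionality (arbitrary)
        const double lr = 0.025;   // learning rate (arbitrary)
        var build = Vector<double>.Build;

        var center = build.Random(dim);   // embedding of the center word
        var context = build.Random(dim);  // output vector of a context word

        // Sigmoid of the dot product scores the (center, context) pair;
        // for a positive pair the gradient scale is (1 - sigma).
        double score = center.DotProduct(context);
        double sigma = 1.0 / (1.0 + Math.Exp(-score));
        double g = lr * (1.0 - sigma);

        var centerNew = center + g * context;
        var contextNew = context + g * center;
        Console.WriteLine(centerNew.Count);
    }
}
```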
For the time being, I will hold off on forking to support Math.NET. Best to wait to see what's in store for .NET/C# CNTK support first.
A Tensor interface does sound reasonable, though, so if/when you decide to proceed with this, I would also be happy to help test.
I believe Microsoft will release the initial/preliminary training API for C# during September/October. At least, it is included in their August – September iteration plan: https://github.com/Microsoft/CNTK/issues/2194
So hopefully, we will have something in the not so distant future :-)
Regarding introducing a Tensor interface, you are more than welcome to contribute and help test. If I proceed with it, I need to figure out a design that will integrate efficiently with both the existing matrix implementations and with CNTK. When this is done, I will try to add issues describing what needs to be done; this should make it easier to pick up tasks and help improve the library.
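To help frame that design discussion, here is a purely hypothetical sketch of a minimal tensor abstraction; none of these names exist in SharpLearning, they are just one possible shape for the interface:

```csharp
// Hypothetical sketch only: a shape descriptor over flat row-major
// storage, so existing matrix code and CNTK-style N-dimensional data
// could share one abstraction.
public interface ITensor<T>
{
    int[] Shape { get; }   // e.g. { batch, channels, height, width }
    T[] Data { get; }      // flat row-major backing array

    // Multi-dimensional indexer computed from Shape.
    T this[params int[] indices] { get; set; }
}
```

A 2-D implementation could wrap the existing matrix classes, while higher-rank implementations could back the neural-network layers.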
In general, I have a lot of ideas for new features and improvements to SharpLearning. Currently, these are stored in a private backlog, but they might as well be added as issues for the project, making it easier for others to contribute.
You are also more than welcome to add issues and contribute if you find something lacking or have ideas for new features.
It seems like Microsoft will eventually add support for a Tensor type. So if a tensor class is to be introduced to SharpLearning, it would probably make sense to use the Microsoft implementation.
This is also relevant for: #20
This is a really great library. Was there a specific reason why you chose to roll your own Matrix class, rather than leveraging Math.NET?
Ideally I'd like to marry the two, not only for consistency with modules I've already written, but even for smaller things like using Math.NET's `Matrix<T>` rather than SharpLearning's own matrix class. Before I jump in and start changing anything, though, I thought I'd check with the author to see if there was a specific reason behind it.
If I do proceed with integrating the two, I'd be more than happy to submit a PR back as well; just let me know.