@FIVIL Great question. GPUs are excellent for accelerating DNN training and inference. Currently we do not offer DNN learners, nor any GPU-accelerated components. However, ML.NET is a general-purpose machine learning framework that can support a variety of machine learning algorithms, including neural network toolkits such as the Cognitive Toolkit (CNTK). Adding support for leading DNN packages within ML.NET is on our roadmap, and we will likely support GPU training and inference in the future for the specific learners and transforms that can benefit from it.
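For anyone landing here from search: a minimal sketch of how a DNN trained elsewhere can be consumed from ML.NET via the ONNX transform, with optional GPU scoring. This assumes the Microsoft.ML.OnnxTransformer package plus a GPU-enabled ONNX Runtime package are referenced; the column names, tensor shape, and `model.onnx` path are placeholders for your own model.

```csharp
// Minimal sketch: scoring an exported ONNX model inside an ML.NET pipeline.
// Assumes Microsoft.ML.OnnxTransformer plus a GPU build of ONNX Runtime.
// "input"/"output", the vector size, and model.onnx are placeholders.
using System.Collections.Generic;
using Microsoft.ML;
using Microsoft.ML.Data;

public class ModelInput
{
    [ColumnName("input"), VectorType(3 * 224 * 224)]
    public float[] Input { get; set; }
}

public static class OnnxScoringSketch
{
    public static ITransformer BuildScorer()
    {
        var mlContext = new MLContext();

        var pipeline = mlContext.Transforms.ApplyOnnxModel(
            outputColumnNames: new[] { "output" },
            inputColumnNames: new[] { "input" },
            modelFile: "model.onnx",
            gpuDeviceId: 0,        // score on GPU 0 ...
            fallbackToCpu: true);  // ... but degrade gracefully without one

        // Fit on an empty data view just to materialize the transformer;
        // score incoming data afterwards with transformer.Transform(...).
        var empty = mlContext.Data.LoadFromEnumerable(new List<ModelInput>());
        return pipeline.Fit(empty);
    }
}
```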
In the README you say this library is used by Bing & Co. - is this really true given that you don't support GPUs?
R has GPU support for a few things besides neural nets: http://www.r-tutor.com/gpu-computing
I agree, this should be a priority. I have a GPU crypto-mining farm, and it would be nice to use it for machine learning computation from time to time as well.
And I think serious data and application workloads would really want it; this should be the first priority.
Sorry to say, but
No GPU support = no real use in production
Any progress on adding GPU support? I agree with the sentiment that without GPU support, this is not a viable solution for serious ML applications. I wish there were a better .NET-based framework for using CNTK, which does support GPUs. It seems natural that ML.NET would be exactly that, but since there is currently no such support, it's a big problem.
Agreed. The recent resurgence of popularity in AI/ML/deep learning is because today's GPUs are far more accessible than supercomputing centers. If you build it, they will come.
I'm working on a high-level wrapper to build and execute kernels written in C# on specialized hardware: https://www.tech-quantum.com/gpu-accelerated-programming-with-amplifier/ https://github.com/tech-quantum/Amplifier.NET
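To give a feel for the approach, here is a hypothetical sketch of a kernel written as an ordinary C# method and dispatched through such a wrapper. The type and attribute names (`OpenCLFunctions`, `[OpenCLKernel]`, `[Global]`, `OpenCLCompiler`) are loosely modeled on the project's README and may not match the current API exactly; see the linked repo for the real one.

```csharp
// Hypothetical sketch of a C#-authored GPU kernel run through a wrapper
// like Amplifier.NET. Names are illustrative, loosely modeled on the
// project's README -- check the linked repo for the actual API.
class VectorKernels : OpenCLFunctions
{
    [OpenCLKernel]
    void vec_add([Global] float[] a, [Global] float[] b, [Global] float[] result)
    {
        int i = get_global_id(0);   // one work item per element
        result[i] = a[i] + b[i];
    }
}

// Compile the C# method to an OpenCL kernel and execute it on device 0.
var compiler = new OpenCLCompiler();
compiler.UseDevice(0);
compiler.CompileKernel(typeof(VectorKernels));

var a = new float[] { 1, 2, 3 };
var b = new float[] { 4, 5, 6 };
var result = new float[3];

dynamic exec = compiler.GetExec();
exec.vec_add(a, b, result);        // result == { 5, 7, 9 }
```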
To the people on this thread asking for GPU support: can you please specify the scenario where this would be beneficial for you? Please note that DNN training support is on our roadmap for the coming months, and that WILL have GPU support, but I want to be sure I'm not missing some other scenario that might benefit from GPUs.
Realtime footage analysis at 4K resolution and 60 frames per second.
Are you saying we should train models that need a GPU in another framework and then consume them in ML.NET?
@Mercurial No, DNN support in ML.NET will include training on the GPU, and you'll be able to consume models on the GPU as well, so you won't need to train in another framework.
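For readers arriving later: a minimal sketch of what GPU-backed DNN training eventually looked like, using the ImageClassification trainer that shipped in Microsoft.ML.Vision. Training runs on the GPU when a GPU TensorFlow redist (e.g. SciSharp.TensorFlow.Redist-Windows-GPU) is referenced; the image paths, labels, and folder below are placeholders.

```csharp
// Sketch: in-process DNN training in ML.NET via Microsoft.ML.Vision.
// Uses the GPU when a GPU TensorFlow redist package is referenced.
// Image paths, labels, and the "images" folder are placeholders.
using Microsoft.ML;

var mlContext = new MLContext();

var rows = new[]
{
    new ImageRow { ImagePath = "cat1.jpg", Label = "cat" },
    new ImageRow { ImagePath = "dog1.jpg", Label = "dog" },
};
IDataView data = mlContext.Data.LoadFromEnumerable(rows);

var pipeline = mlContext.Transforms.Conversion.MapValueToKey("LabelKey", "Label")
    .Append(mlContext.Transforms.LoadRawImageBytes("Image", "images", "ImagePath"))
    .Append(mlContext.MulticlassClassification.Trainers.ImageClassification(
        labelColumnName: "LabelKey",
        featureColumnName: "Image"))
    .Append(mlContext.Transforms.Conversion.MapKeyToValue("PredictedLabel"));

var model = pipeline.Fit(data);   // trains on the GPU if one is available

public class ImageRow
{
    public string ImagePath { get; set; }
    public string Label { get; set; }
}
```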
Ah, I see, thanks @ebarsoumMS. I guess I just need to understand this part better!
@Mercurial Where did I suggest you should train models in another framework and then consume them in ML.NET? I think I was clear that we plan to add DNN support in the coming months, and it WILL have GPU support.
Closing this, as the original question has been answered.
Thanks for the clarification
Is anyone on this thread looking for GPU acceleration for prediction in a non-DNN-based model? If so, I'd be interested in discussing your use case further.
I've migrated to NVIDIA Triton Inference Server and gRPC for inference. It decouples inferencing from my .NET code and is far better for model management, versioning, and load-balancing across multiple GPUs. It's also much faster than TensorFlow Serving, should anyone wonder.
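For anyone curious what that looks like from .NET, here is a minimal sketch of a Triton gRPC inference call, assuming C# stubs generated from Triton's grpc_service.proto (KServe v2 protocol) and Grpc.Net.Client. The generated type names depend on your codegen settings, and the server address, model name, input name, and shape are placeholders.

```csharp
// Sketch: calling Triton over gRPC from .NET, assuming stubs generated
// from Triton's grpc_service.proto. Server address, model name, input
// name, and tensor shape below are placeholders.
using Grpc.Net.Client;
using Inference;   // namespace emitted for the grpc_service.proto package

var channel = GrpcChannel.ForAddress("http://triton-host:8001");
var client = new GRPCInferenceService.GRPCInferenceServiceClient(channel);

var input = new ModelInferRequest.Types.InferInputTensor
{
    Name = "input",
    Datatype = "FP32",
    Contents = new InferTensorContents()
};
input.Shape.AddRange(new long[] { 1, 3, 224, 224 });
input.Contents.Fp32Contents.AddRange(new float[3 * 224 * 224]); // dummy frame

var request = new ModelInferRequest { ModelName = "my_model" };
request.Inputs.Add(input);

// Triton schedules the request across whichever GPUs host "my_model";
// the response carries the output tensors.
var response = await client.ModelInferAsync(request);
```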
Hi, is there (or will there be) any support for training and running models on the GPU using ML.NET?