Closed — yblainm closed this issue 1 year ago.
Hi @yblainm, thanks for your question and welcome to the ILGPU community. Unfortunately, these vector types are meant to be used inside GPU kernels for improved throughput in terms of compute and memory bandwidth. And yes, the main purpose of ILGPU is compiling kernels written in .NET into GPU code that can run natively on most GPU devices.
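To make the kernel-centric model concrete, here is a minimal sketch of an element-wise add kernel written in C# and compiled by ILGPU at runtime. It assumes the ILGPU 1.x API (`Context.CreateDefault`, `LoadAutoGroupedStreamKernel`, `Allocate1D`); treat it as an illustration of the programming model rather than a definitive snippet from the docs.

```csharp
using ILGPU;
using ILGPU.Runtime;

static class VectorAddSample
{
    // The kernel body: plain .NET code that ILGPU JIT-compiles for the target device.
    static void AddKernel(Index1D i, ArrayView<float> a, ArrayView<float> b, ArrayView<float> result)
    {
        result[i] = a[i] + b[i];
    }

    static void Main()
    {
        using var context = Context.CreateDefault();
        using var accelerator = context
            .GetPreferredDevice(preferCPU: false)
            .CreateAccelerator(context);

        var kernel = accelerator.LoadAutoGroupedStreamKernel<
            Index1D, ArrayView<float>, ArrayView<float>, ArrayView<float>>(AddKernel);

        using var bufA = accelerator.Allocate1D(new float[] { 1, 2, 3 });
        using var bufB = accelerator.Allocate1D(new float[] { 4, 5, 6 });
        using var bufC = accelerator.Allocate1D<float>(3);

        kernel((int)bufC.Length, bufA.View, bufB.View, bufC.View);
        accelerator.Synchronize();

        var host = bufC.GetAsArray1D(); // expected: 5, 7, 9
    }
}
```

Note that, unlike numpy-style array expressions, the element-wise operation is spelled out explicitly inside the kernel; high-level operator overloading on whole arrays is exactly what libraries built on top of ILGPU (such as BAVCL, mentioned below) aim to provide.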
We currently do not support any of these high-level operations natively within ILGPU or ILGPU.Algorithms. However, @MPSQUARK maintains a high-level library built on ILGPU that can turn vector and matrix operations into ILGPU-compatible kernels for acceleration.
Thank you! This answers my question. It's good to know about the work on https://github.com/MPSQUARK/BAVCL which I'll be keeping an eye on.
Hi @yblainm, the BAVCL library is a project I am working on that aims to provide high-level functionality on top of ILGPU. What you describe, "similar to Python's numpy" but for C# with GPU acceleration, was the motivation behind its development. The library is a work in progress, but it already has a lot of functionality; if there's anything you feel is lacking or missing, feel free to raise an issue in BAVCL. I am an active member of the ILGPU Discord server and do my best to attend all the talk-to-dev meetings, so if you have any questions you can find me there. I also do my best to keep up to date with the latest and greatest in ILGPU. @m4rs-mt Thank you very much for the mention.
Hi!
I see that https://github.com/m4rs-mt/ILGPU/pull/1023 was merged, which piqued my interest in this library. As I understand it, ILGPU is primarily meant for JIT-compiling kernels written in C# for various backends.
My question is: does ILGPU support vectorized operations and array math similar to Python's numpy or NumSharp, given the recent work on Vector types? I'm quite inexperienced at writing kernels and not very familiar with this codebase, and I can't quite find documentation that explains it, though maybe I haven't searched well enough.
Thanks for your work and attention!