I am also very interested in whether it will work. UHD Graphics can only be sliced so much because it is too underpowered.
So a positive answer from an Intel employee has been removed. What does this mean? Is support undecided, or will DG1 not support GVT-g?
This feature would be very welcome in the community and would definitely boost sales, IMHO.
Internally we have a prototype, but there is no formal plan to support GVT-g on discrete GPUs like DG1. On a client platform that already has an integrated GPU, is there still a need to share DG1 among multiple virtual machines? Thanks!
I had high hopes of being able to use the new card for virtualization. I guess the big cards from AMD/NVIDIA will still be the only options for a small VM farm. Will integrated Xe support it at least?
@zhiyuanlv I am sorry to bother you, but are there any plans for GVT-g support on the newest Intel CPUs with Intel Xe based GPU cores meant for laptops? I cannot find any information about that anywhere.
I am really interested, because I use this feature with Windows VMs.
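For context on what that setup involves: GVT-g vGPUs are created through the kernel's mediated-device (mdev) sysfs interface before being handed to QEMU/KVM. Below is a minimal sketch of that step in Python; the PCI address and the `i915-GVTg_V5_4` type name are assumptions that vary per machine, and it needs a kernel booted with `i915.enable_gvt=1`, the kvmgt module loaded, and root privileges.

```python
import uuid
from pathlib import Path

# Sysfs node of the integrated GPU; 0000:00:02.0 is typical on client
# platforms, but check `lspci` on your machine (assumption).
gpu = Path("/sys/bus/pci/devices/0000:00:02.0")

# Each entry under mdev_supported_types/ is one vGPU "slice" profile;
# the name below is an example -- list the directory to see yours.
vgpu_type = "i915-GVTg_V5_4"

# Writing a fresh UUID to the `create` node instantiates the vGPU.
vgpu_uuid = str(uuid.uuid4())
(gpu / "mdev_supported_types" / vgpu_type / "create").write_text(vgpu_uuid)

print(f"Created vGPU {vgpu_uuid}; attach it to a VM as a vfio-mdev device")
```

The resulting device can then be given to QEMU with `-device vfio-pci,sysfsdev=/sys/bus/mdev/devices/<uuid>` or attached through libvirt.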
I strongly recommend this be re-evaluated. The demand for accelerated and isolated containers/VMs, and the distributed-work landscape as a whole, are just not the same as they were back in 2020 when this was originally put on ice.
I don't think this will ever happen. NVIDIA and AMD don't do it, not many people even know about GVT-g (it took me a long time to find out about it), and since it requires Linux, its use case is niche. Also, believe it or not, GVT-g is closely tied to the hardware: I believe Rocket Lake desktop has it but it is just disabled, while Xe and newer parts don't.
I agree that it would be great if Intel supported this on their new GPUs, but I don't think it will happen. If they do, I will probably buy one right away, especially if it has framebuffer capture for direct video streaming or a physical output. There might be a way to do it on the DG1, but I think Intel just won't support it, unfortunately, because there is no reason for them to. While people like us who use KVM love it, most people with Intel iGPUs don't even know they have it, so Intel probably doesn't want to waste money developing it.
From a business perspective, though, it could actually make sense if they added physical outputs for multiple workstations off one GPU, or simply offered the raw horsepower of a dedicated GPU for video editing or 3D acceleration over something like Parsec.
I strongly disagree with that sentiment. While most consumers don't use GVT-g or Linux, or do paravirtualization for that matter, it is of major importance in the enterprise space, where VMs are widely used. To make things worse, NVIDIA's GRID GPUs with vGPU support cost a fortune and require expensive monthly subscription fees to work, and even then you can only split the GPU into chunks with identical vRAM allocations. With vGPU-unlock and vGPU-unlock-rs you can get around the licensing and gain features, even on consumer GPUs, that enterprise GPUs lack: custom vRAM allocation, frame limiters, and a whole lot more.

AMD offers MxGPU. While it doesn't come with the same kind of licensing (which quite frankly feels like robbery, given how expensive NVIDIA's Quadro and GRID GPUs are), NVIDIA generally offers better hardware in the enterprise space, and that aside, GIM (the kernel-level software that makes MxGPU work) quite frankly sucks. AMD only supports it on VMware (not QEMU/KVM or Proxmox), and it doesn't provide the same kind of XML options that Intel does; instead it splits the GPU into multiple visible PCI devices, which, while it shouldn't, may end up requiring ACS patching (which is insecure).
Overall, Intel should enable GVT-g on their GPUs to give them a competitive edge in the enterprise space going forward. It is certainly the cleanest solution I've seen so far.
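To illustrate those XML options: a GVT-g vGPU appears to libvirt as a single mediated device, so attaching it to a guest is one small hostdev stanza rather than a collection of extra PCI functions. Here is a rough sketch using the libvirt Python bindings; the UUID and the domain name `win10-vm` are hypothetical placeholders.

```python
import libvirt  # libvirt-python bindings

# UUID of an mdev created as in the earlier sketch, and a VM name --
# both are hypothetical placeholders for illustration.
VGPU_UUID = "a297db4a-f4c2-11e6-90f6-d3b88d6c9525"
DOMAIN = "win10-vm"

# A GVT-g vGPU is one mediated device, addressed purely by UUID.
hostdev_xml = f"""
<hostdev mode='subsystem' type='mdev' managed='no' model='vfio-pci'>
  <source>
    <address uuid='{VGPU_UUID}'/>
  </source>
</hostdev>
"""

conn = libvirt.open("qemu:///system")
dom = conn.lookupByName(DOMAIN)
# Persist the vGPU in the guest definition (takes effect on next boot).
dom.attachDeviceFlags(hostdev_xml, libvirt.VIR_DOMAIN_AFFECT_CONFIG)
conn.close()
```

Compare that with MxGPU, where each slice is a separate SR-IOV-style PCI function that has to be grouped and passed through individually.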
Will Intel's new dedicated video cards (e.g. the DG1) support GVT-g virtualization? It would be an awesome feature and an important advantage over NVIDIA and AMD.