Closed mcdull closed 2 years ago
In short: No. To get display output directly from the VM you need to run the GPU in passthrough mode, and then you are limited to one VM per physical GPU.
In a single-GPU scenario, most people do not need the host to have continuous display output after boot. Traditionally, we pass through the entire GPU, together with its display outputs, to a VM, after which the host detaches from the display.
When it comes to vgpu unlock, we still want the display output at the VM level. Is this possible with the existing development? I saw in the wiki that we can use a merged driver so that both the host and the VMs can share the vGPU while the host keeps display output. Is it possible to designate the output to a single VM, while the other VMs can still share the GPU?
Or is it necessary to install X on the host (e.g. Proxmox) and share the framebuffer? Thanks.
I'm not sure I understand. As I see it there are two use cases:
Currently the headless server use case is pretty much doable. Using something like Proxmox you can remotely manage VMs and remotely access them using vnc/rdp/etc.
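For the headless case, the day-to-day management mentioned above can also be done from the Proxmox shell; a minimal sketch (the VM ID 100 is a placeholder):

```shell
# Sketch: remote VM management on a Proxmox host.
# VM ID 100 is a placeholder -- substitute your own VM's ID.

qm start 100     # boot the VM
qm status 100    # check its current state

# Graphical access then goes through the web UI's noVNC console,
# or guest-side RDP/VNC once the guest OS is up.
```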
The desktop use case is more complicated. In order to get display output you either need to use a separate GPU (which is what I use in my test setup), or "merge" the vGPU driver with the game ready driver to get access to the display outputs from the host (this probably creates its own problems, I have not attempted this). Then you need some way to get the pixel data from the guest onto the host and render it in a user friendly way. For this I have used https://looking-glass.io/, which works pretty well, but isn't as seamless as I'd like, and doesn't handle multiple guest displays.
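For context, on the host side a vGPU instance is exposed as a mediated device (mdev) through sysfs, which is what lets several VMs share one physical GPU regardless of where the display output ends up. A rough sketch, assuming the vGPU driver is loaded; the PCI address and the `nvidia-256` type name are placeholders:

```shell
# Sketch: create a vGPU mediated device on the host.
# The PCI address 0000:01:00.0 and type nvidia-256 are illustrative --
# list the types your GPU actually advertises first.

ls /sys/bus/pci/devices/0000:01:00.0/mdev_supported_types/

# Create an instance of one advertised type with a fresh UUID:
uuid=$(uuidgen)
echo "$uuid" > /sys/bus/pci/devices/0000:01:00.0/mdev_supported_types/nvidia-256/create

# The resulting device can then be assigned to a VM, e.g. in Proxmox:
#   qm set <vmid> -hostpci0 0000:01:00.0,mdev=nvidia-256
```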
So more work is required to meet the desktop use case.
Yes, it's the desktop use case I refer to. However, Proxmox, which is the preferred way to run VMs, is usually not desired to run in desktop mode. In many consumer setups, e.g. Proxmox / unRAID, the display should be designated to a single VM (running Windows for games), while the same machine is also used for many other purposes.
If there are two display adapters in the server, there are many more possibilities, but with modern motherboards and server needs (extra SATA / LAN), fitting multiple GPUs is quite difficult.
All in all, thanks a lot for your work and feedback. Very much appreciated.
No further updates