Your question is all over the map. Please tell me what you are ultimately trying to accomplish.
I am trying to reduce the network bandwidth usage when running 3D games through TurboVNC. One possible approach is to use video encoding and streaming, so I have implemented x264 encoding and RTSP streaming of the 3D images as they are received in shm.c. I have found that the bandwidth decreases significantly.
The problem is that I can't stream the desktop or other 2D applications to the client side when grabbing images from the ProcShmPutImage() function. Do you think I should use the XGetImage() function to get the screenshot?
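For reference, this is roughly the naive approach I am considering, a minimal sketch using plain Xlib from the client side (it bypasses TurboVNC's internals entirely and just grabs the root window). I realize this reads back the entire desktop every time, which may be too expensive:

```c
/* Minimal sketch (client-side Xlib, not TurboVNC-internal code): grabbing the
 * whole desktop with XGetImage().  This is the naive "screenshot" approach I
 * am asking about. */
#include <X11/Xlib.h>
#include <X11/Xutil.h>
#include <stdio.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;

    int screen = DefaultScreen(dpy);
    Window root = RootWindow(dpy, screen);
    int width = DisplayWidth(dpy, screen);
    int height = DisplayHeight(dpy, screen);

    /* Read back the entire root window as a ZPixmap image. */
    XImage *img = XGetImage(dpy, root, 0, 0, width, height, AllPlanes,
                            ZPixmap);
    if (img) {
        printf("Captured %dx%d, %d bits/pixel\n", img->width, img->height,
               img->bits_per_pixel);
        /* img->data would be fed to the x264 encoder here. */
        XDestroyImage(img);
    }

    XCloseDisplay(dpy);
    return 0;
}
```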
OK, but the bandwidth decreases significantly compared to what? Have you tried enabling interframe comparison in TurboVNC and using a lower JPEG image quality? TurboVNC's default settings are designed for high-bandwidth networks, so you should always adjust interframe comparison and JPEG quality (including subsampling) when using lower-bandwidth networks. As this article describes, H.264 isn't a clear winner when compared to the TurboVNC codec. It depends very heavily on the type of image workload and the quality settings.
As that article also describes, encoding the entire desktop with H.264 is much trickier than just encoding the frames rendered by a 3D application, because X servers do not have a concept of "frames". An X11 GUI may only update a few pixels at a time, and if that's the case, you don't want to encode the entire desktop (particularly not with x264, which is very CPU-intensive) just to get those few changed pixels. Unfortunately, H.264 in general does not provide any way of specifying which pixels have changed from frame to frame (because it was designed for movies, not remote desktops), so if you keep feeding the entire desktop image into x264, it would have to analyze the desktop image every time in order to determine which pixels had changed.

Tracking framebuffer changes is complex, and that complexity is a big reason why the TurboVNC Server has to be tightly integrated with X.org. Our RFB server implementation basically acts as an X.org driver, so not only can it maintain its own virtual framebuffer and access that framebuffer at will, it also knows whenever an X11 application has updated a particular region of the framebuffer and how that update took place (refer to unix/Xvnc/programs/Xserver/hw/vnc/draw.c.) Since the X server is single-threaded, the RFB server has to make some rather complicated decisions regarding how and when to coalesce those updated regions into a single "framebuffer update" and send it to all connected VNC viewers. It isn't just a matter of reading back the entire framebuffer periodically.
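To illustrate the general idea (this is only a toy sketch, not the actual TurboVNC code; the real implementation uses the X server's region machinery in draw.c): each drawing operation adds the rectangle it touched to an accumulated modified region, and only that region is encoded when a framebuffer update is eventually sent.

```c
/* Toy sketch of dirty-rectangle accumulation -- NOT the actual TurboVNC
 * implementation, which uses the X server's RegionRec machinery in
 * hw/vnc/draw.c.  The point is only that the server records *which* pixels
 * changed instead of re-reading and re-analyzing the whole framebuffer. */
typedef struct { int x1, y1, x2, y2; } Rect;

typedef struct {
    Rect bbox;      /* bounding box of all changes since the last update */
    int  dirty;     /* nonzero if anything has changed */
} ModifiedRegion;

/* Called from the drawing hooks whenever an X11 operation touches pixels. */
static void AddChangedRect(ModifiedRegion *mr, Rect r)
{
    if (!mr->dirty) { mr->bbox = r; mr->dirty = 1; return; }
    if (r.x1 < mr->bbox.x1) mr->bbox.x1 = r.x1;
    if (r.y1 < mr->bbox.y1) mr->bbox.y1 = r.y1;
    if (r.x2 > mr->bbox.x2) mr->bbox.x2 = r.x2;
    if (r.y2 > mr->bbox.y2) mr->bbox.y2 = r.y2;
}

/* Called when the server decides to coalesce changes into an update. */
static void SendUpdate(ModifiedRegion *mr)
{
    if (!mr->dirty) return;
    /* Encode and transmit only mr->bbox worth of pixels here. */
    mr->dirty = 0;
}
```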
I would suggest that you do the following:
Ultimately, if you are bypassing our codec and transport layer entirely, then you aren't really using TurboVNC anymore. TurboVNC without the codec and transport layer is basically just Xvfb with some added glueware for session management and window manager startup.
Note that there is an H.264 RFB encoding type defined in the spec. No implementation details are provided in the spec, but this commit seems to provide some insight into how it might be done (however, that commit doesn't seem to have been pushed into the upstream LibVNCServer source, so I don't trust it yet.) What I'm driving at is: it would be more straightforward for you to use TurboVNC as-is and add an H.264 RFB encoding to it, which you could use for transmitting only pixels drawn with MIT-SHM.
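For reference, an RFB framebuffer update is just a sequence of rectangles, each tagged with an encoding type, so an H.264 encoding would amount to sending the compressed bitstream as the payload of a rectangle with a new encoding number. Here is a rough sketch of what that might look like on the wire (the encoding number, the length-prefixed payload layout, and the write helper below are placeholders of my own, not actual TurboVNC code or details mandated by the RFB spec):

```c
/* Rough sketch of an H.264 rectangle inside an RFB FramebufferUpdate.
 * The encoding number, the length-prefixed payload, and SendBytesToClient()
 * are placeholders, not actual TurboVNC or RFB-spec details. */
#include <stdint.h>
#include <arpa/inet.h>

#define rfbEncodingH264_placeholder 0x48323634  /* placeholder value only */

typedef struct {
    uint16_t x, y, w, h;   /* rectangle position and size */
    int32_t  encoding;     /* encoding type for this rectangle */
} RfbRectHeader;

/* Hypothetical transport hook -- stands in for whatever buffered write path
 * the RFB server uses in rfbserver.c for Tight rectangles. */
extern void SendBytesToClient(const void *buf, int len);

static void SendH264Rect(uint16_t x, uint16_t y, uint16_t w, uint16_t h,
                         const uint8_t *bitstream, uint32_t nBytes)
{
    RfbRectHeader hdr;
    uint32_t len = htonl(nBytes);

    hdr.x = htons(x);  hdr.y = htons(y);
    hdr.w = htons(w);  hdr.h = htons(h);
    hdr.encoding = htonl(rfbEncodingH264_placeholder);

    SendBytesToClient(&hdr, sizeof(hdr));       /* standard rectangle header */
    SendBytesToClient(&len, sizeof(len));       /* assumed length prefix */
    SendBytesToClient(bitstream, (int)nBytes);  /* x264 NAL units */
}
```

The wire format is the easy part; deciding which pixels to feed into x264, and when, is where the complexity described above comes in.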
Hi @dcommander, thank you so much! This is a very good project. I am trying to make my setup work over a public network.
Yeah, I used the default settings of TurboVNC, and that is probably the main reason why the network bandwidth is so high. I will follow your advice and do a comparison study based on the article above. I will let you know if I have any findings.
I have tried to encode the GUI of the 3D game and stream the packets to the client side using RTSP. I can get the images of the 3D game in Xext/shm.c and stream them to the client side successfully.
Now I would also like to stream the desktop to the client side. I have read the code in UpdateFrameBufferRequest() but failed to understand how to get the entire content of the desktop (a screenshot). Could you give me some hints on this problem? How can I get the desktop content gracefully, in shm.c or in rfbserver.c?
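For example, this is what I am imagining at a pseudocode level (the framebuffer structure below is a placeholder of my own; I have not found the real fields in the TurboVNC code yet): periodically copy the whole virtual framebuffer and feed it to the x264 encoder.

```c
/* Sketch of what I am imagining.  The DesktopFB structure is a placeholder
 * for whatever the RFB server actually exposes for its virtual framebuffer;
 * the field names are guesses, not real TurboVNC identifiers. */
#include <stdint.h>
#include <string.h>

/* Placeholder description of the server-side virtual framebuffer. */
typedef struct {
    uint8_t *pixels;        /* base address of the virtual framebuffer */
    int      width, height; /* desktop size in pixels */
    int      stride;        /* bytes per scanline (may include padding) */
    int      bytesPerPixel;
} DesktopFB;

/* Copy the framebuffer into a tightly packed buffer for the encoder. */
static void GrabDesktop(const DesktopFB *fb, uint8_t *dst)
{
    int rowBytes = fb->width * fb->bytesPerPixel;
    for (int y = 0; y < fb->height; y++)
        memcpy(dst + y * rowBytes, fb->pixels + y * fb->stride, rowBytes);
    /* dst now holds one full desktop frame to pass to x264. */
}
```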