When connecting a libvncserver-based client to a server and negotiating a 32-bit pixel format, the alpha channel of every pixel in the client framebuffer is set to 0x00.
This causes problems with graphics libraries that define an alpha of 0x00 as fully transparent: nothing gets rendered to the screen. As a workaround, I have to manually set every alpha value in the framebuffer to 0xff before rendering, which costs performance. This probably applies to other bit depths as well.
It would be better if the default (opaque) alpha value were configurable in libvncserver.