Thanks. I imagined it might be, but having only limited platform access I have to wait for reports like this. I'll dig into it and make a fix.
Since you seem to know a lot about X, I'll ask you! I am struggling to make sense of "significant bits in color specification" (bits_per_rgb). If I interpret it as how many bits are in each color channel, then it doesn't agree with the RGB masks, which are each 8 bits wide in your example, nor does it agree with the color depth (24 or 32).
The documentation and header are sparse:
The bits_per_rgb member specifies the log base 2 of the number of distinct color values (individually) of red, green, and blue. Actual RGB values are unsigned 16-bit numbers.
The Xlib Programming Manual, Rel. 5, has the most information:
The bits_per_rgb member specifies how many bits in each of the red, green, and blue values in a colorcell are used to drive the RGB gun in the screen. For a monochrome screen, this value is one. For the default visual of an eight-plane color system, this value is typically eight. The pixel subfields (the red, green, and blue values in each colorcell) are 16-bit unsigned short values, but only the highest bits_per_rgb bits are used to drive the RGB gun in the screen. This number corresponds to the number of bits of resolution in the Digital to Analog Converter (DAC) in the screen.
Now this is of course using some CRT terminology, but my reading is that it describes the color depth ultimately found in the data transferred from the X server to the display. So, as you suggested, perhaps it is all internal and I shouldn't worry about it at all?
Can you offer a better explanation of this field, or point me to a resource?
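For concreteness, my current mental model of that last quoted sentence, expressed as code (the function name and the shift are my own illustration, not anything from Xlib):

```c
#include <X11/Xlib.h>

/* An XColor channel (XColor.red/.green/.blue) is a 16-bit value.
   If I read the manual correctly, the hardware only looks at the
   top bits_per_rgb bits of it; the DAC discards the rest. */
unsigned short dac_value(unsigned short channel16, int bits_per_rgb)
{
    return (unsigned short)(channel16 >> (16 - bits_per_rgb));
}
```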
X11 was created at a time when graphics memory was very scarce, so manufacturers used different strategies to pack more pixels into a limited space. I have spent a considerable part of my programming career fighting and optimizing color maps, using XAllocColorCells(), where some of the 256 color cells would be occupied by the operating system and the others shared among the applications. Some SGI computers would display the active window in correct colors and the other windows with a false colormap. A true-color screen was a luxury few people could afford in 1990. For such colormap-based X11 displays, the pixel value was one byte (8 bits), while the colormap RGB values were 16 bits, but the DAC of the graphics card would actually use a lower number of bits, usually 8. Early low-quality LCD screens would shave off some bits and display only 5 or 6 bits per R, G, B.
Also, since RAM is mostly addressed one byte at a time, it does not make sense to pack RGB values more tightly, so they are always accessed at byte level, even if not all the bits are actually used by the display. The value of bits_per_rgb tells you the number of bits which are actually used by the display. Even if the display uses fewer or more bits, you don't want to modify your image for that, unless you are making some RAW photo image editor.
Bottom line, from the programming point of view in 2022, you should only care about 24- and 32-bit color depth for your application. You can presume the significant bits in RGB to be 8. If there are more, then the graphics system is capable of displaying colors at higher precision, but that is poorly supported at the moment. See the sketch below. https://linuxreviews.org/HOWTO_enable_10-bit_color_on_Linux https://dec05eba.com/2021/10/10/x11-myth-x11-doesnt-support-hdr/
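As a sketch of what I mean (my own illustration, not code from this project): ask X for a 24-bit TrueColor visual and derive the per-channel bits from the masks, rather than trusting bits_per_rgb:

```c
#include <X11/Xlib.h>
#include <X11/Xutil.h>
#include <stdio.h>

/* Count the bits set in a channel mask, e.g. 0xff0000 -> 8. */
static int mask_bits(unsigned long mask)
{
    int n = 0;
    for (; mask; mask >>= 1)
        n += (int)(mask & 1);
    return n;
}

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    XVisualInfo vi;

    if (!dpy)
        return 1;

    /* Ask for a 24-bit TrueColor visual; this is what virtually
       every current X server offers as the default. */
    if (XMatchVisualInfo(dpy, DefaultScreen(dpy), 24, TrueColor, &vi)) {
        printf("red/green/blue bits from masks: %d/%d/%d\n",
               mask_bits(vi.red_mask),
               mask_bits(vi.green_mask),
               mask_bits(vi.blue_mask));
        printf("bits_per_rgb (DAC precision): %d\n", vi.bits_per_rgb);
    }
    XCloseDisplay(dpy);
    return 0;
}
```

On my machine this prints 8/8/8 for the masks but 11 for bits_per_rgb, which is exactly the mismatch discussed above.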
Your visual checking on startup is too strict. I have an Nvidia graphics card on Linux which reports 11 bits per rgb, so the program fails on this check:
But if I comment it out, the program works properly, so this check is either unnecessary or incorrect.
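For example (a hypothetical sketch; I'm guessing at the exact form of the failing check), the startup test could validate the channel masks, which is what the drawing code actually depends on, instead of requiring bits_per_rgb == 8:

```c
#include <X11/Xlib.h>
#include <X11/Xutil.h>

/* Hypothetical relaxed check: accept any TrueColor visual whose masks
   describe 8-bit channels, regardless of what bits_per_rgb reports.
   NVIDIA reports bits_per_rgb == 11 here even though the framebuffer
   layout is still 8-8-8. */
int visual_is_usable(const XVisualInfo *vi)
{
    return vi->class == TrueColor &&
           vi->red_mask   == 0xff0000UL &&
           vi->green_mask == 0x00ff00UL &&
           vi->blue_mask  == 0x0000ffUL;
}
```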
Output of xdpyinfo on my computer:

```
name of display:    :0
version number:    11.0
vendor string:    The X.Org Foundation
vendor release number:    11906000
X.Org version: 1.19.6
maximum request size:  16777212 bytes
motion buffer size:  256
bitmap unit, bit order, padding:    32, LSBFirst, 32
image byte order:    LSBFirst
number of supported pixmap formats:    7
supported pixmap formats:
    depth 1, bits_per_pixel 1, scanline_pad 32
    depth 4, bits_per_pixel 8, scanline_pad 32
    depth 8, bits_per_pixel 8, scanline_pad 32
    depth 15, bits_per_pixel 16, scanline_pad 32
    depth 16, bits_per_pixel 16, scanline_pad 32
    depth 24, bits_per_pixel 32, scanline_pad 32
    depth 32, bits_per_pixel 32, scanline_pad 32
keycode range:    minimum 8, maximum 255
focus:  window 0x4e00006, revert to PointerRoot
number of extensions:    29
    BIG-REQUESTS
    Composite
    DAMAGE
    DOUBLE-BUFFER
    DPMS
    DRI2
    GLX
    Generic Event Extension
    MIT-SCREEN-SAVER
    MIT-SHM
    NV-CONTROL
    NV-GLX
    Present
    RANDR
    RECORD
    RENDER
    SECURITY
    SHAPE
    SYNC
    X-Resource
    XC-MISC
    XFIXES
    XFree86-VidModeExtension
    XINERAMA
    XINERAMA
    XInputExtension
    XKEYBOARD
    XTEST
    XVideo
default screen number:    0
number of screens:    1

screen #0:
  dimensions:    3840x2160 pixels (712x400 millimeters)
  resolution:    137x137 dots per inch
  depths (7):    24, 1, 4, 8, 15, 16, 32
  root window id:    0x1ef
  depth of root window:    24 planes
  number of colormaps:    minimum 1, maximum 1
  default colormap:    0x20
  default number of colormap cells:    256
  preallocated pixels:    black 0, white 16777215
  options:    backing-store WHEN MAPPED, save-unders NO
  largest cursor:    256x256
  current input event mask:    0xfa8033
    KeyPressMask             KeyReleaseMask           EnterWindowMask
    LeaveWindowMask          ExposureMask             StructureNotifyMask
    SubstructureNotifyMask   SubstructureRedirectMask FocusChangeMask
    PropertyChangeMask       ColormapChangeMask
  number of visuals:    132
  default visual id:  0x21
  visual:
    visual id:    0x21
    class:    TrueColor
    depth:    24 planes
    available colormap entries:    256 per subfield
    red, green, blue masks:    0xff0000, 0xff00, 0xff
    significant bits in color specification:    11 bits
  visual:
    visual id:    0x22
    class:    DirectColor
    depth:    24 planes
    available colormap entries:    256 per subfield
    red, green, blue masks:    0xff0000, 0xff00, 0xff
    significant bits in color specification:    11 bits
  ...
```