The foreground and background colors of a cell are specified as 24-bit 0xRRGGBB values, but the terminal is free to reduce them to as few as 8 colors, or even 2 if it is a monochrome device.
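As a minimal sketch of such a reduction, the 8 basic ANSI colors can be treated as one bit per channel (red=1, green=2, blue=4, matching the standard index order black, red, green, yellow, blue, magenta, cyan, white); the `threshold` value here is an arbitrary assumption, not anything a real terminal is required to use:

```python
def to_ansi8(r: int, g: int, b: int, threshold: int = 128) -> int:
    """Quantize an 0xRRGGBB color to one of the 8 basic ANSI colors (0-7).

    Each channel contributes one bit if it is at or above the threshold;
    the bit layout (red=1, green=2, blue=4) matches the classic ANSI order.
    """
    return (r >= threshold) | ((g >= threshold) << 1) | ((b >= threshold) << 2)

print(to_ansi8(255, 0, 0))      # 1 (red)
print(to_ansi8(255, 255, 0))    # 3 (yellow)
print(to_ansi8(32, 32, 32))     # 0 (black)
```

A real terminal would likely use a perceptual distance rather than a per-channel threshold, but the point is only that the reduction is lossy and entirely at the terminal's discretion.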
This is complicated:
As you know, in terminal emulators "16 colors" is not a simple subset or dithering of "256 colors". Theming of the 16 colors is ubiquitous: they get remapped into different shades or even into completely different colors. We should probably work from the assumption that the 16 themable colors are a completely disjoint set from the 256 non-themable colors.
The 256 colors actually are themable as well, at least in XTerm, but users rarely change them since there are so many of them. However, the notcurses library now does some palette cycling effects using these colors.
The ISO 24-bit color codes are not as widely supported as the XTerm 256 color codes, and take a lot more bandwidth. Bandwidth is important because OS pty drivers still have very small buffer sizes, and being able to send a screenful of updates all in one buffer is highly prized as a way to minimize flicker.
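To make the bandwidth difference concrete, here is a comparison of the two SGR foreground forms (direct color `38;2;R;G;B` vs. indexed `38;5;N`); the byte counts are for these example values and grow with the number of decimal digits:

```python
def sgr_truecolor(r: int, g: int, b: int) -> str:
    """ISO/ECMA direct-color foreground: ESC [ 38;2;R;G;B m"""
    return f"\x1b[38;2;{r};{g};{b}m"

def sgr_256(n: int) -> str:
    """XTerm indexed foreground: ESC [ 38;5;N m"""
    return f"\x1b[38;5;{n}m"

print(len(sgr_truecolor(255, 135, 0)))  # 17 bytes
print(len(sgr_256(208)))                # 11 bytes
```

Per-cell savings of a handful of bytes add up quickly: with both foreground and background set on every cell of a large screen, the difference can decide whether a full repaint fits in a single write to the pty.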
Hence, the implementation should probably downgrade 24-bit colors to 256-color indices whenever an exact alias exists, purely to save bandwidth. But how to figure that out when the 256-color palette is customizable? Reset it at the start? What if it was customized by the user and not an application? We would then throw away the user's customizations. Read in the old palette at the start and restore it when exiting?
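Assuming the default XTerm palette has not been re-themed (which, as noted above, cannot be taken for granted), the exact-alias check is cheap: the 6x6x6 cube at indices 16-231 uses channel levels 0, 95, 135, 175, 215, 255, and the grayscale ramp at 232-255 uses values 8, 18, ..., 238. A sketch:

```python
# Default XTerm 256-color palette levels for the 6x6x6 cube (indices 16-231).
CUBE_LEVELS = [0, 95, 135, 175, 215, 255]

def exact_alias(r: int, g: int, b: int):
    """Return the 256-color index whose DEFAULT RGB exactly matches, else None.

    Only indices 16-255 are considered: 0-15 are themable in practice,
    so their RGB values cannot be relied on for aliasing.
    """
    if r in CUBE_LEVELS and g in CUBE_LEVELS and b in CUBE_LEVELS:
        return (16
                + 36 * CUBE_LEVELS.index(r)
                + 6 * CUBE_LEVELS.index(g)
                + CUBE_LEVELS.index(b))
    # Grayscale ramp: index 232+i has value 8+10*i for i in 0..23.
    if r == g == b and 8 <= r <= 238 and (r - 8) % 10 == 0:
        return 232 + (r - 8) // 10
    return None

print(exact_alias(0, 0, 0))        # 16
print(exact_alias(255, 255, 255))  # 231
print(exact_alias(8, 8, 8))        # 232
print(exact_alias(1, 2, 3))        # None
```

This only answers the easy half of the question; whether the default palette is actually in effect still requires either resetting it or querying it (e.g. via OSC 4) and restoring it on exit, with the user-customization caveats described above.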