Currently we use 16-bit grayscale decimal values as 'input colours'. This is limited and not very user-friendly, so we should change to colour values that people will actually recognize, covering the full spectrum of pixel colours that could be seen.
Converting a 16-bit grayscale decimal value can be done by shifting down to 8-bit decimal via x = gray >> 8, then using the resulting x (0-255) to get rgb(x,x,x).
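A minimal sketch of that conversion (the function name is an assumption, not existing code):

```python
def gray16_to_rgb(gray: int) -> tuple[int, int, int]:
    """Convert a 16-bit grayscale value (0-65535) to an rgb(x,x,x) triple.

    Dropping the low byte via >> 8 maps 0 -> 0 and 65535 -> 255.
    """
    if not 0 <= gray <= 65535:
        raise ValueError("gray value must be in 0-65535")
    x = gray >> 8
    return (x, x, x)
```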
This will require changes to the command-line argument inputs to accept hex/RGB/dec/gray-based input formats:
hexcode
rgb(#,#,#)
dec(###)
gray(0-65535) [this will be the full version of the current input of just an integer]
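The four accepted formats above could be dispatched by a single parser along these lines; this is a sketch only, and the function name and the exact syntax tolerated (whitespace, hex prefix) are assumptions:

```python
import re

def parse_colour(text: str) -> tuple[int, int, int]:
    """Parse an input colour in hex, rgb(), dec(), gray(), or bare-integer form."""
    text = text.strip()
    # hexcode: #RRGGBB
    m = re.fullmatch(r"#([0-9a-fA-F]{6})", text)
    if m:
        v = int(m.group(1), 16)
        return ((v >> 16) & 0xFF, (v >> 8) & 0xFF, v & 0xFF)
    # rgb(r,g,b): three 0-255 components
    m = re.fullmatch(r"rgb\((\d{1,3}),\s*(\d{1,3}),\s*(\d{1,3})\)", text)
    if m:
        r, g, b = (int(c) for c in m.groups())
        if max(r, g, b) > 255:
            raise ValueError("rgb components must be 0-255")
        return (r, g, b)
    # dec(x): single 8-bit grayscale value
    m = re.fullmatch(r"dec\((\d{1,3})\)", text)
    if m:
        x = int(m.group(1))
        if x > 255:
            raise ValueError("dec value must be 0-255")
        return (x, x, x)
    # gray(x) or a bare integer: 16-bit grayscale, kept for backward compatibility
    m = re.fullmatch(r"gray\((\d{1,5})\)", text)
    if m or text.isdigit():
        v = int(m.group(1)) if m else int(text)
        if v > 65535:
            raise ValueError("gray value must be 0-65535")
        x = v >> 8
        return (x, x, x)
    raise ValueError(f"unrecognised colour format: {text!r}")
```

Keeping the bare-integer branch means existing invocations keep working unchanged while the new formats are added alongside.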
See here for an updated ImageMagick command: https://www.imagemagick.org/discourse-server/viewtopic.php?t=11725