Closed: rpglover64 closed this issue 8 years ago.
Oh yes, support for colour blindness is definitely needed.
Vischeck simulations of http://tools.medialab.sciences-po.fr/iwanthue/theory.php for various types of color-blindness:
The Vischeck simulations aren't perfect; for example, all the "color chips" are shown in full color (i.e. not simulated). Also, the "Taking benefits from custom color spaces" bar graphs appear to have disappeared. (I'm not color-blind myself, so I don't know whether their disappearance is a simulation artifact or a genuine color-blind challenge with the iWantHue color scheme.)
It would be great, I agree. The process in iWantHue might work for color-blind-friendly color spaces. However, I do not have the time to integrate the conversions between common color spaces and these color spaces. Also, I do not know whether it is possible to make it work for the different types of color blindness at the same time.
I used Chroma.js, and the conversions are also available in d3.js. I guess some of these transformations are available somewhere in the Vischeck code. Actually, all I need is a way to re-evaluate the distance between two colors, taking into account one or several types of color blindness.
+1
Any tips for generating colorblind-friendly palettes?
Hi Eric, I'd love to contribute and find a way to generate color-blind-friendly palettes, but the only information I could find is how to create colors for color-blindness tests. Maybe we could reverse-engineer that algorithm to generate distinct colors that are color-blind safe?
http://mudcu.be/labs/Color-Vision/Javascript/Color.Vision.Simulate.js
OK, let me help you. In iWantHue we rely on a computation of perceptual distance between colors. This distance is the classical distance computed in the CIE LAB color space. We can simply tweak the distance computation to take color blindness into account.
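For anyone following along, here is a minimal sketch of that baseline distance, assuming the "classical distance" means the plain Euclidean (CIE76) distance in CIE LAB, and using chroma.js for the conversion. This is illustrative only, not the exact code from the repo:

```js
// Sketch: Euclidean distance in CIE LAB, computed with chroma.js.
// The reference implementation lives in js/libs/chroma.palette-gen.js.
var chroma = require('chroma-js');

function labDistance(color1, color2) {
  var lab1 = chroma(color1).lab(); // [L*, a*, b*]
  var lab2 = chroma(color2).lab();
  return Math.sqrt(
    Math.pow(lab1[0] - lab2[0], 2) +
    Math.pow(lab1[1] - lab2[1], 2) +
    Math.pow(lab1[2] - lab2[2], 2)
  );
}

// Red and green are far apart for a non-color-blind viewer.
console.log(labDistance('#ff0000', '#00ff00'));
```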
I just factored out the distance computation in the code and edited it to include a quick solution. I have not integrated it into the UI yet, but I may do so soon (that takes more time). The new lines are here: https://github.com/medialab/iwanthue/blob/master/js/libs/chroma.palette-gen.js#L263-L278
The simple idea is that the a* channel in CIE LAB encodes the perceptual contrast between red and green. By simply omitting it in the distance computation, we get an approximation of the perceptual distance for red-green deficient vision.
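As a rough sketch of that idea (again, not the exact code behind the link above), reusing the chroma require from the previous snippet:

```js
// Same Euclidean distance, but with the a* (red-green) channel omitted, so the
// result approximates what a red-green deficient viewer perceives.
function labDistanceRedGreenDeficient(color1, color2) {
  var lab1 = chroma(color1).lab();
  var lab2 = chroma(color2).lab();
  // Only L* (lightness) and b* (blue-yellow) contribute to the distance.
  return Math.sqrt(
    Math.pow(lab1[0] - lab2[0], 2) +
    Math.pow(lab1[2] - lab2[2], 2)
  );
}

// Pure red vs pure green now scores much closer than with the full distance.
console.log(labDistanceRedGreenDeficient('#ff0000', '#00ff00'));
```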
We do not want to completely eliminate the non-color-blind contrast, because that information is still relevant. For me the question is how to aggregate the color-blind and non-color-blind distances. Some combination of min, max, and a coefficient probably makes the most sense...
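For instance, one possible aggregation building on the two sketches above; the 0.5 coefficient is an arbitrary placeholder, not something that has been validated:

```js
// Blend the full-vision distance with the pessimistic min of the two distances,
// so a color pair is never scored better than its weakest viewer sees it,
// while still keeping some of the non-color-blind contrast.
function combinedDistance(color1, color2, coefficient) {
  coefficient = (coefficient === undefined) ? 0.5 : coefficient;
  var full = labDistance(color1, color2);
  var colorBlind = labDistanceRedGreenDeficient(color1, color2);
  return coefficient * Math.min(full, colorBlind) + (1 - coefficient) * full;
}
```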
I will integrate this into the UI if it makes sense and if I have more time in the next few days.
OK, this is now done! It went online today. Not very impressive visually, but measurably effective.
It seems like a moderately difficult addition for a great accessibility gain.