Closed by untoxa 2 years ago
The problem with the color fade algorithm is that shifting the color components does not make the color fade (become darker); it shifts it toward another hue. A correct algorithm should instead convert to HSV, change the value component, and convert back. This is not hard, but it requires some tricks to keep it all-integer while still having the required accuracy. Scaled integers do the trick.
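A minimal sketch of the scaled-integer idea, assuming BGR555 color packing (the function name and the 8.8 fixed-point factor are my choices for illustration, not GBDK's actual API). Multiplying R, G, and B by the same fraction scales the HSV value while leaving hue and saturation unchanged, so the color darkens instead of drifting, with no explicit RGB↔HSV round trip needed:

```c
#include <stdint.h>

/* Fade a BGR555 color toward black by scaling each 5-bit component
 * with an 8.8 fixed-point factor instead of shifting.
 * level: 0 = black, 255 = original color. */
uint16_t fade_color(uint16_t bgr555, uint8_t level)
{
    uint8_t r = bgr555 & 0x1F;
    uint8_t g = (bgr555 >> 5) & 0x1F;
    uint8_t b = (bgr555 >> 10) & 0x1F;

    /* scaled-integer multiply: (c * level) / 256, with rounding */
    r = (uint8_t)(((uint16_t)r * level + 128) >> 8);
    g = (uint8_t)(((uint16_t)g * level + 128) >> 8);
    b = (uint8_t)(((uint16_t)b * level + 128) >> 8);

    return (uint16_t)(r | (g << 5) | (b << 10));
}
```

Since hue and saturation in HSV depend only on the ratios and differences of the components relative to the maximum, a uniform multiplicative scale preserves both while reducing only the value.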
Right, the above ^^^ is just an optimization of the existing algorithm. I did a LERP version which looks better (still not a true hue calc) and can also fade to other palette colors, but it's more complicated and larger code. ( https://github.com/bbbbbr/Petris/blob/release/src/fade2pal.c )
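For reference, the LERP approach can be sketched roughly like this (a hypothetical standalone version, assuming BGR555 packing; the actual fade2pal.c implementation differs). Each 5-bit component is interpolated between the source and target colors, so the fade target can be any palette color, not just black:

```c
#include <stdint.h>

/* Linear interpolation between two BGR555 colors.
 * t: 0 = `from`, 255 = `to`. */
uint16_t lerp_color(uint16_t from, uint16_t to, uint8_t t)
{
    uint16_t out = 0;
    for (uint8_t shift = 0; shift <= 10; shift += 5) {
        uint16_t a = (from >> shift) & 0x1F;
        uint16_t b = (to >> shift) & 0x1F;
        /* blend with rounding; max product 31*255 fits in 16 bits */
        uint16_t c = (a * (255 - t) + b * t + 127) / 255;
        out |= (c & 0x1F) << shift;
    }
    return out;
}
```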
I'm pretty sure the fade UpdateColor() calculation can also be simplified into smaller and faster code using a LUT. Of course, the speed increase doesn't seem essential, since FadeStepColor() is usually called by itself in a loop. (I used a variation of this in Petris.)
So this:
Becomes something like this:
If there is interest I can make a proper patch later.
The current ZGB + env package seems to be broken for building the example template on Linux at the moment, so I'd have to resolve that first (which I'd prefer to wait to do until the latest GBDK2020 changes settle down).