Zal0 / ZGB

Game Boy / Color engine with lots of features

smooth fades #20

Closed: untoxa closed this issue 2 years ago

bbbbbr commented 4 years ago

I'm pretty sure the fade UpdateColor() calculation can also be simplified with a LUT for smaller and faster code. The speed increase isn't essential, of course, since FadeStepColor() usually runs on its own in a loop. (I used a variation of this with Petris.)

So this:

// RGB defined in cgb.h has a << 0 that kills the compiler
#define RGB2(r, g, b) (((UINT16)(r)) | (((UINT16)(g)) << 5) | ((((UINT16)(b)) << 8) << 2))

#define PAL_RED(C)   (((C)      ) & 0x1F)
#define PAL_GREEN(C) (((C) >>  5) & 0x1F)
#define PAL_BLUE(C)  (((C) >> 10) & 0x1F)

UWORD UpdateColor(UINT8 i, UWORD col) {
    //return RGB2(DespRight(PAL_RED(col), i), DespRight(PAL_GREEN(col), i), DespRight(PAL_BLUE(col), i));
    return RGB2(PAL_RED(col) | DespRight(0x1F, 5 - i), PAL_GREEN(col) | DespRight(0x1F, 5 - i), PAL_BLUE(col) | DespRight(0x1F, 5 - i));
}

palette[c] = UpdateColor(i, *col);
palette_s[c] = UpdateColor(i, *col_s);

Becomes something like this:

#define FADE_LUT_SZ 6
#define FADE_MAX (FADE_LUT_SZ - 1)
const UWORD FADE_LUT[FADE_LUT_SZ] = 
                          {0b0000000000000000,  // No white present to mix
                           0b0000010000100001,
                           0b0000110001100011,
                           0b0001110011100111,
                           0b0011110111101111,
                           0b0111111111111111}; // Full white present to mix

// Fade in/out from White
palette[c] = *col | FADE_LUT[i]; 
palette_s[c] = *col_s | FADE_LUT[i];

// or

// Fade in/out from Black
palette[c] = *col & FADE_LUT[FADE_MAX - i]; 
palette_s[c] = *col_s & FADE_LUT[FADE_MAX - i];
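
For reference on why the two are equivalent: DespRight isn't shown above, but from its usage it is presumably ZGB's variable right-shift helper, so DespRight(0x1F, 5 - i) is a run of low bits (0x00, 0x01, 0x03, 0x07, 0x0F, 0x1F for i = 0..5) that the existing code ORs into each 5-bit channel, pulling the color toward white one bit per step. FADE_LUT just precomputes that mask replicated into all three BGR555 fields, so a single OR (or, with the reversed index, a single AND toward black) replaces the per-channel work. A throwaway host-side sketch that generates/verifies the table values (illustrative only, not ZGB code):

#include <stdint.h>
#include <stdio.h>

// Hypothetical desktop-side generator for FADE_LUT: replicate the per-step
// mask 0x1F >> (5 - i) (assumed equivalent to DespRight(0x1F, 5 - i)) into
// the R, G and B fields of a BGR555 word.
int main(void) {
    for (unsigned i = 0; i < 6; i++) {
        uint16_t mask  = 0x1Fu >> (5u - i);
        uint16_t entry = mask | (mask << 5) | (mask << 10);
        // Prints 0x0000, 0x0421, 0x0C63, 0x1CE7, 0x3DEF, 0x7FFF
        printf("FADE_LUT[%u] = 0x%04X\n", i, (unsigned)entry);
    }
    return 0;
}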

If there is interest I can make a proper patch later.

The current ZGB + env package seems to be broken for building the example template on Linux at the moment, so I'd have to resolve that first (and I'd prefer to wait until the latest GBDK2020 changes settle down before doing so).

untoxa commented 4 years ago

The problem with the color fade algorithm is that shifting the color components doesn't make the color fade (get darker); it changes it into another color. The correct algorithm should instead convert to HSV, change the value component, and then convert back. That's not hard, but it requires some tricks to keep it all-integer while still getting the required accuracy. Scaled integers do the trick.
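
To make the scaled-integer idea concrete, here is a minimal sketch (illustrative only, not ZGB code, and not a full RGB-to-HSV round trip): for a fade toward black, multiplying all three components by the same 8.8 fixed-point factor lowers the value while keeping the ratio between the channels, which is the part plain shifting gets wrong.

// Sketch only: darken a BGR555 color by an 8.8 fixed-point factor
// (0x0100 = 1.0, 0x0080 = 0.5, ...), assuming GBDK-style UWORD.
// Scaling R, G and B by the same factor reduces brightness while keeping
// their ratios (the hue) roughly intact, unlike shifting, which rounds
// each channel down by a different proportion and visibly changes the color.
UWORD ScaleColor(UWORD col, UWORD factor) {
    UWORD r = (((col      ) & 0x1F) * factor) >> 8;
    UWORD g = (((col >>  5) & 0x1F) * factor) >> 8;
    UWORD b = (((col >> 10) & 0x1F) * factor) >> 8;
    return r | (g << 5) | (b << 10);
}

Fading toward white would instead interpolate each channel toward 31, which is essentially the LERP mentioned below.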

bbbbbr commented 4 years ago

Right, the above ^^^ is just an optimization of the existing algorithm. I did a LERP version which looks better (still not a true hue calculation) and can also fade to other palette colors, but it's more complicated and produces larger code. ( https://github.com/bbbbbr/Petris/blob/release/src/fade2pal.c )
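
For reference, a rough per-channel LERP between two BGR555 colors might look something like the sketch below (illustrative only, not the actual fade2pal.c implementation):

// Sketch: blend each 5-bit channel of 'from' toward 'to' by t/16, t = 0..16.
// t = 0 returns 'from', t = 16 returns 'to'. With 'to' set to white or black
// this reproduces the simple fades; with 'to' set to a target palette entry
// it fades between palettes, at the cost of multiplies per channel instead
// of a single OR/AND with a LUT entry.
UWORD LerpColor(UWORD from, UWORD to, UINT8 t) {
    UWORD r = ((((from      ) & 0x1F) * (16 - t)) + (((to      ) & 0x1F) * t)) >> 4;
    UWORD g = ((((from >>  5) & 0x1F) * (16 - t)) + (((to >>  5) & 0x1F) * t)) >> 4;
    UWORD b = ((((from >> 10) & 0x1F) * (16 - t)) + (((to >> 10) & 0x1F) * t)) >> 4;
    return r | (g << 5) | (b << 10);
}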