The size argument of STBI_REALLOC_SIZED [1] in stbi__load_gif_main may overflow when layers is greater than 1. It does not appear to be exploitable, however, because layers is incremented by one on each loop iteration [2]: the first time layers == 2, the signed multiplication layers * stride overflows to a negative number, which, when cast to the unsigned 64-bit size_t expected by realloc, becomes a huge value; realloc then fails and the function returns at [3].
static void *stbi__load_gif_main(stbi__context *s, int **delays, int *x, int *y, int *z, int *comp, int req_comp)
{
   ...
   do {
      u = stbi__gif_load_next(s, &g, comp, req_comp, two_back);
      if (u == (stbi_uc *) s) u = 0;  // end of animated gif marker
      if (u) {
         *x = g.w;
         *y = g.h;
         ++layers; // [2]
         stride = g.w * g.h * 4;
         if (out) {
            void *tmp = (stbi_uc*) STBI_REALLOC_SIZED( out, out_size, layers * stride ); // [1] int overflow
            if (!tmp)
               return stbi__load_gif_main_outofmem(&g, out, delays); // [3]
            else {
               out = (stbi_uc*) tmp;
               out_size = layers * stride;
            }
            if (delays) {
               int *new_delays = (int*) STBI_REALLOC_SIZED( *delays, delays_size, sizeof(int) * layers );
               if (!new_delays)
                  return stbi__load_gif_main_outofmem(&g, out, delays);
               *delays = new_delays;
               delays_size = layers * sizeof(int);
            }
         } else {
            out = (stbi_uc*)stbi__malloc( layers * stride );
            if (!out)
               return stbi__load_gif_main_outofmem(&g, out, delays);
            out_size = layers * stride;
            if (delays) {
               *delays = (int*) stbi__malloc( layers * sizeof(int) );
               if (!*delays)
                  return stbi__load_gif_main_outofmem(&g, out, delays);
               delays_size = layers * sizeof(int);
            }
         }
         memcpy( out + ((layers - 1) * stride), u, stride );
   ...
Impact
It doesn't look like a potential security issue, but the signed integer overflow behavior is undefined according to C/C++ standard.
To reproduce the issue, set a breakpoint at line 6993 in stbi__load_gif_main and run the program to hit the overflow:
/src/stb/tests/../stb_image.h:6993:39: runtime error: signed integer overflow: 2 * 1755853020 cannot be represented in type 'int'
SUMMARY: UndefinedBehaviorSanitizer: undefined-behavior /src/stb/tests/../stb_image.h:6993:39 in
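A straightforward fix is to reject the multiplication before it can overflow. The following is a hedged sketch, not the upstream patch; stbi__mul_overflows_int is a hypothetical helper name chosen to match stb_image's naming convention.

```c
#include <limits.h>

/* Returns 1 if layers * stride would overflow a signed int.
 * Checking stride > INT_MAX / layers avoids performing the
 * (undefined) overflowing multiply itself. */
static int stbi__mul_overflows_int(int layers, int stride) {
    if (layers <= 0 || stride <= 0) return 0;  /* no positive overflow possible */
    return stride > INT_MAX / layers;
}
```

The caller would test this guard before computing layers * stride for STBI_REALLOC_SIZED and bail out (as the out-of-memory path already does) when it returns 1.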