gecko0307 / dlib

Allocators, I/O streams, math, geometry, image and audio processing for D
https://gecko0307.github.io/dlib
Boost Software License 1.0

image.d(288): Range violation #174

Closed naturalmechanics closed 1 year ago

naturalmechanics commented 3 years ago

When I try to create an image with this code:

auto image = new Image!(PixelFormat.RGB8)(to!int(width), to!int(height));
image[mapx, mapy] = Color4f(1, 1, 1, 1);

where [width, height] = [299341, 270985], I run into the error: core.exception.RangeError@../../../../.dub/packages/dlib-0.20.0/dlib/dlib/image/image.d(288): Range violation

I looked in the image.d file. The relevant code is:

size_t index = (y * _width + x) * _pixelSize;

static if (fmt == PixelFormat.L8)
{
    pixData[index] = cast(ubyte)c.r;
}
else static if (fmt == PixelFormat.LA8)
{
    pixData[index] = cast(ubyte)c.r;
    pixData[index+1] = cast(ubyte)c.a;
}
else static if (fmt == PixelFormat.RGB8)
{
    pixData[index] = cast(ubyte)c.r;
    pixData[index+1] = cast(ubyte)c.g;
    pixData[index+2] = cast(ubyte)c.b;
}

I notice that for large values of x or y, the value of index explodes. I presume it has something to do with the _pixelSize parameter. I can't find any documentation in this regard.

Any help would be appreciated.
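For what it's worth, the wraparound can be reproduced outside dlib. Here is a minimal D sketch (the variable names are mine, not dlib's) showing that for these dimensions, y * width already overflows 32-bit arithmetic before the result is ever widened:

```d
import std.stdio;

void main()
{
    int width = 299_341;
    int pixelSize = 3;            // RGB8: 3 bytes per pixel
    int x = 299_340, y = 270_984; // bottom-right pixel

    // 32-bit arithmetic: y * width wraps long before the result
    // is widened, so the computed index is garbage.
    int idx32 = (y * width + x) * pixelSize;

    // Widening to 64 bits before multiplying gives the true offset.
    ulong idx64 = (cast(ulong)y * width + x) * pixelSize;

    writeln(idx32); // negative: the multiplication wrapped
    writeln(idx64); // 243350762652 (~243 GB into the buffer)
}
```

The true offset of the last pixel is about 243 GB, so even with correct index arithmetic such a buffer could not realistically be allocated.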

gecko0307 commented 3 years ago

Yes, it seems that an integer overflow occurs while computing index with such large values. I've never tested such extreme cases, so it's possible that images exceeding int.max in size are not handled adequately. This is a bug, but unfortunately it can't be fixed quickly and needs a careful code audit.
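One possible mitigation, sketched below, would be to widen the operands to a 64-bit type before multiplying. This is only an illustration; pixelIndex is a hypothetical helper, not part of dlib's API:

```d
import std.stdio;

// Hypothetical helper, not dlib's actual code: computes the byte
// offset of pixel (x, y) with the multiplication done in 64 bits,
// so it cannot wrap even for very large images.
ulong pixelIndex(uint x, uint y, uint width, uint pixelSize)
{
    return (cast(ulong)y * width + x) * pixelSize;
}

void main()
{
    // Dimensions from the report above.
    writeln(pixelIndex(299_340, 270_984, 299_341, 3)); // 243350762652
}
```

Note that this only fixes the index arithmetic; the resulting ~243 GB offset still points far beyond any buffer that could actually be allocated for this image.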

gecko0307 commented 1 year ago

I've decided to close this, because allocating and managing arrays that don't fit in memory is outside the scope of dlib.image. While theoretically possible, implementing such a system would mean a complete overhaul of how dlib handles images. This looks like too much work to support a rarely used, highly specific case.