kyz / libmspack

A library for some loosely related Microsoft compression formats, CAB, CHM, HLP, LIT, KWAJ and SZDD.
https://www.cabextract.org.uk/libmspack/

memory exhausted in chmd_read_headers() #26

Closed JsHuang closed 5 years ago

JsHuang commented 5 years ago

Description:

The function chmd_read_headers() in libmspack has a memory-exhaustion problem.

Affected version:

libmspack 0.9.1 alpha

Details:

Critical code (in chmd.c):

Lines 346–352:

  chm->chunk_size = EndGetI32(&buf[chmhs1_ChunkSize]);
// chm->chunk_size comes directly from the input file and can be controlled by an attacker (memory exhaustion)
  chm->density    = EndGetI32(&buf[chmhs1_Density]);
  chm->depth      = EndGetI32(&buf[chmhs1_Depth]);
  chm->index_root = EndGetI32(&buf[chmhs1_IndexRoot]);
  chm->num_chunks = EndGetI32(&buf[chmhs1_NumChunks]);
  chm->first_pmgl = EndGetI32(&buf[chmhs1_FirstPMGL]);
  chm->last_pmgl  = EndGetI32(&buf[chmhs1_LastPMGL]);
...
Lines 418–420:

 if (!(chunk = (unsigned char *) sys->alloc(sys, (size_t)chm->chunk_size))) {
     // chm->chunk_size not checked
    return MSPACK_ERR_NOMEMORY;
  }

In chmd_read_headers() in chmd.c, chm->chunk_size is read from the CHM file and later used as an allocation size without any validation. A carefully constructed CHM file can therefore cause memory exhaustion.

chm->chunk_size is a 32-bit field, so it can be as large as 0xffffffff. chmd_read_headers() may therefore try to allocate up to 4 GiB of RAM, even if the input file itself is very small.
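For illustration, the chunk-size field is read as a little-endian 32-bit integer. The helper below mimics the behaviour of libmspack's EndGetI32 macro (it is a stand-in sketch, not the library's actual macro) and shows how four attacker-controlled header bytes become the allocation size:

```c
#include <stddef.h>

/* Illustrative only: a little-endian 32-bit read in the style of
 * libmspack's EndGetI32, showing how four attacker-controlled bytes
 * in the CHM header become chm->chunk_size. */
static unsigned int get_le32(const unsigned char *p) {
    return  (unsigned int)p[0]
         | ((unsigned int)p[1] << 8)
         | ((unsigned int)p[2] << 16)
         | ((unsigned int)p[3] << 24);
}
```

With the bytes ff ff ff ff in the header, the resulting chunk size is 0xffffffff, which is then passed straight to the allocator.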

poc file

https://github.com/JsHuang/pocs/blob/master/libmspack/oom-chm

Credit: ADLab of Venustech

kyz commented 5 years ago

Thanks for reporting this.

As per issue #25, this is not a security issue. It's up to code controlling libmspack to set hard memory usage limits, if such limits are desired. Use a custom mspack_system.alloc() function to return NULL to libmspack if it goes over your arbitrary memory limit, instead of allocating the memory. libmspack will follow through and return MSPACK_ERR_NOMEMORY to the client.
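A minimal sketch of the suggested approach: a memory-capped allocator. In real use this would be installed through a custom struct mspack_system, whose alloc member takes the mspack_system pointer as its first argument; the single-argument signature and the 64 MiB cap below are simplifications for illustration.

```c
#include <stddef.h>
#include <stdlib.h>

/* Arbitrary cap chosen for this sketch; pick whatever limit suits
 * your application. */
#define MEM_LIMIT ((size_t)64 * 1024 * 1024)

static size_t mem_in_use = 0;

/* Returns NULL once the cap would be exceeded; libmspack then returns
 * MSPACK_ERR_NOMEMORY to the client instead of exhausting memory.
 * (A complete implementation would also track frees.) */
static void *capped_alloc(size_t bytes) {
    if (bytes > MEM_LIMIT - mem_in_use) return NULL;
    void *p = malloc(bytes);
    if (p) mem_in_use += bytes;
    return p;
}
```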

However, I agree with adding another validation here, to reject bad CHM files. In practice chmhs1_ChunkSize only ever has one value, 4096; all files created with hhc.exe have this value. I have a single file with a larger chunk size (8192), created by someone developing a competing CHM file creator, which they've since abandoned.

I've set an upper limit of 8192 for this field in commit 3b106e2a284bafde0448a14b8a4bf37747bbac4b.
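The check added by the fix can be sketched roughly as follows (the helper name and error-code values here are placeholders, not mspack.h's real constants; see the commit for the actual change):

```c
/* Placeholder error codes for this sketch only; the real values are
 * defined in mspack.h. */
#define SKETCH_ERR_OK         0
#define SKETCH_ERR_DATAFORMAT 1

/* Hypothetical helper: reject chunk sizes outside the range seen in
 * real CHM files before the value is used as an allocation size. */
static int validate_chunk_size(unsigned int chunk_size) {
    /* hhc.exe always writes 4096; 8192 is the largest value observed
     * in the wild, so zero or anything larger is treated as corrupt. */
    if (chunk_size == 0 || chunk_size > 8192) return SKETCH_ERR_DATAFORMAT;
    return SKETCH_ERR_OK;
}
```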

If anyone has valid CHM files where this field is larger, please write to me.