mmeloni opened this issue 3 years ago
Hello,
This does not look like a PE file at all. So, the best you could expect is a graceful error instead of a panic, such as "not a valid PE file".
Here is a shorter reproducer:
package main

import (
	"bytes"
	"debug/pe"
)

func main() {
	// Sparse composite literal: offsets 12-15 are read as an enormous
	// NumberOfSymbols in the COFF file header, and 0xFF: 0 pads the
	// slice out to 256 bytes.
	pe.NewFile(bytes.NewReader([]byte{
		0x08: 0x10, 0x00, 0x00, 0x00, 0x71, 0x1C, 0xC7, 0xF1, 0x04,
		0xFF: 0,
	}))
}
Thanks @tc-hib, yes a graceful error is what is needed in this case.
Not sure what we could do here. PE files don't have anything resembling a magic number in the header, so it's hard to tell whether it really is a PE file. The only thing we can do is check that the machine type makes sense, which we do.
It's unclear to me whether the MS-DOS stub is optional or not. If people always use it, maybe we can check that.
I guess we could check that the allocation sizes are reasonable. This particular example reports way more symbols than the file could possibly contain, and we dutifully try to allocate storage for them.
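For reference, the layout such a check would rely on is documented: the MS-DOS stub begins with the "MZ" magic, the 4-byte e_lfanew field at offset 0x3c gives the offset of the "PE\0\0" signature. A minimal sketch of that validation (checkPESignature is a hypothetical helper, not the package's current code, and it would reject files that start directly with the COFF header):

// Sketch only: relies on "MZ" at offset 0, e_lfanew at offset 0x3c,
// and the 4-byte "PE\0\0" signature at that offset.
func checkPESignature(r io.ReaderAt) error {
	var dos [2]byte
	if _, err := r.ReadAt(dos[:], 0); err != nil {
		return errors.New("not a valid PE file: too short")
	}
	if dos[0] != 'M' || dos[1] != 'Z' {
		return errors.New("not a valid PE file: missing MS-DOS stub")
	}
	var lfanew [4]byte
	if _, err := r.ReadAt(lfanew[:], 0x3c); err != nil {
		return errors.New("not a valid PE file: missing e_lfanew")
	}
	var sig [4]byte
	if _, err := r.ReadAt(sig[:], int64(binary.LittleEndian.Uint32(lfanew[:]))); err != nil {
		return errors.New("not a valid PE file: missing PE signature")
	}
	if string(sig[:]) != "PE\x00\x00" {
		return errors.New("not a valid PE file: bad PE signature")
	}
	return nil
}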
I don't know why this function seems to accept PE files that start directly with the file header. I thought both the DOS stub and the PE signature were mandatory.
The optional header should be read sooner because it starts with a magic number.
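For what it's worth, that magic value is 0x10b for PE32 and 0x20b for PE32+, so an early plausibility check could be as small as this sketch (checkOptionalHeaderMagic is just an illustrative name):

// Sketch: validate the optional header magic before trusting the rest of the header.
func checkOptionalHeaderMagic(oh []byte) error {
	if len(oh) < 2 {
		return errors.New("optional header too short")
	}
	switch m := binary.LittleEndian.Uint16(oh); m {
	case 0x10b, 0x20b: // PE32, PE32+
		return nil
	default:
		return fmt.Errorf("unexpected optional header magic %#x", m)
	}
}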
I don't know what this function is used for, and I know nothing about system programming, but is it acceptable that a corrupt file may cause a useless allocation of several gigabytes? Shouldn't the function detect that it would read past the end of the file?
Or at least use a temporary bytes.Buffer and io.CopyN to get a chance to meet EOF before reaching that size.
But I am wondering how we could get the size of the underlying byte array from an io.ReaderAt?
If we can get the size of the input byte array, then I think this solution is good
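One possibility, assuming the concrete type behind the io.ReaderAt exposes its size, is a type switch (a sketch; readerSize is a hypothetical helper, not an existing API):

// Sketch: io.ReaderAt itself has no Size method, but common concrete types do.
func readerSize(r io.ReaderAt) (int64, bool) {
	switch rr := r.(type) {
	case *bytes.Reader:
		return rr.Size(), true
	case *io.SectionReader:
		return rr.Size(), true
	case *os.File:
		if fi, err := rr.Stat(); err == nil {
			return fi.Size(), true
		}
	}
	return 0, false
}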
@HowJMay I guess you can try to read the last byte before actually reading the whole thing.
Or you can mimic io.CopyN, or directly use it:
// Copy at most COFFSymbolSize*NumberOfSymbols bytes into a temporary buffer;
// io.CopyN returns io.EOF if the underlying reader runs out before that.
buf := bytes.Buffer{}
_, err = io.CopyN(&buf, r, COFFSymbolSize*int64(fh.NumberOfSymbols))
if err != nil {
	return nil, fmt.Errorf("fail to read symbol table: %v", err)
}
syms := make([]COFFSymbol, fh.NumberOfSymbols)
err = binary.Read(&buf, binary.LittleEndian, syms)
(disclaimer: I'm a beginner, trying to learn as well as helping)
@tc-hib I like what you suggested, and it tested well for me. However, I think maybe we should use LimitReader() and Copy() directly?
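If I read that suggestion correctly, it might look roughly like this sketch (reusing the names from the snippet above; note that io.Copy over a LimitReader does not return an error on a short read, so the length has to be checked explicitly, whereas io.CopyN reports io.EOF):

// Sketch of the io.LimitReader + io.Copy variant, same assumptions as above.
want := int64(fh.NumberOfSymbols) * COFFSymbolSize
var buf bytes.Buffer
n, err := io.Copy(&buf, io.LimitReader(r, want))
if err != nil {
	return nil, fmt.Errorf("fail to read symbol table: %v", err)
}
if n < want {
	// A truncated file shows up as a short copy rather than an error.
	return nil, fmt.Errorf("symbol table is truncated: read %d of %d bytes", n, want)
}
syms := make([]COFFSymbol, fh.NumberOfSymbols)
err = binary.Read(&buf, binary.LittleEndian, syms)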
Change https://golang.org/cl/286113 mentions this issue: debug/pe: fix OOM caused by huge NumberOfSymbols
What would the benefit be?
If this function isn't used elsewhere you can also pass it the original ReaderAt and do something like this:
// Probe the last byte of the symbol table first: if the file is too short,
// ReadAt fails and we never allocate storage for a bogus NumberOfSymbols.
size := int64(fh.NumberOfSymbols) * COFFSymbolSize
_, err := r.ReadAt([]byte{0}, int64(fh.PointerToSymbolTable)+size-1)
if err != nil {
	return nil, fmt.Errorf("fail to read symbol table: %v", err)
}
syms := make([]COFFSymbol, fh.NumberOfSymbols)
err = binary.Read(io.NewSectionReader(r, int64(fh.PointerToSymbolTable), size), binary.LittleEndian, syms)
What version of Go are you using (go version)?

Does this issue reproduce with the latest release?
Yes

What operating system and processor architecture are you using (go env)?
go env Output

What did you do?
To reproduce, download and unzip 000182.zip.

What did you expect to see?
I expected the method to return a valid pe.File.

What did you see instead?

cc @vchaindz