0x6d696368 opened 5 years ago
Did you mean `virtualSize` or `rawDataSize` >= 0x80000000 and 0x7fffffff?
@ryanmkurtz yes, of course.
The only sample I found (a constructed one, not a real in-the-wild sample) was bigSoRD.exe
from https://github.com/indrora/corkami/tree/master/src/PE
It's a PE with an oversized SizeOfRawData (0xFFFF0200). It doesn't get loaded correctly (it falls back to virtualSize, though, since that is bigger than the now incorrectly negative rawDataSize). If I change virtualSize on that sample, nothing gets loaded at all. So the code handles this without crashing, but also without warning.
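To make the arithmetic concrete, here is a minimal standalone Java sketch (not Ghidra's actual loader code; the 0x1000 virtualSize is an assumption based on the 4k page / the "(1000)" in the warning further down) of how the signed read turns SizeOfRawData negative and why a "take the larger of the two" comparison then falls back to virtualSize:

```java
// Standalone sketch (not Ghidra's code) of how a signed 32-bit read of
// bigSoRD.exe's SizeOfRawData goes negative, so the larger-of-the-two
// fallback ends up picking virtualSize.
public class SignedSizeDemo {
    public static void main(String[] args) {
        int rawDataSize = 0xFFFF0200;   // SizeOfRawData as a signed Java int -> negative
        int virtualSize = 0x1000;       // assumed VirtualSize of the same section

        System.out.println("rawDataSize as signed int: " + rawDataSize);
        System.out.println("rawDataSize as unsigned:   " + Integer.toUnsignedLong(rawDataSize));

        // A signed comparison sees the negative rawDataSize as "smaller",
        // so virtualSize wins and only 0x1000 bytes get mapped.
        int loadSize = Math.max(rawDataSize, virtualSize);
        System.out.println("bytes that would be mapped: 0x" + Integer.toHexString(loadSize));
    }
}
```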
Ok, thanks for the reference file. I was having trouble creating sections this large with Visual Studio.
I finally got around to looking at bigSoRD.exe. Do you know what the real Windows loader does with this? If Windows does not load it, I think the best thing we can do is to just throw a more descriptive exception if we detect SizeOfRawData overflow.
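Purely as a sketch of that idea (the method name, variable names, and exception type here are placeholders, not Ghidra's API): re-read the 32-bit field as unsigned and reject values that overflow a signed int, rather than silently carrying a negative size around.

```java
// Standalone sketch of a "throw a more descriptive exception" check for an
// overflowing SizeOfRawData field. Names are placeholders, not Ghidra's API.
public class OverflowCheckSketch {
    static long checkSizeOfRawData(int rawField) {
        long unsigned = Integer.toUnsignedLong(rawField);
        if (unsigned > Integer.MAX_VALUE) {
            throw new IllegalArgumentException(
                "Section has invalid SizeOfRawData 0x" + Long.toHexString(unsigned));
        }
        return unsigned;
    }

    public static void main(String[] args) {
        System.out.println(checkSizeOfRawData(0x200));       // fine
        System.out.println(checkSizeOfRawData(0xFFFF0200));  // throws with a clear message
    }
}
```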
According to x64dbg it does get loaded, with each region mapped as one 4k page:
```
Address   Size      Info         Content  Type  Protection  Initial
00400000  00001000  bigsord.exe           IMG   -R---       ERWC-
00401000  00001000  ""                    IMG   ERW--       ERWC-
00402000  00001000  ""                    IMG   -R---       ERWC-
```
(Please don't ask me why, though.)
Also Ghidra does indeed warn in the status bar:
```
Section[0] has invalid size ffff0200 (1000)
```
This can be accessed via the logs. (Didn't know that before.) So I guess this is OK from a non-crashing and feedback perspective.
However, the problem is that Windows loads the data section but Ghidra doesn't, so Ghidra does not see the string at DAT_00402000 because that address is not found in the program.
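For reference, a small GhidraScript sketch (address hard-coded for this sample) that makes the symptom visible after import:

```java
import ghidra.app.script.GhidraScript;
import ghidra.program.model.address.Address;

// Quick check whether 0x00402000 (where the string lives under the real
// Windows loader) is part of the imported program's memory map at all.
public class CheckMissingSectionScript extends GhidraScript {
    @Override
    protected void run() throws Exception {
        Address addr = toAddr(0x00402000L);
        boolean mapped = currentProgram.getMemory().contains(addr);
        println(addr + (mapped ? " is mapped" : " is NOT in the program's memory map"));
    }
}
```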
I guess this can be changed to an improvement rather than a bug, as it is low priority.
The PE header fields are parsed as signed rather than unsigned, see: https://github.com/NationalSecurityAgency/ghidra/blob/49c2010b63b56c8f20845f3970fedd95d003b1e9/Ghidra/Features/Base/src/main/java/ghidra/app/util/bin/format/pe/SectionHeader.java#L493
If now e.g. `virtualSize` or `rawDataSize` >= 0x10000000, then the read-in value becomes negative. This means sections larger than 0x0fffffff bytes cannot be read or allocated. The impact of other header fields becoming negative is unknown. Two possible solutions:
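A standalone sketch of the overflow boundary (plain Java integer arithmetic, not Ghidra code), which also shows why the numbers were corrected to 0x80000000 / 0x7fffffff earlier in the thread: values up to 0x7FFFFFFF survive a signed 32-bit read, anything at or above 0x80000000 comes back negative.

```java
// Illustrates which 32-bit field values flip negative when read as a signed int.
public class SignedBoundaryDemo {
    public static void main(String[] args) {
        int[] sizes = {0x10000000, 0x7FFFFFFF, 0x80000000, 0xFFFF0200};
        for (int size : sizes) {
            System.out.printf("0x%08X -> signed %d, unsigned %d%n",
                    size, size, Integer.toUnsignedLong(size));
        }
    }
}
```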
As this doesn't seem to cause immediate problems (>2 GiB PE files are rare, the other fields rarely reach the overflow limit, and PE parsing seems to work just fine as is), I'd rate the severity of this very low. However, as #327 shows, it could lead to frustrating symptoms. Hence, I'm opening this issue.
For other test cases to harden the PE loader, see: https://github.com/NationalSecurityAgency/ghidra/pull/418#issuecomment-486889500