The attached file seems to have manipulated or corrupted imports and exports. In its export directory, NumberOfFunctions is 0x10015998 (268,523,928), so pefile seems to barf in parse_export_directory, trying to enumerate all of the ExportData entries and using too much memory.
Screenshot: http://twitpic.com/1r0xlv
However, NumberOfNames for the attached file is zero. This could be used as an additional check before enumerating the exports. The two values (NumberOfNames, NumberOfFunctions) are not always equal, but the lesser of the two should be used for most purposes.
Temporary fix: use NumberOfNames on line 2891 of r73.
Original comment by kbandla%...@gtempaccount.com on 25 May 2010 at 9:31
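A minimal sketch of the check described above: read the claimed export counts straight out of the IMAGE_EXPORT_DIRECTORY header without enumerating any entries, and trust only the lesser of the two. The file name is a placeholder; DIRECTORY_ENTRY and get_dword_at_rva are existing pefile APIs, and 0x14/0x18 are the standard IMAGE_EXPORT_DIRECTORY offsets of NumberOfFunctions and NumberOfNames.

    import pefile

    pe = pefile.PE('sample.exe', fast_load=True)  # headers and sections only

    export_dd = pe.OPTIONAL_HEADER.DATA_DIRECTORY[
        pefile.DIRECTORY_ENTRY['IMAGE_DIRECTORY_ENTRY_EXPORT']]
    if export_dd.VirtualAddress:
        # NumberOfFunctions sits at offset 0x14 and NumberOfNames at 0x18
        # within IMAGE_EXPORT_DIRECTORY.
        number_of_functions = pe.get_dword_at_rva(export_dd.VirtualAddress + 0x14)
        number_of_names = pe.get_dword_at_rva(export_dd.VirtualAddress + 0x18)
        # A corrupted file can claim hundreds of millions of functions while
        # declaring zero names; the smaller value is the safer bound.
        safe_count = min(number_of_functions, number_of_names)
        print('functions=%d names=%d safe bound=%d'
              % (number_of_functions, number_of_names, safe_count))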
Thanks for your answer.
I did as you wrote and it works for the attached file, but I have some more files which still cause memory leaks. I have attached two more files.
I am checking with this script:
    import os
    import gc
    import pefile

    folder = 'folder with tested files'

    for f in os.listdir(folder):
        try:
            # Full load, including all data directories.
            p = pefile.PE(os.path.join(folder, f), fast_load=False)
            del p
        except KeyboardInterrupt:
            raise
        except Exception:
            pass
        gc.collect()
Original comment by yakovenko87@gmail.com on 26 May 2010 at 6:29
This seems like expected behavior. You are asking pefile to load all the directory info (fast_load=False) on files that have a lot of relocation information (this looks like a common pattern for ASPack-packed files).
If you are not using the directory information from the PE file, fast_load=True will speed things up for you. You can then explicitly call full_load() on the pefile instance to parse all the directories if you need them.
Original comment by kbandla%...@gtempaccount.com on 26 May 2010 at 5:05
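A minimal sketch of the pattern described in the comment above, with a placeholder file name: defer directory parsing at construction time, then opt in later only if the directory data is actually needed.

    import pefile

    # Only headers and sections are parsed; no data directories yet.
    pe = pefile.PE('sample.exe', fast_load=True)

    # ... inspect headers/sections here ...

    # Parse all data directories on demand.
    pe.full_load()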
Hello,
I have tried getting the info as

    p = pefile.PE(os.path.join(folder, f), fast_load=True)
    p.parse_data_directories([0, 1])
    del p

but the test script still leaks memory. I have tested my script on a large number of files, and I will try to find samples that produce memory leaks and send them to you.
Why do I think this isn't expected behavior? I delete each PE instance explicitly and call Python's GC, but the memory still continues to grow.
Original comment by yakovenko87@gmail.com on 28 May 2010 at 12:37
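For reference, the [0, 1] indices in the snippet above correspond to the export and import data directories; pefile also exposes named constants for them, as in this small sketch (the file name is a placeholder):

    import pefile

    pe = pefile.PE('sample.exe', fast_load=True)
    pe.parse_data_directories(directories=[
        pefile.DIRECTORY_ENTRY['IMAGE_DIRECTORY_ENTRY_EXPORT'],  # 0
        pefile.DIRECTORY_ENTRY['IMAGE_DIRECTORY_ENTRY_IMPORT'],  # 1
    ])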
(Revision 76) Added an upper bound on the number of export entries that will be handled. If there are more entries than would fit in what is left until the end of the file, we don't attempt to process any more.
This makes it possible to handle all the files reported, although in some larger cases it might still take a few seconds to load a file.
Original comment by ero.carr...@gmail.com on 3 Jun 2010 at 12:05
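A rough sketch of the bound described above: never trust more export entries than could physically fit between the export address table and the end of the file. The function name, parameters, and 4-byte entry size are illustrative, not the actual pefile implementation.

    def bounded_export_count(claimed_count, address_table_offset, file_size):
        # Each AddressOfFunctions entry is a 4-byte RVA.
        entry_size = 4
        remaining_bytes = max(0, file_size - address_table_offset)
        max_possible = remaining_bytes // entry_size
        # Process at most as many entries as the file can actually hold.
        return min(claimed_count, max_possible)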
Original issue reported on code.google.com by yakovenko87@gmail.com on 25 May 2010 at 3:39