Open wdormann opened 5 years ago
You're correct. It's scanning the files one by one, adding the info to a data table, and holding all of that in memory until it's done. Not the most efficient thing in the world. I can try to refactor it so it's more efficient in its file handling. I honestly never needed to scan that many files, so it's never come up before. However, the error you're getting indicates that it may be trying to read a file that's too big. That can probably be fixed rather easily by using FileStreams instead of reading everything in at once.
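A minimal sketch of what that FileStream change could look like, assuming the script currently loads whole files (e.g. via `ReadAllBytes`). `Read-PEHeaderBytes` and the 4 KB buffer size are hypothetical, just to illustrate reading only the header region instead of the entire file:

```powershell
# Hypothetical helper: read only the first bytes of a file through a
# FileStream, rather than pulling the whole binary into memory.
function Read-PEHeaderBytes {
    param(
        [string]$Path,
        [int]$Count = 4096   # enough for the DOS/PE headers in most cases
    )
    $stream = [System.IO.File]::OpenRead($Path)
    try {
        $buffer = New-Object byte[] $Count
        $read = $stream.Read($buffer, 0, $Count)
        if ($read -le 0) { return @() }
        # Return only the bytes actually read (file may be smaller than $Count)
        return $buffer[0..($read - 1)]
    }
    finally {
        $stream.Dispose()   # release the handle immediately, per file
    }
}
```

Since the security-characteristic flags live in the PE headers, this avoids ever allocating a buffer the size of a multi-gigabyte binary.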
See my PR #15
I guess this issue can be closed if the PR was merged? Alternatively, we could provide an option to output the table line by line to a log file, so nothing needs to be kept in memory.
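The line-by-line idea could look roughly like this. `Analyze-PEFile` is a placeholder for the per-file analysis; the point is that results flow through the pipeline one object at a time and get appended to the log, so only one file's data is ever held in memory:

```powershell
# Hypothetical sketch: stream one result per file instead of
# accumulating everything in a DataTable.
Get-ChildItem -Path 'C:\' -Recurse -Include *.exe, *.dll -ErrorAction SilentlyContinue |
    ForEach-Object {
        # Analyze-PEFile is a stand-in for the actual per-file scan
        Analyze-PEFile -Path $_.FullName
    } |
    Export-Csv -Path 'pesecurity.csv' -NoTypeInformation -Append
```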
It seems reasonable to do a scan of the executable code on an entire system. To do this, I ran: `Get-PESecurity -Directory 'C:\' -Recursive > pesecurity.log`
Eventually the script started throwing errors:
I'm not well-versed enough in PowerShell to know where the problem is, but just watching PowerShell's memory usage during a recursive directory scan, it mostly only increases. Ideally, as it crawls through the directories it would analyze one binary at a time, output the result, and release any allocated memory/objects before moving on. With the current behavior it never actually gets to the point of outputting any results, because it runs out of memory first.