MarkBaggett / srum-dump

A forensics tool to convert the data in the Windows SRUM (System Resource Usage Monitor) database to an xlsx spreadsheet.
GNU General Public License v3.0

Unable to write XLS file to disk #5

Closed. MarkBaggett closed this issue 6 years ago.

MarkBaggett commented 7 years ago

I was given an example of a SRUM.DAT file that I am unable to process. (Thank you, Martin Willing!) While processing the file the tool produces the following output:

Unable to find table Undocumented Windows 10 Table {97C2CE28-A37B-4920-B1E9-8B76CD341EC5}
I was unable to write the output file.  Do you have an old version open?  If not this is probably a path or permissions issue.
Error :
Finished!

The first line "Unable to find table ..." isn't a problem. It is just a warning that the SRUM file you are analyzing doesn't contain that table. No big deal. The next line "I was unable to write the output file..." is a problem. In this case the error is not caused by permissions or the path as the error message suggests. The problem is that the system is running out of memory. Unfortunately, a MemoryError cannot always be caught in Python (see the docs: https://docs.python.org/2/library/exceptions.html#exceptions.MemoryError ), so the error message returned by the program isn't at all helpful. Usually after the words "Error :" you would see what the error is. If NOTHING appears after "Error :", then you are probably hitting the same issue. When I run this through the debugger I can clearly see that it is a memory error...

Creating Sheet Energy Usage
While you wait, did you know ...
This program was written by Twitter:@markbaggett and @donaldjwilliam5 because @ovie said so.

Unable to find table Undocumented Windows 10 Table {97C2CE28-A37B-4920-B1E9-8B76CD341EC5}
> c:\host\documents\pythonprojects\srum\srum-dump\srum_dump.py(365)<module>()
-> firstsheet=target_wb.get_sheet_by_name("Sheet")
(Pdb)
(Pdb) target_wb.save(".\\outtest.xlsx")
*** MemoryError:
(Pdb)
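
For anyone curious why nothing prints after "Error :", here is a minimal sketch of the failure mode (illustrative names, not the actual srum_dump.py code). When the interpreter has run out of memory it may be unable to build the exception object or do anything useful inside the handler, so the message comes back empty:

import openpyxl

target_wb = openpyxl.Workbook()
# ... imagine millions of rows appended here from a large Application Event table ...

try:
    target_wb.save(".\\outtest.xlsx")
except MemoryError as e:
    # With no memory left, str(e) is often an empty string, which is
    # why nothing appears after "Error :" in the output above.
    print("Error : %s" % str(e))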

There have been some recent changes in how openpyxl handles memory usage and large spreadsheets. The current release of srum-dump uses openpyxl 2.4.7. This is an open issue that I will have to resolve. I'm working on it now.
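
For reference, the memory-friendly path openpyxl offers is its write-only workbook mode, which serializes each row to disk as it is appended instead of holding the whole sheet in RAM. A minimal sketch of that mode (illustrative data and names, not the tool's actual code):

import openpyxl

wb = openpyxl.Workbook(write_only=True)       # rows stream out as they are appended
ws = wb.create_sheet(title="Energy Usage")    # write-only workbooks have no default sheet
ws.append(["SRUM ENTRY ID", "Timestamp", "Application"])  # illustrative header row
for record in records:                        # hypothetical iterator over table rows
    ws.append(record)
wb.save("outtest.xlsx")

The trade-off is that write-only sheets only support appending rows; you cannot go back and touch cells you have already written.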

Mask022 commented 6 years ago

Hi Mark, I received the above error when I ran the program. Your assigned task was created on Jul 10; was that this year? The most recent exe was posted a year ago, so I am running the most recent build. Is there any workaround, or have you been able to solve the issue? Thank you! 19 Oct 17

MarkBaggett commented 6 years ago

Hi. I committed a CSV-only version of the tool a minute ago. If you wouldn't mind giving that a try and seeing if it resolves your issue, I would appreciate it. You lose XLSX and some of the features it offers, but hopefully it resolves the crash that some users are experiencing with very large Application Event tables. Email my gmail account directly at lo127001 if you wouldn't mind sharing the results and/or a copy of the SRUDB.DAT file that is causing the crash.

Thanks, Mark

MarkBaggett commented 6 years ago

I've released a new version of SRUM_DUMP that creates CSV files instead of XLSX. This avoids the memory error that some of you have experienced when writing XLSX files to disk. CSV = no Excel = no crash.
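
For comparison, CSV output keeps memory flat because every row goes straight to disk and nothing accumulates in RAM. A minimal sketch of that approach (illustrative names, not the tool's exact code; "wb" is the file mode the csv module expects on Python 2):

import csv

with open("energy_usage.csv", "wb") as f:
    writer = csv.writer(f)
    writer.writerow(["SRUM ENTRY ID", "Timestamp", "Application"])  # illustrative header
    for record in records:       # hypothetical iterator over SRUM records
        writer.writerow(record)  # written immediately; memory use stays constant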

Note that the current template has some fields that contain XLSX formulas (Total Bytes, etc.) and formatting. These will not do anything in a CSV file. You will want to open the XLSX_TEMPLATE and remove any calculated fields, or delete those columns in the resulting CSV files.