Closed garanews closed 6 years ago
For skipping .zip files, I don't think providing that option is high priority at the moment. I think the bigger issue is why it is taking so long on a relatively small disk.
If you want to troubleshoot that, let me know. The first question is which parser option did you use (default = win, or mac, lin, or datt)?
Followed by, how many CPU cores is it using? This is key as each parser uses 1 CPU thread/core until it finishes.
Hardware is the following: CPU Xeon E5620 @ 2.40GHz, 16 cores total; RAM 24GB.
command launched:
cdqr.exe -p win --max_cpu --es_kb server1 --es_kb_server 1.2.3.4 --es_kb_port 9200 --no_dependencies_check D:\Temp\server1.dd D:\Temp
output:
CDQR Version: 4.1.8
Please enter valid location for Plaso directory: D:\Temp\plaso-20180818
Plaso Version: 20180818
WARNING!! Known compatible version of Plaso NOT detected. Attempting to use default parser list. Try using the --no_dependencies_check if Plaso dependancies are the issue.
Using parser: win
Number of cpu cores to use: 16
D:/Temp already exists. Would you like to use that directory anyway? [Y/n]: Y
Destination Folder: D:/Temp
Source data: D:/Temp/server1.dd
Log File: D:/Temp/server1.log
Database File: D:/Temp/server1.plaso
SuperTimeline CSV File: D:/Temp/server1.dd.SuperTimeline.csv
Total start time was: 2018-09-13 08:58:22.770802
Processing started at: 2018-09-13 08:58:22.770802
Parsing image
"D:\Temp\plaso-20180818/log2timeline.exe" "--partition" "all" "--vss_stores" "all" "--status_view" "linear" "--process_archives" "--parsers" "amcache,appcompatcache,bash,bash_history,bagmru,bencode_transmission,bencode_utorrent,binary_cookies,ccleaner,chrome_cache,chrome_cookies,chrome_extension_activity,chrome_history,chrome_preferences,cron,explorer_mountpoints2,explorer_programscache,filestat,firefox_cache,firefox_cache2,firefox_cookies,firefox_downloads,firefox_history,google_drive,hachoir,java_idx,lnk,mcafee_protection,mft,microsoft_office_mru,microsoft_outlook_mru,mrulist_shell_item_list,mrulist_string,mrulistex_shell_item_list,mrulistex_string,mrulistex_string_and_shell_item,mrulistex_string_and_shell_item_list,msie_zone,msiecf,mstsc_rdp,mstsc_rdp_mru,network_drives,olecf,openxml,opera_global,opera_typed_history,pe,pls_recall,prefetch,recycle_bin,recycle_bin_info2,rplog,safari_history,sccm,sophos_av,sqlite,syslog,ssh,symantec_scanlog,userassist,usnjrnl,windows_boot_execute,windows_boot_verify,windows_run,windows_sam_users,windows_services,windows_shutdown,windows_task_cache,windows_timezone,windows_typed_urls,windows_usb_devices,windows_usbstor_devices,windows_version,winevt,winevtx,winfirewall,winjob,winlogon,winrar_mru,winreg,winreg_default,xchatlog,xchatscrollback" "--hashers" "md5" "--workers" "16" "D:/Temp/server1.dd.plaso" "D:/Temp/server1.dd" "--no_dependencies_check"
Parsing ended at: 2018-10-02 17:45:27.557733
Parsing duration was: 19 days, 8:47:04.786931
Wow. That is the longest running job and largest resulting .plaso file I've heard of / seen to date.
I see now why that feature is important to some people. I think the option could be added to the argument list as a single flag that omits "--process_archives".
Are you interested in contributing to add it? It would be very similar to this PR: https://github.com/orlikoski/CDQR/pull/28
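For anyone curious what such a change might look like, here is a minimal sketch of the idea: a command-line flag that, when set, leaves "--process_archives" out of the log2timeline argument list. The flag name --skip_archives and the build_log2timeline_args helper are hypothetical illustrations, not CDQR's actual code or API.

```python
# Hypothetical sketch of a "skip archives" option for a CDQR-style
# wrapper around log2timeline. Names here are illustrative only.
import argparse


def build_log2timeline_args(skip_archives, parsers, workers):
    """Assemble a log2timeline argument list, omitting
    --process_archives when skip_archives is True."""
    args = ["log2timeline.exe", "--partition", "all",
            "--status_view", "linear"]
    if not skip_archives:
        # Default behavior: recurse into .zip/.tar/etc. contents.
        args.append("--process_archives")
    args += ["--parsers", ",".join(parsers), "--workers", str(workers)]
    return args


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--skip_archives", action="store_true",
                        help="do not pass --process_archives to log2timeline")
    opts = parser.parse_args()
    print(build_log2timeline_args(opts.skip_archives, ["winreg"], 16))
```

Skipping archive contents this way trades completeness for speed: compressed files (such as the java/zip files here) are still hashed and timestamped, but their contents are no longer extracted and parsed.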
It works! Thanks teammate
Nice! I'll review the PR tonight and thanks for putting that in!
Testing complete and merged to master as v4.2.1
Thanks for doing that work to make it better for everyone!
I've been running CDQR since Sept 13th against an 80GB disk. In the log file I see log2timeline processing java/zip files, and that is what is taking so long. It would sometimes be helpful to skip the analysis of compressed files. What do you think?