kevoreilly / CAPEv2

Malware Configuration And Payload Extraction

Memory analysis not working #1479

Status: Closed. dell224 closed this issue 7 months ago.

dell224 commented 1 year ago

Expected Behavior

Memory analysis successfully performed on the submitted samples using volatility3.

Current Behavior

The analysis report does not include process memory or process dump results, despite the task being submitted with procmemdump=1.

Failure Information (for bugs)

From journalctl -u cape.service, I can see one missed (non-optional) dependency and one error pertaining to JsonRenderer:

Apr 11 03:59:00 cape-sandbox python3[1880]:  Copyright (c) 2010-2015
Apr 11 03:59:00 cape-sandbox python3[1880]:  CAPE: Config and Payload Extraction
Apr 11 03:59:00 cape-sandbox python3[1880]:  github.com/kevoreilly/CAPEv2
Apr 11 03:59:01 cape-sandbox python3[1880]: OPTIONAL! Missed dependency: pip3 install https://github.com/CAPESandbox/peepdf/archive/20eda78d7d77fc5b3b652ffc2d8a5b0af796e3dd.zip#egg=peepdf==0.4.2
Apr 11 03:59:01 cape-sandbox python3[1880]: OPTIONAL! Missed dependency: pip3 install -U git+https://github.com/DissectMalware/batch_deobfuscator
Apr 11 03:59:01 cape-sandbox python3[1880]: Missed dependency: pip3 install volatility3 -U
Apr 11 03:59:01 cape-sandbox python3[1880]: name 'JsonRenderer' is not defined
Apr 11 03:59:01 cape-sandbox python3[1880]: OPTIONAL! Missed dependency: pip3 install -U git+https://github.com/CAPESandbox/httpreplay
Apr 11 03:59:02 cape-sandbox python3[1880]: INFO:lib.cuckoo.core.scheduler:Using "kvm" machine manager with max_analysis_count=0, max_machines_count=10, and max_vmstartup_count=5
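
A quick way to check whether volatility3 is importable from the interpreter that cape.service actually runs is to attempt the import by hand. This is a minimal sketch; the JsonRenderer import path is an assumption based on volatility3's CLI module and may differ between versions:

# show which interpreter and working directory the service unit uses
systemctl cat cape.service | grep -E 'ExecStart|WorkingDirectory'

# reproduce the failure outside the service; a ModuleNotFoundError here
# matches the "Missed dependency" / JsonRenderer errors in the log above
python3 -c "from volatility3.cli.text_renderer import JsonRenderer; print('volatility3 OK')"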

Steps to Reproduce


  1. Set up CAPEv2 following the steps outlined in capev2.readthedocs.io.
  2. Set up the guest VM with the CAPE agent installed.
  3. Ensure memory analysis is enabled in the config files under 'custom/conf'.
  4. Submit a known-malicious sample through the web interface with the option "Full process memory dumps" enabled and networking disabled. (A REST-API equivalent is sketched after these steps.)
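
For reference, the web toggle "Full process memory dumps" corresponds to the procmemdump=1 option. The same submission can be made against the REST API; this is a minimal sketch, assuming a stock CAPEv2 web install listening on port 8000 (adjust host, port, and file name for your setup):

# submit a file with per-process memory dumps enabled
curl -F file=@sample.exe \
     -F options="procmemdump=1" \
     http://127.0.0.1:8000/apiv2/tasks/create/file/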

Context


Question          Answer
Git commit        Not provided (template: Type $ git log | head -n1 to find out)
Host OS version   Ubuntu 22.04.2 LTS
Guest OS version  Windows 10 x64

custom/conf/processing.conf

#Enable or disable the available processing modules [on/off].
# If you add a custom processing module to your Cuckoo setup, you have to add
# a dedicated entry in this file, or it won't be executed.
# You can also add additional options under the section of your module and
# they will be available in your Python class.

# Exclude files that don't match a safe extension, and ignore them during processing inside other modules like CAPE.py
[antiransomware]
enabled = no
# ignore all files with an extension found more than X times
skip_number = 30

[curtain]
enabled = no

[sysmon]
enabled = no

[analysisinfo]
enabled = yes

# FLARE capa -> to update rules utils/community.py -cr
# install -> cd /tmp && git clone --recurse-submodules https://github.com/fireeye/capa.git && cd capa && git submodule update --init rules && python3 -m pip install .
[flare_capa]
enabled = yes
# Always generate, or generate on demand only (the user needs to click a button to generate it); the module should remain enabled to use this feature on demand
on_demand = no
# Analyze binary payloads
static = yes
# Analyze CAPE payloads
cape = no
# Analyze ProcDump
procdump = yes

[decompression]
enabled = no

[dumptls]
enabled = no

[behavior]
enabled = yes
# Toggle specific modules within the BehaviorAnalysis class
anomaly = yes
processtree = yes
summary = yes
enhanced = yes
encryptedbuffers = yes

[debug]
enabled = yes

[detections]
enabled = yes
# Signatures
behavior = yes
yara = yes
suricata = yes
virustotal = no
clamav = no

# ... but this mechanism may still be switched on
[procmemory]
enabled = yes
strings = yes

[procmon]
enabled = yes

[memory]
enabled = yes

[usage]
enabled = no

[network]
enabled = yes
sort_pcap = no
# DNS whitelisting to ignore domains/IPs configured in network.py
# This should be disabled when utilizing InetSim/Remnux, as we end up resolving
# the IP from fakedns, which would then remove all domains associated with that
# resolved IP
dnswhitelist = yes
# additional entries
dnswhitelist_file = extra/whitelist_domains.txt
ipwhitelist = yes
ipwhitelist_file = extra/whitelist_ips.txt

# Requires geoip2 and maxmind database
country_lookup = no
# Register and download for free from https://www.maxmind.com/
maxmind_database = data/GeoLite2-Country.mmdb

# Should the server use a compressed version of behavioural logs? This helps
# save space in Mongo, accelerates searches, and reduces the size of the
# final JSON report.
[loop_detection]
enabled = no

[url_analysis]
enabled = yes
# Enable a WHOIS lookup for the target domain of a URL analysis
whois = yes

[strings]
enabled = yes
on_demand = no
nullterminated_only = no
minchars = 5

[trid]
# Specify the path to the trid binary to use for static analysis.
enabled = no
identifier = data/trid/trid
definitions = data/trid/triddefs.trd

[die]
# Detect it Easy
enabled = no
binary = /usr/bin/diec

[virustotal]
enabled = yes
on_demand = no
timeout = 60
# remove empty detections
remove_empty = yes
# Add your VirusTotal API key here. The default API key, kindly provided
# by the VirusTotal team, should give you sufficient throughput, and
# although it is shared with all our users, it shouldn't affect your use.
key = a0283a2c3d55728300d064874239b5346fb991317e8449fe43c902879d758088
do_file_lookup = yes
do_url_lookup = yes
urlscrub = (^http:\/\/serw\.clicksor\.com\/redir\.php\?url=|&InjectedParam=.+$)

[suricata]
# Notes on getting this to work check install_suricata function:
# https://github.com/doomedraven/Tools/blob/master/Sandbox/cape2.sh

enabled = yes
#Runmode "cli" or "socket"
runmode = socket
#Outputfiles
# if evelog is specified, it will be used instead of the per-protocol log files
evelog = eve.json

# per-protocol log files
#
#alertlog = alert.json
#httplog = http.json
#tlslog = tls.json
#sshlog = ssh.json
#dnslog = dns.json

fileslog = files-json.log
filesdir = files
# Amount of text to carve from plaintext files (bytes)
buffer = 8192
#Used for creating an archive of extracted files
7zbin = /usr/bin/7z
zippass = infected
##Runmode "cli" options
bin = /usr/bin/suricata
conf = /etc/suricata/suricata.yaml
##Runmode "socket" Options
socket_file = /tmp/suricata-command.socket

[cif]
enabled = no
# url of CIF server
url = https://your-cif-server.com/api
# CIF API key
key = your-api-key-here
# time to wait for server to respond, in seconds
timeout = 60
# minimum confidence level of returned results:
# 25=not confident, 50=automated, 75=somewhat confident, 85=very confident, 95=certain
# defaults to 85
confidence = 85
# don't log queries by default, set to 'no' to log queries
nolog = yes
# max number of results per query
per_lookup_limit = 20
# max number of queries per analysis
per_analysis_limit = 200

[CAPE]
enabled = yes
# Ex targetinfo standalone module
targetinfo = yes
# Ex dropped standalone module
dropped = yes
# Ex procdump standalone module
procdump = yes
# Amount of text to carve from plaintext files (bytes)
buffer = 8192
# Process files no bigger than the value below, in MB. We saw that above 90 MB the delay becomes significant.
max_file_size = 90
# Scan for UserDB.TXT signature matches
userdb_signature = no

# Deduplicate screenshots
# You need to install the ImageHash dependency, version 4.2.1 or newer
[deduplication]
#
#  Available hash functions:
#  ahash:      Average hash
#  phash:      Perceptual hash
#  dhash:      Difference hash
#  whash-haar: Haar wavelet hash
#  whash-db4:  Daubechies wavelet hash
enabled = no
hashmethod = ahash

[vba2graph]
# Mac - brew install graphviz
# Ubuntu - sudo apt-get install graphviz
# Arch - sudo pacman -S graphviz
# sudo pip3 install networkx>=2.1 graphviz>=0.8.4 pydot>=1.2.4
enabled = yes
on_demand = yes

# ja3 finger print db with descriptions
# https://github.com/trisulnsm/trisul-scripts/blob/master/lua/frontend_scripts/reassembly/ja3/prints/ja3fingerprint.json
[ja3]
ja3_path = data/ja3/ja3fingerprint.json

[maliciousmacrobot]
# https://maliciousmacrobot.readthedocs.io
# Install mmbot
#   sudo pip3 install mmbot
# Create/Set required paths
# Populate benign_path and malicious_path with appropriate macro maldocs (try the tests/samples in the github)
#   https://github.com/egaus/MaliciousMacroBot/tree/master/tests/samples
# Create modeldata.pickle with your maldocs (this does not append to the model, it overwrites it)
#
#   mmb = MaliciousMacroBot(benign_path, malicious_path, model_path, retain_sample_contents=False)
#   result = mmb.mmb_init_model(modelRebuild=True)
#
# Copy your model file and vocab.txt to your model_path
enabled = no
benign_path = /opt/cuckoo/data/mmbot/benign
malicious_path = /opt/cuckoo/data/mmbot/malicious
model_path = /opt/cuckoo/data/mmbot/model

[xlsdeobf]
# pip3 install git+https://github.com/DissectMalware/XLMMacroDeobfuscator.git
enabled = no
on_demand = no

[boxjs]
enabled = no
timeout = 60
url = http://your_super_box_js:9000

# Extractors
[mwcp]
enabled = yes
modules_path = modules/processing/parsers/mwcp/

[ratdecoders]
enabled = yes
modules_path = modules/processing/parsers/RATDecoders/

[malduck]
enabled = yes
modules_path = modules/processing/parsers/malduck/

[CAPE_extractors]
enabled = yes
# Must end with /
modules_path = modules/processing/parsers/CAPE/

[reversinglabs]
enabled = no
url =
key =

[script_log_processing]
enabled = yes

# Dump PE's overlay info
[overlay]
enabled = no

[floss]
enabled = no
on_demand = no
static_strings = no
stack_strings = yes
decoded_strings = yes
tight_strings = yes
min_length = 5
# Download FLOSS signatures from https://github.com/mandiant/flare-floss/tree/master/sigs
sigs_path = data/flare-signatures

custom/conf/reporting.conf

# Enable or disable the available reporting modules [on/off].
# If you add a custom reporting module to your Cuckoo setup, you have to add
# a dedicated entry in this file, or it won't be executed.
# You can also add additional options under the section of your module and
# they will be available in your Python class.

[cents]
enabled = no
on_demand = no
# starting signature id for created Suricata rules
start_sid = 1000000

[mitre]
enabled = yes

# https://github.com/geekscrapy/binGraph
# requires -> apt-get install python-tk
[bingraph]
enabled = yes
on_demand = yes
binary = yes
# generate bingraphs for cape/procdumps
cape = yes
procdump = yes

[pcap2cert]
enabled = yes

[litereport]
enabled = no
keys_to_copy = CAPE procdump info signatures dropped static target network shot malscore ttps
behavior_keys_to_copy = processtree summary

[reportbackup]
enabled = no
# External service to use
googledrive = no
# Specify the ID of the shared Google Drive Folder where reports will be backed up to
# Replace the folder ID with your own Google Drive shared folder (share access with the created service account)
# Without a service account, the upload cannot complete because a browser cannot be launched
drive_folder_id = id_here
drive_credentials_location = data/google_creds.json

[jsondump]
enabled = yes
# use the c-optimized JSON encoder, requires fitting entire JSON results in memory
ram_boost = no
indent = 4
encoding = latin-1

[reporthtml]
# required for the WSGI interface
enabled = no

[reporthtmlsummary]
# much smaller, faster report generation, omits API logs and is non-interactive
enabled = no

[reportpdf]
# Note that this requires reporthtmlsummary to be enabled above as well
enabled = no

[maec41]
enabled = no
mode = overview
processtree = true
output_handles = false
static = true
strings = true
virustotal = true
deduplicate = true

[maec5]
enabled = no

[mongodb]
enabled = yes
host = 127.0.0.1
port = 27017
db = cuckoo
# Set those values if you are using mongodb authentication
# username =
# password =
# authsource = cuckoo

# Set this value if you are using mongodb with TLS enabled
# tlscafile =

# Automatically delete large dict values that exceed MongoDB's 16MB document limit
# Note: This only deletes dict keys from data stored in MongoDB. You would
# still get the full dataset if you parsed the results dict in another
# reporting module or from the jsondump module.
fix_large_docs = yes

# ES is not officially supported by the core devs and relies on the community
# Latest known working version is 7.16.2
# Use ElasticSearch as the "database" which powers Django.
# NOTE: If this is enabled, MongoDB should not be enabled, unless the
# searchonly option is set to yes; then Elasticsearch is only used for the /search web page.
[elasticsearchdb]
enabled = no
searchonly = no
host = 127.0.0.1
port = 9200
# The report data is indexed in the form of {{index-yyyy.mm.dd}}
# so the below index configuration option is actually an index 'prefix'.
index = cuckoo
# username =
# password =
# use_ssl =
# verify_certs =

[retention]
enabled = no
# run at most once every this many hours (unless reporting.conf is modified)
run_every = 6
# The number of days old a task must be before its data is deleted
# Set a value to 'no' to never delete it
memory = 14
procmemory = 62
pcap = 62
sortedpcap = 14
bsonlogs = 62
dropped = 62
screencaps = 62
reports = 62
mongo = 731
elastic = no

[syslog]
enabled = no
# IP of your syslog server/listener
host = x.x.x.x
# Port of your syslog server/listener
port = 514
# Protocol to send data over
protocol = tcp
# Store a logfile? [in reports directory]
logfile = yes
# if yes, what logname? [Default: syslog.txt]
logname = syslog.log

[moloch]
enabled = no
base = https://172.18.100.105:8005/
node = cuckoo3
capture = /data/moloch/bin/moloch-capture
captureconf = /data/moloch/etc/config.ini
user = admin
pass = admin
realm = Moloch

[resubmitexe]
enabled = no
resublimit = 5

[compression]
enabled = yes
zipmemdump = yes
zipmemstrings = yes
zipprocdump = yes
zipprocstrings = yes

[misp]
enabled = no
apikey =
url =
#Make event published after creation?
published = no
# minimal malscore, by default all
min_malscore = 0
# by default 5 threads
threads =
# this will retrieve information for iocs
# and activate misp report download from webgui
extend_context = no
# upload iocs from cuckoo to MISP
upload_iocs = no
distribution = 0
threat_level_id = 2
analysis = 2
# Sections to report
# Analysis ID will be appended, change
title = Iocs from cuckoo analysis:
network = no
ids_files = no
dropped = no
registry = no
mutexes = no

[callback]
enabled = no
# will send as post data {"task_id":X}
# can be comma-separated URLs
url = http://IP/callback

# Compress results including CAPE output
# to help avoid reaching the hard 16MB MongoDB limit.
[compressresults]
enabled = yes

[tmpfsclean]
enabled = no
key = tr_extractor

# This calls the specified command, pointing it at the report.json as
# well as setting $ENV{CAPE_TASK_ID} to the task ID of the run in question.
#
[zexecreport]
enabled=no
command=/foo/bar.pl

# run statistics; this may take more time.
[runstatistics]
enabled = no

[malheur]
enabled = no

Failure Logs


process.log

2023-04-11 03:59:01,801 [root] INFO: Processing analysis data
2023-04-11 04:46:01,318 [root] INFO: Processing analysis data for Task #10
2023-04-11 04:46:01,787 [Task 10] [vivisect.analysis] INFO: Vivisect Analysis Setup Hooks Complete
2023-04-11 04:46:01,788 [Task 10] [vivisect] INFO: Beginning analysis...
2023-04-11 04:46:01,788 [Task 10] [vivisect] INFO: Extended Analysis: vivisect.analysis.generic.entrypoints
2023-04-11 04:46:01,797 [Task 10] [envi.codeflow] WARNING: parseOpcode error at 0x0040165a (addCodeFlow(0x4013a0)): InvalidInstruction("'ffff080000000100000002000000e900' at 0x40165a")
2023-04-11 04:46:01,802 [Task 10] [vivisect.tools.graphutil] WARNING: FB is None in graph building!??!
2023-04-11 04:46:01,802 [Task 10] [vivisect.tools.graphutil] WARNING: (fva: 0x004013a0  fallva: 0x0040165a
2023-04-11 04:46:01,802 [Task 10] [vivisect.tools.graphutil] WARNING: FB is None in graph building!??!
2023-04-11 04:46:01,802 [Task 10] [vivisect.tools.graphutil] WARNING: (fva: 0x004013a0  fallva: 0x0040165a
2023-04-11 04:46:01,803 [Task 10] [vivisect.analysis.generic.noret] INFO: Marking 0x004013a0 as no return
2023-04-11 04:46:01,803 [Task 10] [vivisect] INFO: Extended Analysis: vivisect.analysis.pe
2023-04-11 04:46:01,803 [Task 10] [vivisect] INFO: Extended Analysis: vivisect.analysis.generic.relocations
2023-04-11 04:46:01,803 [Task 10] [vivisect] INFO: Extended Analysis: vivisect.analysis.ms.vftables
2023-04-11 04:46:01,980 [Task 10] [vivisect] INFO: Extended Analysis: vivisect.analysis.generic.emucode
2023-04-11 04:46:01,980 [Task 10] [vivisect] INFO: emucode: 0 new functions defined (now total: 2)
2023-04-11 04:46:01,980 [Task 10] [vivisect] INFO: Extended Analysis: vivisect.analysis.i386.importcalls
2023-04-11 04:46:01,988 [Task 10] [vivisect] INFO: 0x00402695: Emulation Found 0x00401250 (from func: 0x00402680) via call eax
2023-04-11 04:46:01,997 [Task 10] [envi.codeflow] WARNING: parseOpcode error at 0x0041e2d9 (addCodeFlow(0x41db6c)): InvalidInstruction("'feffff518d95fcfeffff528d850cffff' at 0x41e2d9")
2023-04-11 04:46:02,698 [Task 10] [vivisect] INFO: Extended Analysis: vivisect.analysis.i386.golang
2023-04-11 04:46:02,698 [Task 10] [vivisect] INFO: Extended Analysis: vivisect.analysis.ms.localhints
2023-04-11 04:46:02,698 [Task 10] [vivisect] INFO: Extended Analysis: vivisect.analysis.generic.funcentries
2023-04-11 04:46:02,928 [Task 10] [vivisect.tools.graphutil] WARNING: FB is None in graph building!??!
2023-04-11 04:46:02,928 [Task 10] [vivisect.tools.graphutil] WARNING: (fva: 0x0041da40  fallva: 0x0041e2d9
2023-04-11 04:46:03,120 [Task 10] [vivisect] INFO: Extended Analysis: vivisect.analysis.ms.msvcfunc
2023-04-11 04:46:03,120 [Task 10] [vivisect] INFO: Extended Analysis: vivisect.analysis.generic.thunks
2023-04-11 04:46:03,120 [Task 10] [vivisect] INFO: Extended Analysis: vivisect.analysis.generic.strconst
2023-04-11 04:46:03,142 [Task 10] [vivisect] INFO: ...analysis complete! (1 sec)
2023-04-11 04:46:03,187 [Task 10] [vivisect] INFO: Percentage of discovered executable surface area: 34.0% (59893 / 176128)
2023-04-11 04:46:03,188 [Task 10] [vivisect] INFO:    Xrefs/Blocks/Funcs:                             (6129 / 833 / 15)
2023-04-11 04:46:03,188 [Task 10] [vivisect] INFO:    Locs,  Ops/Strings/Unicode/Nums/Ptrs/Vtables:   (16430:  16204 / 0 / 122 / 13 / 0 / 0)
2023-04-11 04:46:10,723 [Task 10] [modules.processing.network] WARNING: The PCAP file does not exist at path "/opt/CAPEv2/storage/analyses/10/dump.pcap"
2023-04-11 04:47:30,696 [root] INFO: Reports generation completed
kevoreilly commented 1 year ago

I think there is some confusion here. "Full process memory dumps" (procmemdump) is not the same thing as a full system dump of the VM memory, which is what volatility analysis requires.

The "Full process memory dumps" submission option causes each process to be dumped in full; the results are displayed in the "Process Memory" tab:

[screenshot: the "Process Memory" tab in the web interface]

I am not an expert on volatility analysis, as it is not a part of CAPE that I use. However, the full system dumps that volatility requires are enabled in cuckoo.conf as follows:

# Enable creation of memory dump of the analysis machine before shutting
# down. Even if turned off, this functionality can also be enabled at
# submission. Currently available for: VirtualBox and libvirt modules (KVM).
memory_dump = on

Volatility itself is then configured in memory.conf; a sketch of the kind of options it holds follows.
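
This is an illustrative sketch only; the exact section and key names should be checked against conf/memory.conf in your checkout, as they vary between CAPE versions:

[basic]
# housekeeping for the full VM memory dump (illustrative key;
# whether to delete the raw dump once volatility processing is done)
delete_memdump = no

# per-plugin toggles, one section per volatility plugin (illustrative)
[pslist]
enabled = yes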

doomedraven commented 1 year ago

What volatility version do you use? As you can see in the errors, there are two things: a "vol is not installed" message, and a render error because JsonRenderer can't be imported when volatility3 is missing or is the wrong version.

dell224 commented 1 year ago

Hi doomedraven,

It is volatility3 version 2.4.2.

I attempted to run the command suggested in the error, pip3 install volatility3 -U, and this is the result:

Defaulting to user installation because normal site-packages is not writeable
Requirement already satisfied: volatility3 in /usr/local/lib/python3.10/dist-packages (2.4.2)
Requirement already satisfied: pefile>=2017.8.1 in /usr/local/lib/python3.10/dist-packages (from volatility3) (2022.5.30)
Requirement already satisfied: future in /usr/lib/python3/dist-packages (from pefile>=2017.8.1->volatility3) (0.18.2)
doomedraven commented 7 months ago

Somehow I missed that. You can't install a package with plain pip if you are using Poetry; you need to install it into the Poetry environment with poetry run pip install volatility3.
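
A hedged sketch of the full fix, assuming CAPE was installed with the cape2.sh installer (so the checkout lives in /opt/CAPEv2, runs under Poetry, and the systemd unit names below are the installer defaults):

cd /opt/CAPEv2
# install volatility3 into the Poetry-managed virtualenv, not the user/system site-packages
poetry run pip install -U volatility3
# verify the import that produced the "Missed dependency" / JsonRenderer errors
poetry run python3 -c "from volatility3.cli.text_renderer import JsonRenderer; print('volatility3 OK')"
# restart the services so processing picks up the new package
sudo systemctl restart cape.service cape-processor.service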