kevoreilly / CAPEv2

Malware Configuration And Payload Extraction
https://capesandbox.com/analysis/

sample submission shows error "TypeError: 'NoneType' object is not iterable" #1506

Closed musman12362 closed 1 year ago

musman12362 commented 1 year ago

About accounts on capesandbox.com

This is open source and you are getting free support so be friendly!

Prerequisites

Please answer the following questions for yourself before submitting an issue.

Expected Behavior

Please describe the behavior you are expecting. If your x64 samples are stuck in pending, ensure that you set tags=x64 in the hypervisor conf for your x64 VMs.

I'm expecting that when I submit a sample file, CAPE should process it and provide me an analysis report, but when I submit a sample I get the error "TypeError: 'NoneType' object is not iterable".

When we click on the error, it shows details like this:

File "/opt/CAPEv2/web/submission/views.py", line 396, in index

    if status == "error":
        details["errors"].append({os.path.basename(path): task_ids_tmp})
    else:
        if web_conf.general.get("existent_tasks", False):
            records = perform_search("target_sha256", sha256, search_limit=5)
            for record in records:
                if record.get("target").get("file", {}).get("sha256"):
                    existent_tasks.setdefault(record["target"]["file"]["sha256"], []).append(record)
        details["task_ids"] = task_ids_tmp

elif task_category == "static":

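The failing statement is the for record in records: loop. perform_search() evidently comes back as None rather than a list when there is no search backend to query (e.g. MongoDB reporting disabled), and iterating over None raises exactly this TypeError. A minimal sketch of the failure mode and a defensive guard; the stub below is illustrative, not CAPE's actual implementation:

    # Illustrative stub only: the real perform_search() lives in CAPE's web code.
    # The point is just that it can return None instead of a list of results.
    def perform_search(term, value, search_limit=5):
        return None  # no MongoDB/Elasticsearch backend available

    records = perform_search("target_sha256", "0" * 64)

    # for record in records:       # TypeError: 'NoneType' object is not iterable
    for record in records or []:   # guard: treat None as "no results"
        print(record)

This matches the workaround given further down: either give the lookup a backend (enable MongoDB reporting) or stop it from running at all (existent_tasks = no).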
I'll also share a screenshot. In the backend terminal, CAPE shows that the sample is running on the guest machine; the terminal shows that the sample is submitted, processing runs, and a report is produced, but when I go to the web interface it shows nothing.

In the Recent tab it shows that the task is reported, but when I click on it for the analysis report it shows the error:

"ERROR :-( The specified analysis does not exist or not finished yet "

I want a solution, please help me out with this. I'll be very thankful to you.

Current Behavior

What is the current behavior?

When I submit a sample file I get the error "TypeError: 'NoneType' object is not iterable".

Failure Information (for bugs)

Please help provide information about the failure if this is a bug. If it is not a bug, please remove the rest of this template.

Steps to Reproduce

Please provide detailed steps for reproducing the issue.

  1. step 1
  2. step 2
  3. you get it...

Context

Please provide any relevant information about your setup. This is important in case the issue is not reproducible except for under certain conditions. Operating system version, bitness, installed software versions, test sample details/hash/binary (if applicable).

Git commit: Type git log | head -n1 to find out
OS version: Ubuntu 22.04 as host, Windows 10 as analysis machine

Failure Logs

2023-05-05 06:49:15,652 [lib.cuckoo.core.scheduler] INFO: Using "kvm" machine manager with max_analysis_count=10, max_machines_count=5, and max_vmstartup_count=5 2023-05-05 06:49:15,676 [lib.cuckoo.common.abstracts] DEBUG: Getting status for win10 2023-05-05 06:49:15,689 [lib.cuckoo.core.scheduler] INFO: Loaded 1 machine/s 2023-05-05 06:49:15,696 [lib.cuckoo.core.scheduler] INFO: Waiting for analysis tasks 2023-05-05 06:54:10,943 [lib.cuckoo.core.scheduler] DEBUG: Task #4: Processing task 2023-05-05 06:54:11,016 [lib.cuckoo.core.scheduler] INFO: Task #4: File already exists at '/opt/CAPEv2/storage/binaries/20c96b7d59f0c3cfaf7d4d712a8b7defb21d901dd53c910e27fdef15e85b0836' 2023-05-05 06:54:11,017 [lib.cuckoo.core.scheduler] INFO: Task #4: Starting analysis of FILE '/tmp/cuckoo-sflock/tmpw_v6pkh7/20c96b7d59f0c3cfaf7d.exe' 2023-05-05 06:54:11,030 [lib.cuckoo.core.scheduler] INFO: Task #4: acquired machine win10 (label=win10, arch=x64, platform=windows) 2023-05-05 06:54:11,073 [lib.cuckoo.core.resultserver] DEBUG: Task #4: The associated machine IP is 192.168.122.100 2023-05-05 06:54:11,135 [lib.cuckoo.common.abstracts] DEBUG: Starting machine win10 2023-05-05 06:54:11,136 [lib.cuckoo.common.abstracts] DEBUG: Getting status for win10 2023-05-05 06:54:11,172 [lib.cuckoo.common.abstracts] DEBUG: Using snapshot snapshot2 for virtual machine win10 2023-05-05 06:54:34,914 [lib.cuckoo.common.abstracts] DEBUG: Getting status for win10 2023-05-05 06:54:35,123 [lib.cuckoo.core.scheduler] INFO: Enabled route 'internet'. 2023-05-05 06:54:35,173 [modules.auxiliary.sniffer] ERROR: Tcpdump does not exist at path "/usr/sbin/tcpdump", network capture aborted 2023-05-05 06:54:35,174 [lib.cuckoo.core.plugins] DEBUG: Started auxiliary module: Sniffer 2023-05-05 06:54:35,192 [lib.cuckoo.core.guest] INFO: Task #4: Starting analysis on guest (id=win10, ip=192.168.122.100) 2023-05-05 06:54:35,539 [lib.cuckoo.core.guest] INFO: Task #4: Guest is running CAPE Agent 0.11 (id=win10, ip=192.168.122.100) 2023-05-05 06:54:37,853 [lib.cuckoo.core.guest] DEBUG: Task #4: Uploading analyzer to guest (id=win10, ip=192.168.122.100, size=26822422) 2023-05-05 06:54:58,210 [lib.cuckoo.core.guest] INFO: Task #4: Uploading support files to guest (id=win10, ip=192.168.122.100) 2023-05-05 06:54:58,210 [lib.cuckoo.core.guest] INFO: Task #4: Uploading script files to guest (id=win10, ip=192.168.122.100) 2023-05-05 06:55:03,514 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:55:04,339 [lib.cuckoo.core.resultserver] DEBUG: Task #4: live log analysis.log initialized 2023-05-05 06:55:08,579 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:55:10,842 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file aux/DigiSig.json 2023-05-05 06:55:10,845 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file aux/DigiSig.json of length: 116 2023-05-05 06:55:12,489 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file aux/usage.log 2023-05-05 06:55:12,489 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file aux/usage.log of length: 0 2023-05-05 06:55:13,645 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:55:14,731 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0001.jpg 2023-05-05 06:55:14,795 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0001.jpg of length: 167195 2023-05-05 
06:55:15,936 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0002.jpg 2023-05-05 06:55:15,963 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0002.jpg of length: 168973 2023-05-05 06:55:17,036 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0003.jpg 2023-05-05 06:55:17,181 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0003.jpg of length: 154038 2023-05-05 06:55:18,420 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0004.jpg 2023-05-05 06:55:18,448 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0004.jpg of length: 169234 2023-05-05 06:55:18,699 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:55:23,775 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:55:28,820 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:55:33,864 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:55:38,927 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:55:44,091 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:55:44,611 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0005.jpg 2023-05-05 06:55:44,717 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0005.jpg of length: 169392 2023-05-05 06:55:49,148 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:55:54,228 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:55:56,939 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0006.jpg 2023-05-05 06:55:56,988 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0006.jpg of length: 121532 2023-05-05 06:55:58,115 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0007.jpg 2023-05-05 06:55:58,193 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0007.jpg of length: 116828 2023-05-05 06:55:59,272 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:55:59,777 [lib.cuckoo.core.resultserver] DEBUG: Task #4 is sending a BSON stream for pid 7044 2023-05-05 06:56:00,432 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0008.jpg 2023-05-05 06:56:00,515 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0008.jpg of length: 147436 2023-05-05 06:56:01,674 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0009.jpg 2023-05-05 06:56:01,726 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0009.jpg of length: 160631 2023-05-05 06:56:02,845 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0010.jpg 2023-05-05 06:56:02,877 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0010.jpg of length: 148913 2023-05-05 06:56:03,067 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/30d9396e4f7d59aba9a79f0827c0316612f2519442685d9e1e6ccc5c1843ad2e 2023-05-05 06:56:03,070 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/30d9396e4f7d59aba9a79f0827c0316612f2519442685d9e1e6ccc5c1843ad2e of length: 47 
2023-05-05 06:56:03,690 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/de11664d22faf336380af6c0fc248edd41165cbb64fb1d02c765c8d07069bd64 2023-05-05 06:56:03,698 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/de11664d22faf336380af6c0fc248edd41165cbb64fb1d02c765c8d07069bd64 of length: 28354 2023-05-05 06:56:04,103 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0011.jpg 2023-05-05 06:56:04,159 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0011.jpg of length: 192403 2023-05-05 06:56:04,394 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:56:05,344 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0012.jpg 2023-05-05 06:56:05,533 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0012.jpg of length: 177074 2023-05-05 06:56:05,615 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/d969a386c474f3c765c7ea6458554c4ef60496eebd0be5a3f8c2896d7724e4d9 2023-05-05 06:56:05,618 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/d969a386c474f3c765c7ea6458554c4ef60496eebd0be5a3f8c2896d7724e4d9 of length: 139 2023-05-05 06:56:05,809 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/a82ad8d4b2e000aa2c25b071ddb73d2fcf8f3aa1bda9f81dc74300a02ea31789 2023-05-05 06:56:05,815 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/a82ad8d4b2e000aa2c25b071ddb73d2fcf8f3aa1bda9f81dc74300a02ea31789 of length: 12232 2023-05-05 06:56:06,399 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/869bdab0a7fd8975a959be403c846487981e4cda83d845e7edb592e38b5d7ea7 2023-05-05 06:56:06,405 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/869bdab0a7fd8975a959be403c846487981e4cda83d845e7edb592e38b5d7ea7 of length: 3092 2023-05-05 06:56:06,761 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0013.jpg 2023-05-05 06:56:06,821 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0013.jpg of length: 192892 2023-05-05 06:56:08,098 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0014.jpg 2023-05-05 06:56:08,166 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0014.jpg of length: 182792 2023-05-05 06:56:08,468 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/19d6c05433a446d8529639ff398674d7e1bd4c985150af16898ae05ddb8f574c 2023-05-05 06:56:08,469 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/19d6c05433a446d8529639ff398674d7e1bd4c985150af16898ae05ddb8f574c of length: 267 2023-05-05 06:56:08,980 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/b2086d437110803366a739d00b4351dbabc9b8e4e61343657fa9835dcced2c2c 2023-05-05 06:56:08,986 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/b2086d437110803366a739d00b4351dbabc9b8e4e61343657fa9835dcced2c2c of length: 9243 2023-05-05 06:56:09,335 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0015.jpg 2023-05-05 06:56:09,377 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0015.jpg of length: 195210 2023-05-05 06:56:09,445 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:56:09,582 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/5073b4c71f3fa49c3ba95e21f872c165dc1fdbb839b09d25046d6c00c62e1c06 2023-05-05 
06:56:09,583 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/5073b4c71f3fa49c3ba95e21f872c165dc1fdbb839b09d25046d6c00c62e1c06 of length: 2572 2023-05-05 06:56:10,164 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/ffbfd78261940f0175ae30e3d987e29660353b2c3503db3f8fa5901ce4886205 2023-05-05 06:56:10,167 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/ffbfd78261940f0175ae30e3d987e29660353b2c3503db3f8fa5901ce4886205 of length: 302 2023-05-05 06:56:10,539 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0016.jpg 2023-05-05 06:56:10,600 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0016.jpg of length: 202812 2023-05-05 06:56:10,903 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/c90bfb000e0df8edee7800e7e57fe83aece0cbd011a54c75eed4aa22a02e616e 2023-05-05 06:56:10,909 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/c90bfb000e0df8edee7800e7e57fe83aece0cbd011a54c75eed4aa22a02e616e of length: 5129 2023-05-05 06:56:11,655 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0017.jpg 2023-05-05 06:56:11,711 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0017.jpg of length: 203719 2023-05-05 06:56:11,877 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/e38126f60972c67cfac44208dfc4881dead0742cf1d9f4f3377de54e3760b214 2023-05-05 06:56:11,880 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/e38126f60972c67cfac44208dfc4881dead0742cf1d9f4f3377de54e3760b214 of length: 38770 2023-05-05 06:56:11,942 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/9ee0f8cbdc5c8fbf251454f19064cf0b17c23c8996b58855b6510ff71e16963e 2023-05-05 06:56:11,948 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/9ee0f8cbdc5c8fbf251454f19064cf0b17c23c8996b58855b6510ff71e16963e of length: 1165 2023-05-05 06:56:12,961 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0018.jpg 2023-05-05 06:56:13,092 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0018.jpg of length: 192751 2023-05-05 06:56:13,515 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/ec480dd6f1be618dba816e03298691a37016cfdc060d86a5a97f8824f6935b08 2023-05-05 06:56:13,518 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/ec480dd6f1be618dba816e03298691a37016cfdc060d86a5a97f8824f6935b08 of length: 1823 2023-05-05 06:56:14,207 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0019.jpg 2023-05-05 06:56:14,312 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0019.jpg of length: 202266 2023-05-05 06:56:14,364 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/f54800a26d4b5b5e8dba0d3a35afa7678f6bb465dc73d3fce666f05de4e7ca82 2023-05-05 06:56:14,367 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/f54800a26d4b5b5e8dba0d3a35afa7678f6bb465dc73d3fce666f05de4e7ca82 of length: 559 2023-05-05 06:56:14,592 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:56:15,422 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0020.jpg 2023-05-05 06:56:15,520 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0020.jpg of length: 192751 2023-05-05 06:56:16,105 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file 
CAPE/c23ef247acdfa2a865389eee031da8e7dc496f7ce84a83de4f4d35d88e7c3953 2023-05-05 06:56:16,106 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/c23ef247acdfa2a865389eee031da8e7dc496f7ce84a83de4f4d35d88e7c3953 of length: 1297 2023-05-05 06:56:16,640 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0021.jpg 2023-05-05 06:56:16,828 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0021.jpg of length: 187803 2023-05-05 06:56:17,977 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0022.jpg 2023-05-05 06:56:18,047 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0022.jpg of length: 191929 2023-05-05 06:56:19,170 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/c7f13332e1e31f29216476567a151e3944abf5ce0e8297466420d26de183f02a 2023-05-05 06:56:19,177 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/c7f13332e1e31f29216476567a151e3944abf5ce0e8297466420d26de183f02a of length: 11300 2023-05-05 06:56:19,240 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0023.jpg 2023-05-05 06:56:19,267 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0023.jpg of length: 178507 2023-05-05 06:56:19,677 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:56:20,384 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0024.jpg 2023-05-05 06:56:20,419 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0024.jpg of length: 184976 2023-05-05 06:56:24,722 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:56:29,389 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0025.jpg 2023-05-05 06:56:29,450 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0025.jpg of length: 184822 2023-05-05 06:56:29,770 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:56:34,911 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:56:40,044 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:56:45,095 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:56:50,170 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:56:51,083 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/ea9e39764862ee27d72e9aee10e8bbc48f80c882114fd2be4e877625dece22dd 2023-05-05 06:56:51,085 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/ea9e39764862ee27d72e9aee10e8bbc48f80c882114fd2be4e877625dece22dd of length: 1993 2023-05-05 06:56:51,730 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0026.jpg 2023-05-05 06:56:51,743 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0026.jpg of length: 187826 2023-05-05 06:56:52,795 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0027.jpg 2023-05-05 06:56:52,849 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0027.jpg of length: 188372
2023-05-05 06:56:52,907 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/0d93a6db6d07f31e4fc6ad66b5ca73e60539c7cb0587a0b57150470476d326d8 2023-05-05 06:56:52,911 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/0d93a6db6d07f31e4fc6ad66b5ca73e60539c7cb0587a0b57150470476d326d8 of length: 5296 2023-05-05 06:56:53,321 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/218af67f72359cd72626dd67bd8decfd64bdc700219872ea8a367e9adec4fda7 2023-05-05 06:56:53,323 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/218af67f72359cd72626dd67bd8decfd64bdc700219872ea8a367e9adec4fda7 of length: 4449 2023-05-05 06:56:53,924 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0028.jpg 2023-05-05 06:56:53,998 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0028.jpg of length: 188336 2023-05-05 06:56:54,950 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/505beb291d3e7ba256422c3eaed348a490ed262cc99cf00f975c8fd6cad91c14 2023-05-05 06:56:54,959 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/505beb291d3e7ba256422c3eaed348a490ed262cc99cf00f975c8fd6cad91c14 of length: 648 2023-05-05 06:56:55,073 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0029.jpg 2023-05-05 06:56:55,100 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0029.jpg of length: 190564 2023-05-05 06:56:55,215 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:56:55,286 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/1708d7719d5711f84b722a475ae642d1d4a4a55fee6c7ab928b8a1391eaaedf8 2023-05-05 06:56:55,292 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/1708d7719d5711f84b722a475ae642d1d4a4a55fee6c7ab928b8a1391eaaedf8 of length: 79707 2023-05-05 06:56:55,458 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/c018e4e43582fc9ead01ef3e4d0f29407d33d6c111d74d605179c29c379dc5ac 2023-05-05 06:56:55,460 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/c018e4e43582fc9ead01ef3e4d0f29407d33d6c111d74d605179c29c379dc5ac of length: 16478 2023-05-05 06:56:55,606 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/3f521a3a874e45846610b68640d2a719736c476a6d69d4f0adc1cc58e964d7c6 2023-05-05 06:56:55,608 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/3f521a3a874e45846610b68640d2a719736c476a6d69d4f0adc1cc58e964d7c6 of length: 16 2023-05-05 06:56:55,742 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/015ee8e68c5502891441c1747d5527af856573aa3304f475ca2d2bcad54d1ab2 2023-05-05 06:56:55,743 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/015ee8e68c5502891441c1747d5527af856573aa3304f475ca2d2bcad54d1ab2 of length: 168 2023-05-05 06:56:56,253 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0030.jpg 2023-05-05 06:56:56,284 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0030.jpg of length: 186902 2023-05-05 06:56:56,800 [lib.cuckoo.core.resultserver] DEBUG: Task #4 is sending a BSON stream for pid 3972 2023-05-05 06:56:57,451 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0031.jpg 2023-05-05 06:56:57,672 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0031.jpg of length: 163202 2023-05-05 06:56:57,681 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to 
upload file CAPE/7f2a395db9dcc38b2941e464eafb56e5529d39571be2b01e3094f15f58e6e276 2023-05-05 06:56:57,687 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/7f2a395db9dcc38b2941e464eafb56e5529d39571be2b01e3094f15f58e6e276 of length: 300 2023-05-05 06:56:57,691 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file procdump/c5ab602bbbb59d81477c09ce41fd4c1cbed204ceb6d62d42894e1779e39634b8 2023-05-05 06:56:57,710 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file procdump/c5ab602bbbb59d81477c09ce41fd4c1cbed204ceb6d62d42894e1779e39634b8 of length: 70656 2023-05-05 06:56:58,418 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/b175ae7d56f7603078af5bbaca562537e8d46af623974e44f8b0d00b66aaa2ac 2023-05-05 06:56:58,420 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/b175ae7d56f7603078af5bbaca562537e8d46af623974e44f8b0d00b66aaa2ac of length: 300 2023-05-05 06:56:58,420 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/74b2935b9dfe4ba397fd0507ae3c36abb77193041f2e7c43c55dc4e7b33de61c 2023-05-05 06:56:58,429 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/74b2935b9dfe4ba397fd0507ae3c36abb77193041f2e7c43c55dc4e7b33de61c of length: 57344 2023-05-05 06:56:58,777 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0032.jpg 2023-05-05 06:56:58,799 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0032.jpg of length: 188758 2023-05-05 06:56:58,997 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/8038aef40e3f8eaf89a1cf7666448452249638645fec5c2a386cbb3b2fdbf7c9 2023-05-05 06:56:59,011 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/8038aef40e3f8eaf89a1cf7666448452249638645fec5c2a386cbb3b2fdbf7c9 of length: 566784 2023-05-05 06:56:59,056 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/0487dda344642d3b6005ad176dc275a3da765b1658348e949f326236360739e7 2023-05-05 06:56:59,057 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/0487dda344642d3b6005ad176dc275a3da765b1658348e949f326236360739e7 of length: 300 2023-05-05 06:56:59,214 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/cde33a5884984ad35db25a6fae524cf8c64751cc175bf671a7c16715400e97e9 2023-05-05 06:56:59,272 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/cde33a5884984ad35db25a6fae524cf8c64751cc175bf671a7c16715400e97e9 of length: 250880 2023-05-05 06:56:59,324 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/98d0cd9310f89a044bf50e1315aa59da8eb6c51765d9b7d4b331d26166a351f3 2023-05-05 06:56:59,326 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/98d0cd9310f89a044bf50e1315aa59da8eb6c51765d9b7d4b331d26166a351f3 of length: 300 2023-05-05 06:56:59,470 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/3fcc497e73cb59063a11496c79f23c8b7cfd6c3e44e1ba0cf8a360343520ae16 2023-05-05 06:56:59,472 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/3fcc497e73cb59063a11496c79f23c8b7cfd6c3e44e1ba0cf8a360343520ae16 of length: 300 2023-05-05 06:56:59,781 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/e11320ffabe59a2595bfd2e9834aa346511c1b260d97e55c2b35728ed9e3f326 2023-05-05 06:56:59,784 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/e11320ffabe59a2595bfd2e9834aa346511c1b260d97e55c2b35728ed9e3f326 of length: 300 2023-05-05 06:56:59,905 [lib.cuckoo.core.resultserver] DEBUG: 
Task #4: Trying to upload file shots/0033.jpg 2023-05-05 06:57:00,020 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0033.jpg of length: 174988 2023-05-05 06:57:00,281 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/a5c3d14cea6657dbddd5dcd5dd423a0ae034ed0ad105f88c8fd134daf88063a8 2023-05-05 06:57:00,287 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/a5c3d14cea6657dbddd5dcd5dd423a0ae034ed0ad105f88c8fd134daf88063a8 of length: 300 2023-05-05 06:57:00,299 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:57:00,840 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/f3d2e2f5e9d5513b8a665e985c666e22016c13aea4dbadc93de7918fe4ff4b11 2023-05-05 06:57:00,846 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/f3d2e2f5e9d5513b8a665e985c666e22016c13aea4dbadc93de7918fe4ff4b11 of length: 300 2023-05-05 06:57:01,101 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0034.jpg 2023-05-05 06:57:01,133 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0034.jpg of length: 183543 2023-05-05 06:57:01,441 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file CAPE/a7d18d27201fe2caff1da2d756902b128d7dadff6da464be0d7398316938d4af 2023-05-05 06:57:01,442 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file CAPE/a7d18d27201fe2caff1da2d756902b128d7dadff6da464be0d7398316938d4af of length: 300 2023-05-05 06:57:02,248 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0035.jpg 2023-05-05 06:57:02,321 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0035.jpg of length: 182300 2023-05-05 06:57:03,387 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0036.jpg 2023-05-05 06:57:03,455 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0036.jpg of length: 187408 2023-05-05 06:57:04,584 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0037.jpg 2023-05-05 06:57:04,671 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0037.jpg of length: 182674 2023-05-05 06:57:05,343 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:57:05,467 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file files/6b86b273ff34fce19d6b804eff5a3f5747ada4eaa22f1d49c01e52ddb7875b4b 2023-05-05 06:57:05,470 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file files/6b86b273ff34fce19d6b804eff5a3f5747ada4eaa22f1d49c01e52ddb7875b4b of length: 1 2023-05-05 06:57:05,767 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0038.jpg 2023-05-05 06:57:05,883 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0038.jpg of length: 193406 2023-05-05 06:57:07,015 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0039.jpg 2023-05-05 06:57:07,089 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0039.jpg of length: 188370 2023-05-05 06:57:08,512 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0040.jpg 2023-05-05 06:57:08,591 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0040.jpg of length: 182060 2023-05-05 06:57:10,519 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:57:12,221 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0041.jpg 
2023-05-05 06:57:12,579 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0041.jpg of length: 189117 2023-05-05 06:57:13,682 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0042.jpg 2023-05-05 06:57:13,766 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0042.jpg of length: 190157 2023-05-05 06:57:14,906 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0043.jpg 2023-05-05 06:57:14,934 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0043.jpg of length: 191497 2023-05-05 06:57:15,613 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:57:15,996 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0044.jpg 2023-05-05 06:57:16,040 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0044.jpg of length: 192183 2023-05-05 06:57:20,583 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0045.jpg 2023-05-05 06:57:20,639 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0045.jpg of length: 190438 2023-05-05 06:57:20,686 [lib.cuckoo.core.guest] DEBUG: Task #4: Analysis is still running (id=win10, ip=192.168.122.100) 2023-05-05 06:57:21,711 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file shots/0046.jpg 2023-05-05 06:57:21,767 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file shots/0046.jpg of length: 181533 2023-05-05 06:57:22,099 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file files/7a40d47d3a4d89db89e8d307d03ed81598ecde55eda40ac09053beba6708c62c 2023-05-05 06:57:22,100 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file files/7a40d47d3a4d89db89e8d307d03ed81598ecde55eda40ac09053beba6708c62c of length: 755 2023-05-05 06:57:22,152 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Trying to upload file files/20c96b7d59f0c3cfaf7d4d712a8b7defb21d901dd53c910e27fdef15e85b0836 2023-05-05 06:57:22,209 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Uploaded file files/20c96b7d59f0c3cfaf7d4d712a8b7defb21d901dd53c910e27fdef15e85b0836 of length: 768512 2023-05-05 06:57:22,711 [lib.cuckoo.core.guest] INFO: Task #4: Analysis completed successfully (id=win10, ip=192.168.122.100) 2023-05-05 06:57:22,718 [lib.cuckoo.core.plugins] DEBUG: Stopped auxiliary module: Sniffer 2023-05-05 06:57:22,719 [lib.cuckoo.common.abstracts] DEBUG: Stopping machine win10 2023-05-05 06:57:22,719 [lib.cuckoo.common.abstracts] DEBUG: Getting status for win10 2023-05-05 06:57:25,455 [lib.cuckoo.common.abstracts] DEBUG: Getting status for win10 2023-05-05 06:57:25,467 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Stopped tracking machine 192.168.122.100 2023-05-05 06:57:25,467 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Cancel <Context for b'BSON'> 2023-05-05 06:57:25,467 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Cancel <Context for b'LOG'> 2023-05-05 06:57:25,467 [lib.cuckoo.core.resultserver] DEBUG: Task #4: Cancel <Context for b'BSON'> 2023-05-05 06:57:25,581 [lib.cuckoo.core.scheduler] INFO: Disabled route 'internet' 2023-05-05 06:57:25,605 [lib.cuckoo.core.scheduler] DEBUG: Task #4: Released database task with status True 2023-05-05 06:57:25,682 [lib.cuckoo.core.scheduler] INFO: Task #4: analysis procedure completed

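One separate problem is visible in the log above: the sniffer module aborts because tcpdump does not exist at the configured path, so no network capture is produced. A sketch of the fix, assuming Ubuntu: install tcpdump and make the [sniffer] path in auxiliary.conf match the real binary location:

    $ sudo apt install tcpdump
    $ which tcpdump      # e.g. /usr/bin/tcpdump or /usr/sbin/tcpdump

    # auxiliary.conf
    [sniffer]
    tcpdump = /usr/sbin/tcpdump   # set to whatever `which tcpdump` printed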

CatFoxVoyager commented 1 year ago

I have the same behavior this morning after a git pull.

ClaudioWayne commented 1 year ago

Hi Guys,

try the following config settings:

in web.conf: existent_tasks = no

in reporting.conf [mongodb]
enabled = yes
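
For reference, the two settings in context (a minimal sketch; the section names match the ones used by views.py and in the config dump later in this thread):

    # web.conf
    [general]
    existent_tasks = no

    # reporting.conf
    [mongodb]
    enabled = yes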

CatFoxVoyager commented 1 year ago

@ClaudioWayne thanks, it worked!

doomedraven commented 1 year ago

also do git pull

musman12362 commented 1 year ago

Hi Guys,

try the following config settings:

in web.conf: existent_tasks = no

in reporting.conf [mongodb] enabled = yes

Thank you, it worked for me.

musman12362 commented 1 year ago

Screenshot from 2023-05-08 05-04-23

db703fad83de57168d5cfb5320fa14e4eb7a0e5243884ebac061e02e52c00d48.zip

When I submit this file I get this result.

I'm facing this issue with this file, while many other files are working properly.

doomedraven commented 1 year ago

That is an ELF file, i.e. Linux; for that you need to have Linux enabled in web.conf.

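For reference, the toggle would look something like this (a minimal sketch, assuming the stock web.conf layout; check your copy for the exact section name):

    # web.conf
    [linux]
    enabled = yes
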
musman12362 commented 1 year ago

That is an ELF file, i.e. Linux; for that you need to have Linux enabled in web.conf.

Thank you for your guidance

doomedraven commented 1 year ago

You're welcome, but for those samples you need Linux VMs, and Linux analysis is pretty limited: there is no behavioral analysis, it is mainly useful for seeing network traffic, and it is not developed/maintained by the core devs.

yaoplusplus commented 1 year ago

Thanks a lot, I'd been stuck in this trap for nearly a week.

musman12362 commented 1 year ago

I'm trying to run cleaners.py but it's not running and shows an error; please help me resolve it. I'm attaching a screenshot just to show the commands used to run cleaners.py. Screenshot from 2023-05-18 05-15-04

doomedraven commented 1 year ago

Use CAPE properly, as the cape user, with poetry, and that's all.

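Something along these lines (a sketch, assuming the default /opt/CAPEv2 install seen in the traceback above and that cleaners.py lives under utils/; --help just lists the script's actual options):

    $ cd /opt/CAPEv2
    $ sudo -u cape poetry run python utils/cleaners.py --help
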
On Thu, 18 May 2023, 07:33, musman12362 wrote:

Here are my configuration files:

  1. cuckoo.conf

[cuckoo]

Which category of tasks do you want to analyze?

categories = static, pcap, url, file If turned on, Cuckoo will delete the original file after its analysis has been completed.

delete_original = off Archives are not deleted by default, as it extracts and "original file" become extracted file

delete_archive = on If turned on, Cuckoo will delete the copy of the original file in the local binaries repository after the analysis has finished. (On *nix this will also invalidate the file called "binary" in each analysis directory, as this is a symlink.)

delete_bin_copy = off Specify the name of the machinery module to use, this module will define the interaction between Cuckoo and your virtualization software of choice.

machinery = kvm Enable creation of memory dump of the analysis machine before shutting down. Even if turned off, this functionality can also be enabled at submission. Currently available for: VirtualBox and libvirt modules (KVM).

memory_dump = off When the timeout of an analysis is hit, the VM is just killed by default. For some long-running setups it might be interesting to terminate the moinitored processes before killing the VM so that connections are closed.

terminate_processes = off Enable automatically re-schedule of "broken" tasks each startup. Each task found in status "processing" is re-queued for analysis.

reschedule = off Fail "unserviceable" tasks as they are queued. Any task found that will never be analyzed based on the available analysis machines will have its status set to "failed".

fail_unserviceable = on Limit the amount of analysis jobs a Cuckoo process goes through. This can be used together with a watchdog to mitigate risk of memory leaks.

max_analysis_count = 10 Limit the number of concurrently executing analysis machines. This may be useful on systems with limited resources. Set to 0 to disable any limits.

max_machines_count = 5 Limit the amount of VMs that are allowed to start in parallel. Generally speaking starting the VMs is one of the more CPU intensive parts of the actual analysis. This option tries to avoid maxing out the CPU completely.

max_vmstartup_count = 5 Minimum amount of free space (in MB) available before starting a new task. This tries to avoid failing an analysis because the reports can't be written due out-of-diskspace errors. Setting this value to 0 disables the check. (Note: this feature is currently not supported under Windows.)

freespace = 50000 Process tasks, but not reach out of memory

freespace_processing = 15000 Temporary directory containing the files uploaded through Cuckoo interfaces (web.py, api.py, Django web interface).

tmppath = /tmp Delta in days from current time to set the guest clocks to for file analyses A negative value sets the clock back, a positive value sets it forward. The default of 0 disables this option Note that this can still be overridden by the per-analysis clock setting and it is not performed by default for URL analysis as it will generally result in SSL errors

daydelta = 0 Path to the unix socket for running root commands.

rooter = /tmp/cuckoo-rooter Enable if you want to see a DEBUG log periodically containing backlog of pending tasks, locked vs unlocked machines. NOTE: Enabling this feature adds 4 database calls every 10 seconds.

periodic_log = off Max filename length for submissions, before truncation. 196 is arbitrary.

max_len = 196 If it is greater than this, call truncate the filename further for sanitizing purposes. Length truncated to is controlled by sanitize_to_len. This is to prevent long filenames such as files named by hash.

sanitize_len = 32 sanitize_to_len = 24

[resultserver] The Result Server is used to receive in real time the behavioral logs produced by the analyzer. Specify the IP address of the host. The analysis machines should be able to contact the host through such address, so make sure it's valid. NOTE: if you set resultserver IP to 0.0.0.0 you have to set the option resultserver_ip for all your virtual machines in machinery configuration.

ip = 192.168.10.129 Specify a port number to bind the result server on.

port = 2042 Force the port chosen above, don't try another one (we can select another port dynamically if we can not bind this one, but that is not an option in some setups)

force_port = yes

pool_size = 0 Should the server write the legacy CSV format? (if you have any custom processing on those, switch this on)

store_csvs = off Maximum size of uploaded files from VM (screenshots, dropped files, log) The value is expressed in bytes, by default 100MB.

upload_max_size = 100000000 To enable trimming of huge binaries go to -> web.conf -> general -> enable_trim Prevent upload of files that passes upload_max_size?

do_upload_max_size = no

[processing] Set the maximum size of analyses generated files to process. This is used to avoid the processing of big files which may take a lot of processing time. The value is expressed in bytes, by default 200MB.

analysis_size_limit = 200000000 Enable or disable DNS lookups.

resolve_dns = on Enable or disable reverse DNS lookups This information currently is not displayed in the web interface

reverse_dns = off Enable PCAP sorting, needed for the connection content view in the web interface.

sort_pcap = on

[database] Specify the database connection string. Examples, see documentation for more: sqlite:///foo.db @.:5432/mydatabase @./mydatabase If empty, default is a SQLite in db/cuckoo.db. SQLite doens't support database upgrades! For production we strongly suggest go with PostgreSQL

connection = @.***:5432/cape Database connection timeout in seconds. If empty, default is set to 60 seconds.

timeout =

[timeouts] Set the default analysis timeout expressed in seconds. This value will be used to define after how many seconds the analysis will terminate unless otherwise specified at submission.

default = 200 Set the critical timeout expressed in (relative!) seconds. It will be added to the default timeout above and after this timeout is hit Cuckoo will consider the analysis failed and it will shutdown the machine no matter what. When this happens the analysis results will most likely be lost.

critical = 60 Maximum time to wait for virtual machine status change. For example when shutting down a vm. Default is 300 seconds.

vm_state = 300

[tmpfs] only if you using volatility to speedup IO mkdir -p /mnt/tmpfs mount -t tmpfs -o size=50g ramfs /mnt/tmpfs chown cape:cape /mnt/tmpfs vim /etc/fstab tmpfs /mnt/tmpfs tmpfs nodev,nosuid,noexec,nodiratime,size=50g 0 0 Add crontab with @reboot https://github.com/reboot chown cape:cape /mnt/tmpfs -R

enabled = off path = /mnt/tmpfs/ in mb

freespace = 2000

  2. auxiliary.conf

Requires dependencies of software in vm as by: https://www.fireeye.com/blog/threat-research/2016/02/greater_visibilityt.html Windows 7 SP1, .NET at least 4.5, powershell 5 preferly over v4 KB3109118 - Script block logging back port update for WMF4 x64 - https://cuckoo.sh/vmcloak/Windows6.1-KB3109118-v4-x64.msu x32 - https://cuckoo.sh/vmcloak/Windows6.1-KB3109118-v4-x86.msu KB2819745 - WMF 4 (Windows Management Framework version 4) update for Windows 7 x64 - https://cuckoo.sh/vmcloak/Windows6.1-KB2819745-x64-MultiPkg.msu x32 - https://cuckoo.sh/vmcloak/Windows6.1-KB2819745-x86-MultiPkg.msu KB3191566

  • https://www.microsoft.com/en-us/download/details.aspx?id=54616 You should create following registry entries reg add "HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\PowerShell\ModuleLogging\ModuleNames" /v /t REG_SZ /d /f /reg:64 reg add "HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\PowerShell\ScriptBlockLogging" /v EnableScriptBlockLogging /t REG_DWORD /d 00000001 /f /reg:64 reg add "HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\PowerShell\Transcription" /v EnableTranscripting /t REG_DWORD /d 00000001 /f /reg:64 reg add "HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\PowerShell\Transcription" /v OutputDirectory /t REG_SZ /d C:\PSTranscipts /f /reg:64 reg add "HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\PowerShell\Transcription" /v EnableInvocationHeader /t REG_DWORD /d 00000001 /f /reg:64 Modules to be enabled or not inside of the VM

[auxiliary_modules] browser = yes curtain = no digisig = yes disguise = yes evtx = no human_windows = yes human_linux = no procmon = no screenshots_windows = yes screenshots_linux = yes sysmon = no tlsdump = yes usage = no file_pickup = no permissions = no pre_script = no during_script = no stap = no filecollector = yes This is only useful in case you use KVM's dnsmasq. You need to change your range inside of analyzer/windows/modules/auxiliary/disguise.py. Disguise must be enabled

windows_static_route = no

[sniffer] Enable or disable the use of an external sniffer (tcpdump) [yes/no].

enabled = yes enable remote tcpdump support

remote = no host = @.*** Specify the path to your local installation of tcpdump. Make sure this path is correct.

tcpdump = /usr/bin/tcpdump Specify the network interface name on which tcpdump should monitor the traffic. Make sure the interface is active.

interface = ens33

virbr1

Specify a Berkeley packet filter to pass to tcpdump.

bpf = not arp

[gateways]

RTR1 = 192.168.1.254

RTR2 = 192.168.1.1

INETSIM = 192.168.1.2

[virustotaldl] adds an option in the web interface to upload samples via VirusTotal downloads for a comma-separated list of MD5/SHA1/SHA256 hashes

enabled = no note that unlike the VirusTotal processing module, the key required here is a Intelligence API key, not a Public API key

dlintelkey = SomeKeyWithDLAccess

dlpath = /tmp/

  3. processing.conf

Enable or disable the available processing modules [on/off].

If you add a custom processing module to your Cuckoo setup, you have to add a dedicated entry in this file, or it won't be executed. You can also add additional options under the section of your module and they will be available in your Python class. exclude files that doesn't match safe extension and ignore their files from processing inside of other modules like CAPE.py

[antiransomware] enabled = no ignore all files with extension found more than X

skip_number = 30

[curtain] enabled = no

[sysmon] enabled = no

[analysisinfo] enabled = yes FLARE capa -> to update rules utils/community.py -cr install -> cd /tmp && git clone --recurse-submodules https://github.com/fireeye/capa.git && cd capa && git submodule update --init rules && python -m pip3 install .

[flare_capa] enabled = no Generate it always or generate on demand only(user need to click button to generate it), still should be enabled to use this feature on demand

on_demand = no Analyze binary payloads

static = no Analyze CAPE payloads

cape = no Analyze ProcDump

procdump = no

[decompression] enabled = no

[dumptls] enabled = no

[behavior] enabled = yes Toggle specific modules within the BehaviorAnalysis class

anomaly = yes processtree = yes summary = yes enhanced = yes encryptedbuffers = yes Should the server use a compressed version of behavioural logs? This helps in saving space in Mongo, accelerates searchs and reduce the size of the final JSON report.

loop_detection = no The number of calls per process to process. 0 switches the limit off. 10000 api calls should be processed in less than 2 minutes

analysis_call_limit = 0 Use ram to boost processing speed. You will need more than 20GB of RAM for this feature. Please read "performance" section in the documentation.

ram_boost = no https://capev2.readthedocs.io/en/latest/usage/patterns_replacement.html

replace_patterns = no

[debug] enabled = yes

[detections] enabled = yes Signatures

behavior = yes yara = yes suricata = yes virustotal = yes clamav = no ... but this mechanism may still be switched on

[procmemory] enabled = yes strings = yes

[procmon] enabled = no

[memory] enabled = no

[usage] enabled = no

[network] enabled = yes sort_pcap = no DNS whitelisting to ignore domains/IPs configured in network.py This should be disabled when utilizing InetSim/Remnux as we end up resolving the IP from fakedns which would then remove all domains associated with that resolved IP

dnswhitelist = yes additional entries

dnswhitelist_file = extra/whitelist_domains.txt ipwhitelist = yes ipwhitelist_file = extra/whitelist_ips.txt Requires geoip2 and maxmind database

country_lookup = no Register and download for free from https://www.maxmind.com/

maxmind_database = data/GeoLite2-Country.mmdb

[url_analysis] enabled = yes Enable a WHOIS lookup for the target domain of a URL analyses

whois = yes

[strings] enabled = yes on_demand = no nullterminated_only = no minchars = 5

[trid] Specify the path to the trid binary to use for static analysis.

enabled = no identifier = data/trid/trid definitions = data/trid/triddefs.trd

[die] Detect it Easy

enabled = no binary = /usr/bin/diec

[virustotal] enabled = yes on_demand = no timeout = 60 remove empty detections

remove_empty = yes Add your VirusTotal API key here. The default API key, kindly provided by the VirusTotal team, should enable you with a sufficient throughput and while being shared with all our users, it shouldn't affect your use.

key = abc61e5746de30f16bbff4d3c8e4ea177b6f6be0da14b66e02543786a6fd9ad4 do_file_lookup = yes do_url_lookup = yes urlscrub = (^http://serw.clicksor.com/redir.php?url=|&InjectedParam=.+$ http://serw.clicksor.com/redir.php?url=%7C&InjectedParam=.+$)

[suricata] Notes on getting this to work check install_suricata function: https://github.com/doomedraven/Tools/blob/master/Sandbox/cape2.sh

enabled = yes

Runmode "cli" or "socket"

runmode = socket

Outputfiles

if evelog is specified, it will be used instead of the per-protocol log files

evelog = eve.json per-protocol log files

alertlog = alert.json

httplog = http.json

tlslog = tls.json

sshlog = ssh.json

dnslog = dns.json

fileslog = files-json.log filesdir = files Amount of text to carve from plaintext files (bytes)

buffer = 8192

Used for creating an archive of extracted files

7zbin = /usr/bin/7z zippass = infected

Runmode "cli" options

bin = /usr/bin/suricata conf = /etc/suricata/suricata.yaml

Runmode "socket" Options

socket_file = /tmp/suricata-command.socket

[cif] enabled = no url of CIF server

url = https://your-cif-server.com/api CIF API key

key = your-api-key-here time to wait for server to respond, in seconds

timeout = 60 minimum confidence level of returned results: 25=not confident, 50=automated, 75=somewhat confident, 85=very confident, 95=certain defaults to 85

confidence = 85 don't log queries by default, set to 'no' to log queries

nolog = yes max number of results per query

per_lookup_limit = 20 max number of queries per analysis

per_analysis_limit = 200

[CAPE] enabled = yes Ex targetinfo standalone module

targetinfo = yes Ex dropped standalone module

dropped = yes Ex procdump standalone module

procdump = yes Amount of text to carve from plaintext files (bytes)

buffer = 8192 Process files not bigger than value below in Mb. We saw that after 90Mb it has biggest delay

max_file_size = 90 Scan for UserDB.TXT signature matches

userdb_signature = no https://capev2.readthedocs.io/en/latest/usage/patterns_replacement.html

replace_patterns = no Deduplicate screenshots You need to install dependency ImageHash>=4.2.1

[deduplication] Available hashs functions: ahash: Average hash phash: Perceptual hash dhash: Difference hash whash-haar: Haar wavelet hash whash-db4: Daubechies wavelet hash

enabled = no hashmethod = ahash

[vba2graph] Mac - brew install graphviz Ubuntu - sudo apt-get install graphviz Arch - sudo pacman -S graphviz+ sudo pip3 install networkx>=2.1 graphviz>=0.8.4 pydot>=1.2.4

enabled = yes on_demand = yes ja3 finger print db with descriptions https://github.com/trisulnsm/trisul-scripts/blob/master/lua/frontend_scripts/reassembly/ja3/prints/ja3fingerprint.json

[ja3] ja3_path = data/ja3/ja3fingerprint.json

[maliciousmacrobot] https://maliciousmacrobot.readthedocs.io Install mmbot sudo pip3 install mmbot Create/Set required paths Populate benign_path and malicious_path with appropriate macro maldocs (try the tests/samples in the github) https://github.com/egaus/MaliciousMacroBot/tree/master/tests/samples Create modeldata.pickle with your maldocs (this does not append to the model, it overwrites it) mmb = MaliciousMacroBot(benign_path, malicious_path, model_path, retain_sample_contents=False) result = mmb.mmb_init_model(modelRebuild=True) Copy your model file and vocab.txt to your model_path

enabled = no benign_path = /opt/cuckoo/data/mmbot/benign malicious_path = /opt/cuckoo/data/mmbot/malicious model_path = /opt/cuckoo/data/mmbot/model

[xlsdeobf] pip3 install git+ https://github.com/DissectMalware/XLMMacroDeobfuscator.git

enabled = no on_demand = no

[boxjs] enabled = no timeout = 60 url = http://your_super_box_js:9000 Extractors

[mwcp] enabled = yes modules_path = modules/processing/parsers/mwcp/

[ratdecoders] enabled = yes modules_path = modules/processing/parsers/RATDecoders/

[malduck] enabled = yes modules_path = modules/processing/parsers/malduck/

[CAPE_extractors] enabled = yes Must ends with /

modules_path = modules/processing/parsers/CAPE/

[reversinglabs] enabled = no url = key =

[script_log_processing] enabled = yes Dump PE's overlay info

[overlay] enabled = no

[floss] enabled = no on_demand = yes static_strings = no stack_strings = yes decoded_strings = yes tight_strings = yes min_length = 5 Download FLOSS signatures from https://github.com/mandiant/flare-floss/tree/master/sigs

sigs_path = data/flare-signatures

  4. reporting.conf

Enable or disable the available reporting modules [on/off]. If you add a custom reporting module to your Cuckoo setup, you have to add a dedicated entry in this file, or it won't be executed. You can also add additional options under the section of your module and they will be available in your Python class.

[cents] enabled = no on_demand = no starting signature id for created Suricata rules

start_sid = 1000000

[mitre] enabled = no https://github.com/geekscrapy/binGraph requires -> apt-get install python-tk

[bingraph] enabled = yes on_demand = yes binary = yes geenrate bingraphs for cape/procdumps

cape = yes procdump = yes

[pcap2cert] enabled = yes

[litereport] enabled = no keys_to_copy = CAPE procdump info signatures dropped static target network shot malscore ttps behavior_keys_to_copy = processtree summary

[reportbackup] enabled = no External service to use

googledrive = no Specify the ID of the shared Google Drive Folder where reports will be backed up to Replace folder ID with own Google Drive shared folder (share access to created service account) Without service account, upload process cannot complete due to browser not being able to launch

drive_folder_id = id_here drive_credentials_location = data/google_creds.json

[jsondump] enabled = yes indent = 4 encoding = latin-1

[reporthtml] Standalone report, not requires CAPE webgui

enabled = no Include screenshots in report

screenshots = no apicalls = no

[reporthtmlsummary] much smaller, faster report generation, omits API logs and is non-interactive

enabled = no Include screenshots in report

screenshots = no

[reportpdf] Note that this requires reporthtmlsummary to be enabled above as well

enabled = no

[maec41] enabled = no mode = overview processtree = true output_handles = false static = true strings = true virustotal = true deduplicate = true

[maec5] enabled = no

[mongodb] enabled = yes host = 127.0.0.1 port = 27017 db = cuckoo Set those values if you are using mongodb authentication username = password = authsource = cuckoo Set this value if you are using mongodb with TLS enabled tlscafile = Automatically delete large dict values that exceed mongos 16MB limitation Note: This only deletes dict keys from data stored in MongoDB. You would still get the full dataset if you parsed the results dict in another reporting module or from the jsondump module.

fix_large_docs = yes ES is not officially supported by core dev and relays on community Latest known working version is 7.16.2 Use ElasticSearch as the "database" which powers Django. NOTE: If this is enabled, MongoDB should not be enabled, unless search only option is set to yes. Then elastic search is only used for /search web page.

# ES is not officially supported by core dev and relies on the community.
# Latest known working version is 7.16.2.
# Use ElasticSearch as the "database" which powers Django.
# NOTE: If this is enabled, MongoDB should not be enabled, unless the
# searchonly option is set to yes; then ElasticSearch is only used for the
# /search web page.
[elasticsearchdb]
enabled = no
searchonly = no
host = 127.0.0.1
port = 9200
# The report data is indexed in the form {{index-yyyy.mm.dd}}, so the below
# index configuration option is actually an index 'prefix'.
index = cuckoo
username =
password =
use_ssl =
verify_certs =

[retention]
enabled = no
# Run at most once every this many hours (unless reporting.conf is modified)
run_every = 6
# How many days old a task must be before its data is deleted.
# Set a value to "no" to never delete it.
memory = 14
procmemory = 62
pcap = 62
sortedpcap = 14
bsonlogs = 62
dropped = 62
screencaps = 62
reports = 62
mongo = 731
elastic = no
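
Each of those per-artifact values gates an age check along these lines (a minimal sketch with made-up names):

```python
# "no" disables deletion for that artifact type; otherwise delete once
# the task is older than the configured number of days.
from datetime import datetime, timedelta

thresholds = {"memory": 14, "pcap": 62, "mongo": 731, "elastic": None}

def expired(completed_on, days):
    if days is None:  # configured as "no" -> never delete
        return False
    return datetime.now() - completed_on > timedelta(days=days)

print(expired(datetime(2023, 1, 1), thresholds["pcap"]))
```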

[syslog]
enabled = no
# IP of your syslog server/listener
host = x.x.x.x
# Port of your syslog server/listener
port = 514
# Protocol to send data over
protocol = tcp
# Store a logfile? [in reports directory]
logfile = yes
# If yes, what logname? [Default: syslog.txt]
logname = syslog.log
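
Functionally this is plain syslog forwarding; a minimal stand-alone sketch with Python's logging module (host and message are placeholders):

```python
# Send a result line to a syslog listener over TCP, mirroring the
# host/port/protocol options above (x.x.x.x is a placeholder).
import logging
import logging.handlers
import socket

handler = logging.handlers.SysLogHandler(
    address=("x.x.x.x", 514), socktype=socket.SOCK_STREAM  # protocol = tcp
)
logger = logging.getLogger("cape.syslog")
logger.addHandler(handler)
logger.warning("Task #4 reported: malscore=8.5")
```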

[moloch]
enabled = no
base = https://172.18.100.105:8005/
node = cuckoo3
capture = /data/moloch/bin/moloch-capture
captureconf = /data/moloch/etc/config.ini
user = admin
pass = admin
realm = Moloch

[resubmitexe]
enabled = no
resublimit = 5

[compression]
enabled = no
zipmemdump = yes
zipmemstrings = yes
zipprocdump = yes
zipprocstrings = yes

[misp]
enabled = no
apikey =
url =
# Make event published after creation?
published = no
# Minimal malscore; by default all
min_malscore = 0
# By default 5 threads
threads =
# This will retrieve information for IOCs and activate MISP report download from the web GUI
extend_context = no
# Upload IOCs from cuckoo to MISP
upload_iocs = no
distribution = 0
threat_level_id = 2
analysis = 2
# Sections to report. Analysis ID will be appended; change as needed.
title = Iocs from cuckoo analysis:
network = no
ids_files = no
dropped = no
registry = no
mutexes = no

[callback]
enabled = no
# Will send {"task_id": X} as POST data; can be comma-separated URLs
url = http://IP/callback
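
A sketch of the POST this performs (the URL is the placeholder from the config; error handling is illustrative):

```python
# Notify each configured URL that a task finished processing.
import requests

def callback(urls, task_id):
    for url in urls.split(","):
        try:
            requests.post(url.strip(), data={"task_id": task_id}, timeout=10)
        except requests.RequestException as exc:
            print(f"callback to {url} failed: {exc}")

callback("http://IP/callback", 4)
```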

# Compress results, including CAPE output, to help avoid reaching the hard
# 16MB MongoDB limit.
[compressresults]
enabled = no

[tmpfsclean]
enabled = no
key = tr_extractor

# This calls the specified command, pointing it at report.json as well as
# setting $ENV{CAPE_TASK_ID} to the task ID of the run in question.
[zexecreport]
enabled = no
command = /foo/bar.pl
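
For illustration, a Python stand-in for the configured command (replacing the /foo/bar.pl placeholder) that consumes both inputs named above:

```python
#!/usr/bin/env python3
# Hypothetical consumer: receives the report.json path as argv[1] and
# the task id via the CAPE_TASK_ID environment variable.
import json
import os
import sys

task_id = os.environ.get("CAPE_TASK_ID")
with open(sys.argv[1]) as fh:
    report = json.load(fh)
print(f"task {task_id}: malscore={report.get('malscore')}")
```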

# Run statistics; this may take more time.
[runstatistics]
enabled = no

[malheur]
enabled = no

  1. web.conf

# Enable Django authentication/signup for the website
[web_auth]
enabled = no
# You will also need the django admin; create it by running:
#   poetry run python manage.py createsuperuser
# ReCaptcha-protected admin login
captcha = no
2fa = no

# To enable OAuth, check https://django-allauth.readthedocs.io and web/web/settings.py.
[registration]
enabled = no
manual_approve = yes
email_required = yes
email_confirmation = yes
email_prefix_subject = "[CAPE Sandbox]"
email_host = ""
email_user = ""
email_password = ""
email_port = 465
use_ssl = 0
use_tls = 0
captcha_enabled = no
# Do you want to ban temporary email services?
disposable_email_disable = yes
disposable_domain_list = data/safelist/disposable_domain_list.txt

[general]
max_sample_size = 30000000
# Try to trim huge binaries that are bigger than max_sample_size, or enable
# allow_ignore_size and specify that option
enable_trim = no
# Required to be enabled, with the option set to ignore_size_check=1
allow_ignore_size = no
# Number of results to show on the web GUI for a search action.
# Intermediate solution; the ideal solution is pagination with a cursor,
# .skip(X).limit(Y) -- see the sketch below.
search_limit = 50
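
A sketch of that cursor-style pagination with pymongo (the collection and field names are illustrative, not necessarily CAPE's actual schema):

```python
# Page through search results instead of capping them at search_limit.
from pymongo import MongoClient

db = MongoClient("127.0.0.1", 27017)["cuckoo"]

def search_page(sha256, page=0, per_page=50):
    cursor = (
        db.analysis.find({"target.file.sha256": sha256})
        .skip(page * per_page)
        .limit(per_page)
    )
    return list(cursor)
```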

# Allow anonymous users to browse the site, but not submit/download
anon_viewable = no
existent_tasks = yes
top_detections = yes
# Hostname of the CAPE instance
hostname = https://127.0.0.1/
;hostname = https://www.capesandbox.com/
# Check if a config exists, or try to extract it, before accepting the task as static
check_config_exists = no
# Assign architecture to the task to fetch the correct VM type
dynamic_arch_determination = yes
# Assign platform to the task to fetch the correct VM type
dynamic_platform_determination = yes
# Allow report downloads only for specific users; needs to be activated in
# the user profile (select the checkbox near "Reports") and set to "no" here
reports_dl_allowed_to_all = yes
# Expose the per-task process log if enabled
expose_process_log = no

# Ratelimit for anon users
[ratelimit]
enabled = no
rps = 1/rps
rpm = 5/rpm

# Show submit to all VMs on the web GUI
[all_vms]
enabled = no

[admin]
enabled = no

[comments]
enabled = no

# Enable linux fields on the web GUI
[linux]
# For advanced users only; can be buggy. Linux analysis is a work in
# progress, for fun.
enabled = no

[malscore]
enabled = yes

# Don't forget to set the VT key in aux.conf under virustotaldl
[vtupload]
enabled = no

# No means delete is disabled on the web GUI
[delete]
enabled = no

# Dl'n'Exec analysis tab on submission
[dlnexec]
enabled = no

# URL analysis tab on submission
[url_analysis]
enabled = yes

# TLP markings on submission and the web GUI
[tlp]
enabled = no

# AMSI dump submission checkbox: can be useful to disable if no Win10+ instances
# (amsidump is enabled by default in the monitor for Win10+)

[amsidump]
enabled = yes

# Limitation for public instances; the API has no limits
[public]
enabled = no
priority = 1
timeout = 300

# Disable duplicated submissions for X hours
[uniq_submission]
enabled = no
hours = 24
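
The dedup window amounts to a check like this (a sketch with an in-memory dict standing in for CAPE's database):

```python
# Reject a sample if the same SHA-256 was submitted within the window.
from datetime import datetime, timedelta

last_seen = {}  # sha256 -> last submission time (stand-in for the DB)

def allow_submission(sha256, hours=24):
    now = datetime.now()
    last = last_seen.get(sha256)
    if last and now - last < timedelta(hours=hours):
        return False  # duplicate inside the window
    last_seen[sha256] = now
    return True
```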

# All providers can be found here:
# https://django-allauth.readthedocs.io/en/latest/providers.html
[oauth]
amazon = no
github = no
gitlab = no
twitter = no

[display_browser_martians]
enabled = no

[display_office_martians]
enabled = no

[display_shrike]
enabled = no

# Displays custom tags, if set during sample submission
[display_task_tags]
enabled = no

# Displays package, custom field, malfamily, clamav, PCAP link, and extended
# suricata results
[expanded_dashboard]
enabled = no

[display_et_portal]
enabled = no

[display_pt_portal]
enabled = no

[zipped_download]
enabled = yes
zip_pwd = infected
# Allow downloading of all Dropped/Procdump/etc
download_all = no

[evtx_download]
enabled = no

[pre_script]
enabled = yes

[during_script]
enabled = yes

[web_reporting]
enabled = no

[guacamole]
enabled = no
mode = vnc
username =
password =
guacd_host = localhost
guacd_port = 4822
# You might need to add your server IP to ALLOWED_HOSTS in
# web/web/settings.py if it is not ["*"] -- see the sketch below.
# vnc or rdp
guest_protocol = vnc
guacd_recording_path = /opt/CAPEv2/storage/guacrecordings
guest_width = 1280
guest_height = 1024
# rdp settings
guest_rdp_port = 3389
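
For reference, the relevant Django setting looks like this (the addresses are placeholders; "*" disables host checking entirely, which is convenient but not recommended on exposed instances):

```python
# web/web/settings.py (sketch): hosts Django will serve.
ALLOWED_HOSTS = ["192.168.1.50", "127.0.0.1"]  # or ["*"] to accept any
```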

[packages]
# VM tags may be used to specify on which guest machines a sample should be
# run. NOTE: one of the following OS version tags MUST be included for
# Windows VMs: winxp, win7, win8, win10, win11. Some samples will only
# detonate on specific versions of Windows (see web.conf packages for more
# info). Example: MSIX - Windows >= 10
msix = win10,win11


musman12362 commented 1 year ago


Thank you for clearing up my confusion. I read the documentation and it didn't mention that we have to use poetry to run this script, which is why I was executing it directly.

doomedraven commented 1 year ago

You need to use poetry for anything in CAPE now; some parts of the docs are sadly outdated.

El jue, 18 may 2023 8:39, musman12362 @.***> escribió:

Use cape properly as cape user, with poetry and thats all El jue, 18 may 2023 7:33, musman12362 @.

*> escribió: … <#m-7659273266048413530> I'm trying to run cleaners.py but its not running and showing error please help me how i can resolve this error . I'm attaching screenshot just to show commands that is used to run cleaners.py. [image: Screenshot from 2023-05-18 05-15-04] https://user-images.githubusercontent.com/113498180/239147653-fc643343-a3f0-4940-96b5-d2a15b1362fc.png https://user-images.githubusercontent.com/113498180/239147653-fc643343-a3f0-4940-96b5-d2a15b1362fc.png here are my configuration files 1. cuckoo.conf [cuckoo] Which category of tasks do you want to analyze? categories = static, pcap, url, file If turned on, Cuckoo will delete the original file after its analysis has been completed. delete_original = off Archives are not deleted by default, as it extracts and "original file" become extracted file delete_archive = on If turned on, Cuckoo will delete the copy of the original file in the local binaries repository after the analysis has finished. (On nix this will also invalidate the file called "binary" in each analysis directory, as this is a symlink.) delete_bin_copy = off Specify the name of the machinery module to use, this module will define the interaction between Cuckoo and your virtualization software of choice. machinery = kvm Enable creation of memory dump of the analysis machine before shutting down. Even if turned off, this functionality can also be enabled at submission. Currently available for: VirtualBox and libvirt modules (KVM). memory_dump = off When the timeout of an analysis is hit, the VM is just killed by default. For some long-running setups it might be interesting to terminate the moinitored processes before killing the VM so that connections are closed. terminate_processes = off Enable automatically re-schedule of "broken" tasks each startup. Each task found in status "processing" is re-queued for analysis. reschedule = off Fail "unserviceable" tasks as they are queued. Any task found that will never be analyzed based on the available analysis machines will have its status set to "failed". fail_unserviceable = on Limit the amount of analysis jobs a Cuckoo process goes through. This can be used together with a watchdog to mitigate risk of memory leaks. max_analysis_count = 10 Limit the number of concurrently executing analysis machines. This may be useful on systems with limited resources. Set to 0 to disable any limits. max_machines_count = 5 Limit the amount of VMs that are allowed to start in parallel. Generally speaking starting the VMs is one of the more CPU intensive parts of the actual analysis. This option tries to avoid maxing out the CPU completely. max_vmstartup_count = 5 Minimum amount of free space (in MB) available before starting a new task. This tries to avoid failing an analysis because the reports can't be written due out-of-diskspace errors. Setting this value to 0 disables the check. (Note: this feature is currently not supported under Windows.) freespace = 50000 Process tasks, but not reach out of memory freespace_processing = 15000 Temporary directory containing the files uploaded through Cuckoo interfaces (web.py, api.py, Django web interface). tmppath = /tmp Delta in days from current time to set the guest clocks to for file analyses A negative value sets the clock back, a positive value sets it forward. 
The default of 0 disables this option Note that this can still be overridden by the per-analysis clock setting and it is not performed by default for URL analysis as it will generally result in SSL errors daydelta = 0 Path to the unix socket for running root commands. rooter = /tmp/cuckoo-rooter Enable if you want to see a DEBUG log periodically containing backlog of pending tasks, locked vs unlocked machines. NOTE: Enabling this feature adds 4 database calls every 10 seconds. periodic_log = off Max filename length for submissions, before truncation. 196 is arbitrary. max_len = 196 If it is greater than this, call truncate the filename further for sanitizing purposes. Length truncated to is controlled by sanitize_to_len. This is to prevent long filenames such as files named by hash. sanitize_len = 32 sanitize_to_len = 24 [resultserver] The Result Server is used to receive in real time the behavioral logs produced by the analyzer. Specify the IP address of the host. The analysis machines should be able to contact the host through such address, so make sure it's valid. NOTE: if you set resultserver IP to 0.0.0.0 you have to set the option resultserver_ip for all your virtual machines in machinery configuration. ip = 192.168.10.129 Specify a port number to bind the result server on. port = 2042 Force the port chosen above, don't try another one (we can select another port dynamically if we can not bind this one, but that is not an option in some setups) force_port = yes pool_size = 0 Should the server write the legacy CSV format? (if you have any custom processing on those, switch this on) store_csvs = off Maximum size of uploaded files from VM (screenshots, dropped files, log) The value is expressed in bytes, by default 100MB. upload_max_size = 100000000 To enable trimming of huge binaries go to -> web.conf -> general -> enable_trim Prevent upload of files that passes upload_max_size? do_upload_max_size = no [processing] Set the maximum size of analyses generated files to process. This is used to avoid the processing of big files which may take a lot of processing time. The value is expressed in bytes, by default 200MB. analysis_size_limit = 200000000 Enable or disable DNS lookups. resolve_dns = on Enable or disable reverse DNS lookups This information currently is not displayed in the web interface reverse_dns = off Enable PCAP sorting, needed for the connection content view in the web interface. sort_pcap = on [database] Specify the database connection string. Examples, see documentation for more: sqlite:///foo.db @.:5432/mydatabase @./mydatabase If empty, default is a SQLite in db/cuckoo.db. SQLite doens't support database upgrades! For production we strongly suggest go with PostgreSQL connection = @.:5432/cape Database connection timeout in seconds. If empty, default is set to 60 seconds. timeout = [timeouts] Set the default analysis timeout expressed in seconds. This value will be used to define after how many seconds the analysis will terminate unless otherwise specified at submission. default = 200 Set the critical timeout expressed in (relative!) seconds. It will be added to the default timeout above and after this timeout is hit Cuckoo will consider the analysis failed and it will shutdown the machine no matter what. When this happens the analysis results will most likely be lost. critical = 60 Maximum time to wait for virtual machine status change. For example when shutting down a vm. Default is 300 seconds. 
vm_state = 300 [tmpfs] only if you using volatility to speedup IO mkdir -p /mnt/tmpfs mount -t tmpfs -o size=50g ramfs /mnt/tmpfs chown cape:cape /mnt/tmpfs vim /etc/fstab tmpfs /mnt/tmpfs tmpfs nodev,nosuid,noexec,nodiratime,size=50g 0 0 Add crontab with @reboot https://github.com/reboot https://github.com/reboot https://github.com/reboot chown cape:cape /mnt/tmpfs -R enabled = off path = /mnt/tmpfs/ in mb freespace = 2000 1. auxiliary.conf Requires dependencies of software in vm as by: https://www.fireeye.com/blog/threat-research/2016/02/greater_visibilityt.html https://www.fireeye.com/blog/threat-research/2016/02/greater_visibilityt.html Windows 7 SP1, .NET at least 4.5, powershell 5 preferly over v4 KB3109118 - Script block logging back port update for WMF4 x64 - https://cuckoo.sh/vmcloak/Windows6.1-KB3109118-v4-x64.msu https://cuckoo.sh/vmcloak/Windows6.1-KB3109118-v4-x64.msu x32 - https://cuckoo.sh/vmcloak/Windows6.1-KB3109118-v4-x86.msu https://cuckoo.sh/vmcloak/Windows6.1-KB3109118-v4-x86.msu KB2819745 - WMF 4 (Windows Management Framework version 4) update for Windows 7 x64 - https://cuckoo.sh/vmcloak/Windows6.1-KB2819745-x64-MultiPkg.msu https://cuckoo.sh/vmcloak/Windows6.1-KB2819745-x64-MultiPkg.msu x32 - https://cuckoo.sh/vmcloak/Windows6.1-KB2819745-x86-MultiPkg.msu https://cuckoo.sh/vmcloak/Windows6.1-KB2819745-x86-MultiPkg.msu KB3191566

  • https://www.microsoft.com/en-us/download/details.aspx?id=54616 https://www.microsoft.com/en-us/download/details.aspx?id=54616 You should create following registry entries reg add "HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\PowerShell\ModuleLogging\ModuleNames" /v /t REG_SZ /d /f /reg:64 reg add "HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\PowerShell\ScriptBlockLogging" /v EnableScriptBlockLogging /t REG_DWORD /d 00000001 /f /reg:64 reg add "HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\PowerShell\Transcription" /v EnableTranscripting /t REG_DWORD /d 00000001 /f /reg:64 reg add "HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\PowerShell\Transcription" /v OutputDirectory /t REG_SZ /d C:\PSTranscipts /f /reg:64 reg add "HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\PowerShell\Transcription" /v EnableInvocationHeader /t REG_DWORD /d 00000001 /f /reg:64 Modules to be enabled or not inside of the VM [auxiliary_modules] browser = yes curtain = no digisig = yes disguise = yes evtx = no human_windows = yes human_linux = no procmon = no screenshots_windows = yes screenshots_linux = yes sysmon = no tlsdump = yes usage = no file_pickup = no permissions = no pre_script = no during_script = no stap = no filecollector = yes This is only useful in case you use KVM's dnsmasq. You need to change your range inside of analyzer/windows/modules/auxiliary/disguise.py. Disguise must be enabled windows_static_route = no [sniffer] Enable or disable the use of an external sniffer (tcpdump) [yes/no]. enabled = yes enable remote tcpdump support remote = no host = @.*** Specify the path to your local installation of tcpdump. Make sure this path is correct. tcpdump = /usr/bin/tcpdump Specify the network interface name on which tcpdump should monitor the traffic. Make sure the interface is active. interface = ens33

    virbr1 Specify a Berkeley packet filter to pass to tcpdump. bpf = not arp

    [gateways] #RTR1 = 192.168.1.254 #RTR2 = 192.168.1.1 #INETSIM = 192.168.1.2 [virustotaldl] adds an option in the web interface to upload samples via VirusTotal downloads for a comma-separated list of MD5/SHA1/SHA256 hashes enabled = no note that unlike the VirusTotal processing module, the key required here is a Intelligence API key, not a Public API key #dlintelkey = SomeKeyWithDLAccess dlpath = /tmp/ 1. processing.conf Enable or disable the available processing modules [on/off]. If you add a custom processing module to your Cuckoo setup, you have to add a dedicated entry in this file, or it won't be executed. You can also add additional options under the section of your module and they will be available in your Python class. exclude files that doesn't match safe extension and ignore their files from processing inside of other modules like CAPE.py [antiransomware] enabled = no ignore all files with extension found more than X skip_number = 30 [curtain] enabled = no [sysmon] enabled = no [analysisinfo] enabled = yes FLARE capa -> to update rules utils/community.py -cr install -> cd /tmp && git clone --recurse-submodules https://github.com/fireeye/capa.git && cd capa && git submodule update --init rules && python -m pip3 install . [flare_capa] enabled = no Generate it always or generate on demand only(user need to click button to generate it), still should be enabled to use this feature on demand on_demand = no Analyze binary payloads static = no Analyze CAPE payloads cape = no Analyze ProcDump procdump = no [decompression] enabled = no [dumptls] enabled = no [behavior] enabled = yes Toggle specific modules within the BehaviorAnalysis class anomaly = yes processtree = yes summary = yes enhanced = yes encryptedbuffers = yes Should the server use a compressed version of behavioural logs? This helps in saving space in Mongo, accelerates searchs and reduce the size of the final JSON report. loop_detection = no The number of calls per process to process. 0 switches the limit off. 10000 api calls should be processed in less than 2 minutes analysis_call_limit = 0 Use ram to boost processing speed. You will need more than 20GB of RAM for this feature. Please read "performance" section in the documentation. ram_boost = no https://capev2.readthedocs.io/en/latest/usage/patterns_replacement.html replace_patterns = no [debug] enabled = yes [detections] enabled = yes Signatures behavior = yes yara = yes suricata = yes virustotal = yes clamav = no ... but this mechanism may still be switched on [procmemory] enabled = yes strings = yes [procmon] enabled = no [memory] enabled = no [usage] enabled = no [network] enabled = yes sort_pcap = no DNS whitelisting to ignore domains/IPs configured in network.py This should be disabled when utilizing InetSim/Remnux as we end up resolving the IP from fakedns which would then remove all domains associated with that resolved IP dnswhitelist = yes additional entries dnswhitelist_file = extra/whitelist_domains.txt ipwhitelist = yes ipwhitelist_file = extra/whitelist_ips.txt Requires geoip2 and maxmind database country_lookup = no Register and download for free from https://www.maxmind.com/ maxmind_database = data/GeoLite2-Country.mmdb [url_analysis] enabled = yes Enable a WHOIS lookup for the target domain of a URL analyses whois = yes [strings] enabled = yes on_demand = no nullterminated_only = no minchars = 5 [trid] Specify the path to the trid binary to use for static analysis. 
enabled = no identifier = data/trid/trid definitions = data/trid/triddefs.trd [die] Detect it Easy enabled = no binary = /usr/bin/diec [virustotal] enabled = yes on_demand = no timeout = 60 remove empty detections remove_empty = yes Add your VirusTotal API key here. The default API key, kindly provided by the VirusTotal team, should enable you with a sufficient throughput and while being shared with all our users, it shouldn't affect your use. key = abc61e5746de30f16bbff4d3c8e4ea177b6f6be0da14b66e02543786a6fd9ad4 do_file_lookup = yes do_url_lookup = yes urlscrub = (^ http://serw.clicksor.com/redir.php?url=|&InjectedParam=.+$ http://serw.clicksor.com/redir.php?url=%7C&InjectedParam=.+%24 < http://serw.clicksor.com/redir.php?url=%7C&InjectedParam=.+$ http://serw.clicksor.com/redir.php?url=%7C&InjectedParam=.+%24>) [suricata] Notes on getting this to work check install_suricata function: https://github.com/doomedraven/Tools/blob/master/Sandbox/cape2.sh enabled = yes #Runmode "cli" or "socket" runmode = socket #Outputfiles if evelog is specified, it will be used instead of the per-protocol log files evelog = eve.json per-protocol log files #alertlog = alert.json #httplog = http.json

    tlslog = tls.json #sshlog = ssh.json #dnslog = dns.json fileslog =

    files-json.log filesdir = files Amount of text to carve from plaintext files (bytes) buffer = 8192 #Used for creating an archive of extracted files 7zbin = /usr/bin/7z zippass = infected ##Runmode "cli" options bin = /usr/bin/suricata conf = /etc/suricata/suricata.yaml ##Runmode "socket" Options socket_file = /tmp/suricata-command.socket [cif] enabled = no url of CIF server url = https://your-cif-server.com/api CIF API key key = your-api-key-here time to wait for server to respond, in seconds timeout = 60 minimum confidence level of returned results: 25=not confident, 50=automated, 75=somewhat confident, 85=very confident, 95=certain defaults to 85 confidence = 85 don't log queries by default, set to 'no' to log queries nolog = yes max number of results per query per_lookup_limit = 20 max number of queries per analysis per_analysis_limit = 200 [CAPE] enabled = yes Ex targetinfo standalone module targetinfo = yes Ex dropped standalone module dropped = yes Ex procdump standalone module procdump = yes Amount of text to carve from plaintext files (bytes) buffer = 8192 Process files not bigger than value below in Mb. We saw that after 90Mb it has biggest delay max_file_size = 90 Scan for UserDB.TXT signature matches userdb_signature = no https://capev2.readthedocs.io/en/latest/usage/patterns_replacement.html replace_patterns = no Deduplicate screenshots You need to install dependency ImageHash>=4.2.1 [deduplication] Available hashs functions: ahash: Average hash phash: Perceptual hash dhash: Difference hash whash-haar: Haar wavelet hash whash-db4: Daubechies wavelet hash enabled = no hashmethod = ahash [vba2graph] Mac - brew install graphviz Ubuntu - sudo apt-get install graphviz Arch - sudo pacman -S graphviz+ sudo pip3 install networkx>=2.1 graphviz>=0.8.4 pydot>=1.2.4 enabled = yes on_demand = yes ja3 finger print db with descriptions https://github.com/trisulnsm/trisul-scripts/blob/master/lua/frontend_scripts/reassembly/ja3/prints/ja3fingerprint.json [ja3] ja3_path = data/ja3/ja3fingerprint.json [maliciousmacrobot] https://maliciousmacrobot.readthedocs.io Install mmbot sudo pip3 install mmbot Create/Set required paths Populate benign_path and malicious_path with appropriate macro maldocs (try the tests/samples in the github) https://github.com/egaus/MaliciousMacroBot/tree/master/tests/samples Create modeldata.pickle with your maldocs (this does not append to the model, it overwrites it) mmb = MaliciousMacroBot(benign_path, malicious_path, model_path, retain_sample_contents=False) result = mmb.mmb_init_model(modelRebuild=True) Copy your model file and vocab.txt to your model_path enabled = no benign_path = /opt/cuckoo/data/mmbot/benign malicious_path = /opt/cuckoo/data/mmbot/malicious model_path = /opt/cuckoo/data/mmbot/model [xlsdeobf] pip3 install git+ https://github.com/DissectMalware/XLMMacroDeobfuscator.git enabled = no on_demand = no [boxjs] enabled = no timeout = 60 url = http://your_super_box_js:9000 Extractors [mwcp] enabled = yes modules_path = modules/processing/parsers/mwcp/ [ratdecoders] enabled = yes modules_path = modules/processing/parsers/RATDecoders/ [malduck] enabled = yes modules_path = modules/processing/parsers/malduck/ [CAPE_extractors] enabled = yes Must ends with / modules_path = modules/processing/parsers/CAPE/ [reversinglabs] enabled = no url = key = [script_log_processing] enabled = yes Dump PE's overlay info [overlay] enabled = no [floss] enabled = no on_demand = yes static_strings = no stack_strings = yes decoded_strings = yes tight_strings = yes min_length 
= 5
# Download FLOSS signatures from https://github.com/mandiant/flare-floss/tree/master/sigs
sigs_path = data/flare-signatures

reporting.conf:

# Enable or disable the available reporting modules [on/off].
# If you add a custom reporting module to your Cuckoo setup, you have to add a dedicated
# entry in this file, or it won't be executed.
# You can also add additional options under the section of your module and they will be
# available in your Python class.

[cents]
enabled = no
on_demand = no
# starting signature id for created Suricata rules
start_sid = 1000000

[mitre]
enabled = no

# https://github.com/geekscrapy/binGraph
# requires -> apt-get install python-tk
[bingraph]
enabled = yes
on_demand = yes
binary = yes
# generate bingraphs for cape/procdumps
cape = yes
procdump = yes

[pcap2cert]
enabled = yes

[litereport]
enabled = no
keys_to_copy = CAPE procdump info signatures dropped static target network shot malscore ttps
behavior_keys_to_copy = processtree summary

[reportbackup]
enabled = no
# External service to use
googledrive = no
# Specify the ID of the shared Google Drive Folder where reports will be backed up to.
# Replace folder ID with own Google Drive shared folder (share access to created service account).
# Without a service account, the upload process cannot complete due to the browser not being able to launch.
drive_folder_id = id_here
drive_credentials_location = data/google_creds.json

[jsondump]
enabled = yes
indent = 4
encoding = latin-1

[reporthtml]
# Standalone report, does not require CAPE webgui
enabled = no
# Include screenshots in report
screenshots = no
apicalls = no

[reporthtmlsummary]
# much smaller, faster report generation; omits API logs and is non-interactive
enabled = no
# Include screenshots in report
screenshots = no

[reportpdf]
# Note that this requires reporthtmlsummary to be enabled above as well
enabled = no

[maec41]
enabled = no
mode = overview
processtree = true
output_handles = false
static = true
strings = true
virustotal = true
deduplicate = true

[maec5]
enabled = no

[mongodb]
enabled = yes
host = 127.0.0.1
port = 27017
db = cuckoo
# Set those values if you are using mongodb authentication
username =
password =
authsource = cuckoo
# Set this value if you are using mongodb with TLS enabled
tlscafile =
# Automatically delete large dict values that exceed mongo's 16MB limitation.
# Note: This only deletes dict keys from data stored in MongoDB. You would still get
# the full dataset if you parsed the results dict in another reporting module or from
# the jsondump module.
fix_large_docs = yes

# ES is not officially supported by core dev and relies on community.
# Latest known working version is 7.16.2.
# Use ElasticSearch as the "database" which powers Django.
# NOTE: If this is enabled, MongoDB should not be enabled, unless the search-only option
# is set to yes. Then ElasticSearch is only used for the /search web page.
[elasticsearchdb]
enabled = no
searchonly = no
host = 127.0.0.1
port = 9200
# The report data is indexed in the form of {{index-yyyy.mm.dd}},
# so the below index configuration option is actually an index 'prefix'.
index = cuckoo
username =
password =
use_ssl =
verify_certs =

[retention]
enabled = no
# run at most once every this many hours (unless reporting.conf is modified)
run_every = 6
# The amount of days old a task needs to be before deleting data.
# Set a value to no to never delete it.
memory = 14
procmemory = 62
pcap = 62
sortedpcap = 14
bsonlogs = 62
dropped = 62
screencaps = 62
reports = 62
mongo = 731
elastic = no

[syslog]
enabled = no
# IP of your syslog server/listener
host = x.x.x.x
# Port of your syslog server/listener
port = 514
# Protocol to send data over
protocol = tcp
# Store a logfile? [in reports directory]
logfile = yes
# if yes, what logname? [Default: syslog.txt]
logname = syslog.log

[moloch]
enabled = no
base = https://172.18.100.105:8005/
node = cuckoo3
capture = /data/moloch/bin/moloch-capture
captureconf = /data/moloch/etc/config.ini
user = admin
pass = admin
realm = Moloch

[resubmitexe]
enabled = no
resublimit = 5

[compression]
enabled = no
zipmemdump = yes
zipmemstrings = yes
zipprocdump = yes
zipprocstrings = yes

[misp]
enabled = no
apikey =
url =
# Make event published after creation?
published = no
# minimal malscore, by default all
min_malscore = 0
# by default 5 threads
threads =
# this will retrieve information for iocs and activate misp report download from webgui
extend_context = no
# upload iocs from cuckoo to MISP
upload_iocs = no
distribution = 0
threat_level_id = 2
analysis = 2
# Sections to report. Analysis ID will be appended; change as needed.
title = Iocs from cuckoo analysis:
network = no
ids_files = no
dropped = no
registry = no
mutexes = no

[callback]
enabled = no
# will send as post data {"task_id":X}
# can be comma-separated urls
url = http://IP/callback

# Compress results including CAPE output to help avoid reaching the hard 16MB MongoDB limit.
[compressresults]
enabled = no

[tmpfsclean]
enabled = no
key = tr_extractor

# This calls the specified command, pointing it at the report.json as well as
# setting $ENV{CAPE_TASK_ID} to the task ID of the run in question.
[zexecreport]
enabled=no
command=/foo/bar.pl

# run statistics; this may take more time
[runstatistics]
enabled = no

[malheur]
enabled = no

web.conf:

# Enable Django authentication/signup for website
[web_auth]
enabled = no
# You will also need to add the django admin to make it work, by running:
# poetry run python manage.py createsuperuser
# ReCaptcha protected admin login
captcha = no
2fa = no

# To enable Oauth check https://django-allauth.readthedocs.io and web/web/settings.py.
[registration]
enabled = no
manual_approve = yes
email_required = yes
email_confirmation = yes
email_prefix_subject = "[CAPE Sandbox]"
email_host = ""
email_user = ""
email_password = ""
email_port = 465
use_ssl = 0
use_tls = 0
captcha_enabled = no
# Do you want to ban temporal email services?
disposable_email_disable = yes
disposable_domain_list = data/safelist/disposable_domain_list.txt

[general]
max_sample_size = 30000000
# Try to trim huge binaries that are bigger than max_sample_size,
# or enable allow_ignore_size and specify that option
enable_trim = no
# Required to be enabled and option set to ignore_size_check=1
allow_ignore_size = no
# Number of results to show on webgui on search action.
# Intermediate solution; the ideal solution is pagination with cursor .skip(X).limit(Y)
search_limit = 50
# Allow anon users to browse the site but not submit/download
anon_viewable = no
existent_tasks = yes
top_detections = yes
# hostname of the cape instance
hostname = https://127.0.0.1/
;hostname = https://www.capesandbox.com/
# Check if config exists or try to extract before accepting task as static
check_config_exists = no
# Assign architecture to task to fetch correct VM type
dynamic_arch_determination = yes
# Assign platform to task to fetch correct VM type
dynamic_platform_determination = yes
# Allow report downloads only to specific users: activate in the user profile by
# selecting the checkbox near "Reports", and set this to "no" here
reports_dl_allowed_to_all = yes
# Expose process log per task if enabled
expose_process_log = no

# ratelimit for anon users
[ratelimit]
enabled = no
rps = 1/rps
rpm = 5/rpm

# Show submit to all VMs on webgui
[all_vms]
enabled = no

[admin]
enabled = no

[comments]
enabled = no

# enable linux fields on webgui
[linux]
# For advanced users only, can be buggy; linux analysis is work in progress for fun
enabled = no

[malscore]
enabled = yes

[vtupload]
# Don't forget to set VT key in aux.conf under virustotaldl
enabled = no

# No means delete is disabled on webgui
[delete]
enabled = no

# Dl'n'Exec analysis tab on submission
[dlnexec]
enabled = no

# url analysis tab on submission
[url_analysis]
enabled = yes

# TLP markings on submission and webgui
[tlp]
enabled = no

# AMSI dump submission checkbox: can be useful to disable if no Win10+ instances
# (amsidump is enabled by default in the monitor for Win10+)
[amsidump]
enabled = yes

# Limitation for public instances; api has no limits
[public]
enabled = no
priority = 1
timeout = 300

# Disable duplicated submissions for X hours
[uniq_submission]
enabled = no
hours = 24

# All providers can be found here https://django-allauth.readthedocs.io/en/latest/providers.html
[oauth]
amazon = no
github = no
gitlab = no
twitter = no

[display_browser_martians]
enabled = no

[display_office_martians]
enabled = no

[display_shrike]
enabled = no

[display_task_tags]
# displays custom tags, if set during sample submission
enabled = no

[expanded_dashboard]
# displays package, custom field, malfamily, clamav, PCAP link, and extended suricata results
enabled = no

[display_et_portal]
enabled = no

[display_pt_portal]
enabled = no

[zipped_download]
enabled = yes
zip_pwd = infected
# Allow to download all Dropped/Procdump/etc
download_all = no

[evtx_download]
enabled = no

[pre_script]
enabled = yes

[during_script]
enabled = yes

[web_reporting]
enabled = no

[guacamole]
enabled = no
mode = vnc
username =
password =
guacd_host = localhost
guacd_port = 4822
# You might need to add your server IP to ALLOWED_HOSTS in web/web/settings.py if it is not ["*"]
# vnc or rdp
guest_protocol = vnc
guacd_recording_path = /opt/CAPEv2/storage/guacrecordings
guest_width = 1280
guest_height = 1024
# rdp settings
guest_rdp_port = 3389

[packages]
# VM tags may be used to specify on which guest machines a sample should be run.
# NOTE - One of the following OS version tags MUST be included for Windows VMs:
# winxp, win7, win8, win10, win11.
# Some samples will only detonate on specific versions of Windows (see web.conf packages for more info).
# Example: MSIX - Windows >= 10
msix = win10,win11
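A quick way to sanity-check hand edits to these files is to run them through Python's configparser; a minimal sketch, assuming the default /opt/CAPEv2/conf layout (this verifies INI syntax only, not that CAPE accepts every value):

```python
# Minimal syntax check for edited CAPE conf files.
# The path is an assumption: adjust if your checkout lives elsewhere.
from configparser import ConfigParser, Error
from pathlib import Path

CONF_DIR = Path("/opt/CAPEv2/conf")

for name in ("reporting.conf", "web.conf"):
    parser = ConfigParser(strict=False)  # tolerate duplicate keys/sections
    try:
        parser.read(CONF_DIR / name)
        print(f"{name}: OK ({len(parser.sections())} sections)")
    except Error as err:
        print(f"{name}: parse error -> {err}")
```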

Thank you for clearing up my confusion. I read the documentation and it didn't say that we have to use Poetry to run this script; that's why I was executing it directly.


musman12362 commented 1 year ago

I just submitted a BARAKAT.xlsx file, but my analyzer suddenly stopped with the warning below and isn't giving me a proper result. Can you please help me out? I want to get a deep analysis of this file.

I'm also attaching the file here.

LOG

2023-05-25 06:32:33,042 [lib.cuckoo.core.resultserver] DEBUG: Task #11: Trying to upload file shots/0001.jpg
2023-05-25 06:32:33,122 [lib.cuckoo.core.resultserver] DEBUG: Task #11: Uploaded file shots/0001.jpg of length: 170347
2023-05-25 06:32:33,894 [lib.cuckoo.core.guest] WARNING: Task #11: Analysis caught an exception (id=win10, ip=192.168.122.100) You probably submitted the job with wrong package
2023-05-25 06:32:33,996 [lib.cuckoo.core.plugins] DEBUG: Stopped auxiliary module: Sniffer
2023-05-25 06:32:34,002 [lib.cuckoo.common.abstracts] DEBUG: Stopping machine win10
2023-05-25 06:32:34,004 [lib.cuckoo.common.abstracts] DEBUG: Getting status for win10
2023-05-25 06:32:36,824 [lib.cuckoo.common.abstracts] DEBUG: Getting status for win10
2023-05-25 06:32:36,838 [lib.cuckoo.core.resultserver] DEBUG: Task #11: Stopped tracking machine 192.168.122.100
2023-05-25 06:32:36,839 [lib.cuckoo.core.resultserver] DEBUG: Task #11: Cancel <Context for b'LOG'>
2023-05-25 06:32:36,866 [lib.cuckoo.core.scheduler] INFO: Disabled route 'internet'
2023-05-25 06:32:36,898 [lib.cuckoo.core.scheduler] DEBUG: Task #11: Released database task with status True
2023-05-25 06:32:36,899 [lib.cuckoo.core.scheduler] INFO: Task #11: analysis procedure completed

8c86faf2875b1b94de7948f687cb80cf50d71f93f3f23262f8203e7d1e141b43.zip

I downloaded this file from here:

https://app.docguard.io/8c86faf2875b1b94de7948f687cb80cf50d71f93f3f23262f8203e7d1e141b43/results/dashboard
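For what it's worth, the "You probably submitted the job with wrong package" warning in that log usually means the chosen analysis package could not handle the sample; for an .xlsx you can force the xls package when resubmitting. A hedged sketch using CAPE's REST API (the local host/port and the xls package name are assumptions; check your api.conf and packages list):

```python
# Resubmit a sample with an explicit analysis package via CAPE's REST API.
# URL and package name are illustrative assumptions; adjust to your instance.
import requests

API_URL = "http://127.0.0.1:8000/apiv2/tasks/create/file/"
SAMPLE = "BARAKAT.xlsx"

with open(SAMPLE, "rb") as f:
    resp = requests.post(API_URL, files={"file": (SAMPLE, f)}, data={"package": "xls"})
resp.raise_for_status()
print(resp.json())  # should contain the created task id(s)
```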

doomedraven commented 1 year ago

Check your analysis log inside storage/analyses/11/analysis.log, something like that.
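The per-task log lives under the storage directory; a minimal sketch to dump its tail for task 11 (the /opt/CAPEv2 path is an assumption based on the default install):

```python
# Print the last lines of a task's analysis.log; path assumes the default layout.
from pathlib import Path

log = Path("/opt/CAPEv2/storage/analyses/11/analysis.log")
if log.exists():
    print("".join(log.read_text(errors="replace").splitlines(keepends=True)[-40:]))
else:
    print(f"{log} not found; check the storage path and task id")
```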

musman12362 commented 1 year ago

Hey, I hope you are fine. I need your assistance. I have a file which is 100% confirmed malicious, but the issue is that this file is anti-sandbox: when I submit it for analysis, it detects that it is under analysis and does not let us run and analyze it.

Can you please guide me on a way to analyze this file? And can you please check this file in your sandbox and share the result? I'll be very thankful to you for your support.

Here is the file: 8c86faf2875b1b94de7948f687cb80cf50d71f93f3f23262f8203e7d1e141b43.zip

doomedraven commented 1 year ago

Use al-khaser for generic detection, but you must have the basics of malware analysis to see what it checks and understand how to defeat it.

doomedraven commented 1 year ago

It works just fine in capesandbox (https://capesandbox.com/analysis/396022/), so use al-khaser, and if that doesn't work, you need to do manual analysis.

musman12362 commented 1 year ago

I configured CAPE on my machine, but it doesn't show as much detail as your sandbox analysis: it detects just a single signature where your sandbox detects multiple, and mine doesn't provide a single summary detail (Screenshot from 2023-06-06 12-47-30). I'm attaching screenshots of my analysis report. I cannot share the PDF report because I'm also facing a "file not found" problem when generating the PDF report.

Should I share my configuration files with you?

Screenshot from 2023-06-06 12-43-19

Screenshot from 2023-06-06 12-42-50

doomedraven commented 1 year ago

Did you use utils/community.py?


musman12362 commented 1 year ago


No, I did not use that. How can I use it? Do I have to run it manually using Poetry?

doomedraven commented 1 year ago

poetry run python utils/community.py -waf. Read the docs.


musman12362 commented 1 year ago


Thank you for your support, let me try this.

musman12362 commented 1 year ago


I used community.py, but I still got the same result as mentioned above.

doomedraven commented 1 year ago

Well, I don't know: maybe you didn't restart processing, maybe your VM is bad; there are many possible reasons. You need to investigate on your side.
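One common gotcha: pulling community modules does not reload already-running workers, so a restart of processing is needed before new signatures show up. A sketch for a systemd-based install (the unit name cape-processor is an assumption; list your units with systemctl list-units 'cape*'):

```python
# Restart CAPE's processing service so freshly pulled community signatures load.
# "cape-processor" is an assumed unit name; verify on your host first.
import subprocess

subprocess.run(["sudo", "systemctl", "restart", "cape-processor"], check=True)
```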


musman12362 commented 1 year ago

Hey, I hope you are fine. I need a little information, please guide me. Q1: Which techniques (methods/modules/third-party integrations) does CAPE sandbox use for dynamic analysis? Q2: How does CAPE sandbox do static analysis, and which databases and techniques does it use for static malware analysis? I was searching for answers to these questions and already read the documentation, but I did not find a proper answer.

I will be very thankful to you if you can answer these questions.

doomedraven commented 1 year ago

Go over the configs, they are mostly self-explanatory, and read pyproject.toml to see the dependencies; that will give you almost everything for Q2.
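If reading pyproject.toml by hand is tedious, the declared dependencies can also be listed programmatically; a sketch assuming Python 3.11+ (for tomllib) and a Poetry-style [tool.poetry.dependencies] table:

```python
# List dependencies declared in CAPE's pyproject.toml.
# Requires Python 3.11+ for tomllib; the path is an assumption.
import tomllib

with open("/opt/CAPEv2/pyproject.toml", "rb") as f:
    data = tomllib.load(f)

deps = data.get("tool", {}).get("poetry", {}).get("dependencies", {})
for name, spec in sorted(deps.items()):
    print(f"{name}: {spec}")
```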

kevoreilly commented 1 year ago

Q1: capemon: https://github.com/kevoreilly/capemon

musman12362 commented 1 year ago

Go over the configs, they are mostly self-explanatory, and read pyproject.toml to see the dependencies; that will give you almost everything for Q2.

Thank you, I'll check it.

musman12362 commented 1 year ago

Q1: capemon: https://github.com/kevoreilly/capemon

Thanks for your support.

musman12362 commented 1 year ago

Capture

Hey, I hope you are fine. I want to generate a bingraph and I enabled it in reporting.conf, but when I click the bingraph button it just reloads the page and doesn't generate anything. What could be the reason?

reporting.conf

# https://github.com/geekscrapy/binGraph
# requires -> apt-get install python-tk
[bingraph]
enabled = yes
on_demand = yes
binary = yes
# generate bingraphs for cape/procdumps
cape = yes
procdump = yes
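Since on_demand = yes defers graph generation to the webgui click, a silent page reload often means the plotting dependencies are missing in the environment the web process runs under; a hedged import check (the module names below are assumptions based on the python-tk hint in the config comment):

```python
# Quick dependency probe for on-demand bingraph generation.
# Run inside the same Poetry environment as the web process.
# The module names are assumptions, not a definitive list.
import importlib

for mod in ("tkinter", "matplotlib"):
    try:
        importlib.import_module(mod)
        print(f"{mod}: OK")
    except ImportError as err:
        print(f"{mod}: MISSING ({err})")
```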

process log