SecurityRiskAdvisors / VECTR

VECTR is a tool that facilitates tracking of your red and blue team testing activities to measure detection and prevention capabilities across different attack scenarios.

Test Case CreateTestCaseTemplateInput GraphQL mutation overwrite not functional #169

Closed ForrestYockey closed 2 years ago

ForrestYockey commented 2 years ago

Describe the bug A Test Case createTemplate GraphQL mutation request that succeeds with CreateTestCaseTemplateInput's "overwrite" set to true does not overwrite an existing Test Case template with the same template ID, as the documentation says it should.

To Reproduce Steps to reproduce the behavior:

  1. Make a GraphQL request to the Vectr GraphQL endpoint to create the initial Test Case template (with a nonstandard name, to avoid collision with an existing Test Case); a sketch of how these requests can be sent follows step 2:
{
    "query": "mutation ($input: CreateTestCaseTemplateInput!) { testCase { createTemplate(input: $input) { testCases {id, name} } } }",
    "variables": {
        "input": {
            "overwrite": true,
            "testCaseTemplateData": [
                {
                    "attackAutomation": {
                        "attackVariables": [
                            {
                                "inputName": "dump_path",
                                "inputValue": "$ENV:temp",
                                "type": "STRING"
                            },
                            {
                                "inputName": "target_hive",
                                "inputValue": "SAM",
                                "type": "STRING"
                            },
                            {
                                "inputName": "dumped_hive",
                                "inputValue": "myhive",
                                "type": "STRING"
                            }
                        ],
                        "cleanupCommand": "$toremove = #{dump_path} + \"\\\" + '#{dumped_hive}'\nrm $toremove -ErrorAction Ignore",
                        "cleanupExecutor": "INLINE_POWERSHELL",
                        "command": "write-host \"\"\n$shadowlist = get-wmiobject win32_shadowcopy\n$volumenumbers = foreach($shadowcopy in $shadowlist){$shadowcopy.DeviceObject[-1]}\n$maxvolume = ($volumenumbers | Sort-Object -Descending)[0]\n$shadowpath = \"\\\\?\\GLOBALROOT\\Device\\HarddiskVolumeShadowCopy\" + $maxvolume + \"\\Windows\\System32\\config\\#{target_hive}\"\n$mydump = #{dump_path} + '\\' + '#{dumped_hive}'\n[System.IO.File]::Copy($shadowpath , $mydump)\n",
                        "executor": "INLINE_POWERSHELL"
                    },
                    "description": "Dump hives from volume shadow copies with System.IO.File\n",
                    "detectionSteps": [
                        "Hash dumpers open the Security Accounts Manager (SAM) on the local file system (<code>%SystemRoot%/system32/config/SAM</code>) or create a dump of the Registry SAM key to access stored account password hashes. Some hash dumpers will open the local file system as a device and parse to the SAM table to avoid file access defenses. Others will make an in-memory copy of the SAM table before reading hashes. Detection of compromised [Valid Accounts](https://attack.mitre.org/techniques/T1078) in-use by adversaries may help as well."
                    ],
                    "name": "T1003.002 - Dump Volume Shadow Copy Hives With System.Io.File x",
                    "operatorGuidance": "Supported Platforms:\n\twindows\n\nInput Arguments:\n  [Path] dump_path\n    Description: Path where the hive will be dumped\n    (default value: \"$ENV:temp\")\n\n  [String] target_hive\n    Description: Hive you wish to dump\n    (default value: \"SAM\")\n\n  [String] dumped_hive\n    Description: Name of the dumped hive\n    (default value: \"myhive\")\n\nExecutor:\n\tpowershell\n\nElevation Required:\n\tFalse\n\nCommand:\n\nwrite-host \"\"\n$shadowlist = get-wmiobject win32_shadowcopy\n$volumenumbers = foreach($shadowcopy in $shadowlist){$shadowcopy.DeviceObject[-1]}\n$maxvolume = ($volumenumbers | Sort-Object -Descending)[0]\n$shadowpath = \"\\\\?\\GLOBALROOT\\Device\\HarddiskVolumeShadowCopy\" + $maxvolume + \"\\Windows\\System32\\config\\#{target_hive}\"\n$mydump = #{dump_path} + '\\' + '#{dumped_hive}'\n[System.IO.File]::Copy($shadowpath , $mydump)\n\n\nCleanup:\n\n$toremove = #{dump_path} + \"\\\" + '#{dumped_hive}'\nrm $toremove -ErrorAction Ignore\n\n",
                    "organization": "ART",
                    "phase": "Credential Access",
                    "preventionSteps": [
                        "### User Training\nTrain users to be aware of access or manipulation attempts by an adversary to reduce the risk of successful spearphishing, social engineering, and other techniques that involve user interaction. Limit credential overlap across accounts and systems by training users and administrators not to use the same password for multiple accounts.",
                        "### Privileged Account Management\nManage the creation, modification, use, and permissions associated to privileged accounts, including SYSTEM and root. Do not put user or admin domain accounts in the local administrator groups across systems unless they are tightly controlled, as this is often equivalent to having a local administrator account with the same password on all systems. Follow best practices for design and administration of an enterprise network to limit privileged account use across administrative tiers.",
                        "### Password Policies\nSet and enforce secure password policies for accounts. Ensure that local administrator accounts have complex, unique passwords across all systems on the network.",
                        "### Operating System Configuration\nMake configuration changes related to the operating system or a common feature of the operating system that result in system hardening against techniques. Consider disabling or restricting NTLM.(Citation: Microsoft Disable NTLM Nov 2012)"
                    ],
                    "references": [
                        "https://github.com/redcanaryco/atomic-red-team/blob/master/atomics/T1003.002/T1003.002.md",
                        "https://attack.mitre.org/versions/v10/techniques/T1003/",
                        "https://attack.mitre.org/versions/v10/techniques/T1003/002/"
                    ],
                    "technique": "Security Account Manager - T1003.002",
                    "templateId": "f0000000-0000-0000-0000-000000000001"
                }
            ]
        }
    }
}
  2. Make a second request with the same templateId (i.e., f0000000-0000-0000-0000-000000000001) and "overwrite": true, but with an updated "description" field (and a few other modified fields):
{
    "query": "mutation ($input: CreateTestCaseTemplateInput!) { testCase { createTemplate(input: $input) { testCases {id, name} } } }",
    "variables": {
        "input": {
            "overwrite": true,
            "testCaseTemplateData": [
                {
                    "attackAutomation": {
                        "attackVariables": [
                            {
                                "inputName": "dump_path",
                                "inputValue": "$ENV:temp",
                                "type": "STRING"
                            },
                            {
                                "inputName": "target_hive",
                                "inputValue": "SAM",
                                "type": "STRING"
                            },
                            {
                                "inputName": "dumped_hive",
                                "inputValue": "myhive",
                                "type": "STRING"
                            }
                        ],
                        "cleanupCommand": "$toremove = #{dump_path} + \"\\\" + '#{dumped_hive}'\nrm $toremove -ErrorAction Ignore",
                        "cleanupExecutor": "INLINE_POWERSHELL",
                        "command": "write-host \"\"\n$shadowlist = get-wmiobject win32_shadowcopy\n$volumenumbers = foreach($shadowcopy in $shadowlist){$shadowcopy.DeviceObject[-1]}\n$maxvolume = ($volumenumbers | Sort-Object -Descending)[0]\n$shadowpath = \"\\\\?\\GLOBALROOT\\Device\\HarddiskVolumeShadowCopy\" + $maxvolume + \"\\Windows\\System32\\config\\#{target_hive}\"\n$mydump = #{dump_path} + '\\' + '#{dumped_hive}'\n[System.IO.File]::Copy($shadowpath , $mydump)\n",
                        "executor": "INLINE_POWERSHELL"
                    },
                    "description": "xxxxxxDump hives from volume shadow copies with System.IO.File\n",
                    "detectionSteps": [
                        "asdfasdfasdfasdfHash dumpers open the Security Accounts Manager (SAM) on the local file system (<code>%SystemRoot%/system32/config/SAM</code>) or create a dump of the Registry SAM key to access stored account password hashes. Some hash dumpers will open the local file system as a device and parse to the SAM table to avoid file access defenses. Others will make an in-memory copy of the SAM table before reading hashes. Detection of compromised [Valid Accounts](https://attack.mitre.org/techniques/T1078) in-use by adversaries may help as well."
                    ],
                    "name": "T1003.002 - Dump Volume Shadow Copy Hives With System.Io.File x",
                    "operatorGuidance": "zzzzzzzzzzzSupported Platforms:\n\twindows\n\nInput Arguments:\n  [Path] dump_path\n    Description: Path where the hive will be dumped\n    (default value: \"$ENV:temp\")\n\n  [String] target_hive\n    Description: Hive you wish to dump\n    (default value: \"SAM\")\n\n  [String] dumped_hive\n    Description: Name of the dumped hive\n    (default value: \"myhive\")\n\nExecutor:\n\tpowershell\n\nElevation Required:\n\tFalse\n\nCommand:\n\nwrite-host \"\"\n$shadowlist = get-wmiobject win32_shadowcopy\n$volumenumbers = foreach($shadowcopy in $shadowlist){$shadowcopy.DeviceObject[-1]}\n$maxvolume = ($volumenumbers | Sort-Object -Descending)[0]\n$shadowpath = \"\\\\?\\GLOBALROOT\\Device\\HarddiskVolumeShadowCopy\" + $maxvolume + \"\\Windows\\System32\\config\\#{target_hive}\"\n$mydump = #{dump_path} + '\\' + '#{dumped_hive}'\n[System.IO.File]::Copy($shadowpath , $mydump)\n\n\nCleanup:\n\n$toremove = #{dump_path} + \"\\\" + '#{dumped_hive}'\nrm $toremove -ErrorAction Ignore\n\n",
                    "organization": "ART",
                    "phase": "Credential Access",
                    "preventionSteps": [
                        "### User Training\nTrain users to be aware of access or manipulation attempts by an adversary to reduce the risk of successful spearphishing, social engineering, and other techniques that involve user interaction. Limit credential overlap across accounts and systems by training users and administrators not to use the same password for multiple accounts.",
                        "### Privileged Account Management\nManage the creation, modification, use, and permissions associated to privileged accounts, including SYSTEM and root. Do not put user or admin domain accounts in the local administrator groups across systems unless they are tightly controlled, as this is often equivalent to having a local administrator account with the same password on all systems. Follow best practices for design and administration of an enterprise network to limit privileged account use across administrative tiers.",
                        "### Password Policies\nSet and enforce secure password policies for accounts. Ensure that local administrator accounts have complex, unique passwords across all systems on the network.",
                        "### Operating System Configuration\nMake configuration changes related to the operating system or a common feature of the operating system that result in system hardening against techniques. Consider disabling or restricting NTLM.(Citation: Microsoft Disable NTLM Nov 2012)"
                    ],
                    "references": [
                        "https://github.com/redcanaryco/atomic-red-team/blob/master/atomics/T1003.002/T1003.002.md",
                        "https://attack.mitre.org/versions/v10/techniques/T1003/",
                        "https://attack.mitre.org/versions/v10/techniques/T1003/002/"
                    ],
                    "technique": "Security Account Manager - T1003.002",
                    "templateId": "f0000000-0000-0000-0000-000000000001"
                }
            ]
        }
    }
}
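
For reference, a minimal sketch of how these payloads can be submitted from a script (the endpoint URL, Authorization header, and TLS handling below are placeholders/assumptions; adjust them for your VECTR deployment):

# Minimal sketch (assumptions noted): POST the createTemplate mutation with the
# Python "requests" library. VECTR_GRAPHQL_URL and API_TOKEN are placeholders,
# not documented values; use whatever endpoint and auth your deployment exposes.
import requests

VECTR_GRAPHQL_URL = "https://vectr.example.com/graphql"  # assumption: adjust for your instance
API_TOKEN = "REPLACE_ME"                                  # assumption: adjust for your auth setup

MUTATION = (
    "mutation ($input: CreateTestCaseTemplateInput!) { testCase { "
    "createTemplate(input: $input) { testCases { id, name } } } }"
)

def create_template(test_case_template_data, overwrite=True):
    # Build the same payload shape as the two requests above.
    payload = {
        "query": MUTATION,
        "variables": {
            "input": {
                "overwrite": overwrite,
                "testCaseTemplateData": test_case_template_data,
            }
        },
    }
    resp = requests.post(
        VECTR_GRAPHQL_URL,
        headers={"Authorization": API_TOKEN},
        json=payload,
        verify=False,  # assumption: lab instance with a self-signed certificate
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

Calling create_template(...) twice with the same templateId, the second time with a modified description, reproduces the behavior described in this issue.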

Expected behavior After submitting the second, modified Test Case template request, the modified fields from the GraphQL request should appear on the Test Case in the Vectr web interface (e.g., the description should read "xxxxxxDump hives from..." after the overwrite request).

Screenshots Screenshot of initial Test Case template: [screenshot]

Screenshot of Test Case template after sending the modified overwrite request and reloading the interface/restarting the server (nothing has changed): [screenshot]

The response from the server for the second overwrite request: [screenshot]

Additional context As with my other issue, my queries could be flawed, but the server probably should not return a 200 response for the overwrite request if the overwrite is not actually happening or if the query is wrong. According to the documentation (https://docs.vectr.io/graphql/schema/createtestcasetemplateinput.doc.html):

# A flag to specify if the create template operation should fail on template ID
# collision or overwrite an existing Test Case template with the same template ID.
overwrite: [Boolean]
thebleucheese commented 2 years ago

Forrest, thank you for the detailed report! I'll take a look at both of these and get back to you here.

thebleucheese commented 2 years ago

This is a bug, but there's a work-around. Try changing your technique to just the TID:

"technique": "Security Account Manager - T1003.002", to "technique": "T1003.002",

It looks like there's a kind of validation and technique inference issue. I had to tweak and save the template in the UI after I created one using the first technique string with the name and TID. Then I was able to save over it using the UI and API.

Edit: I found another issue: it also doesn't seem to overwrite unless you change the name. There's some complex behavior here; we need to map out all the dependencies on names when creating data. We'll fix that as well.

ForrestYockey commented 2 years ago

I couldn't get the work-around to work, though I suppose my underlying use case complicates things. I am working on an automated way to import the remaining 900+ Atomic Red Team tests (and to update existing/new ones) as fully populated Test Cases in Vectr; the example GraphQL queries above come from a script I wrote that populates the missing 900+ Test Cases. (I just read your update; that sounds great.) Of course, I do not expect or want my or Red Canary's goals to influence SRA's/Vectr's development goals, though I think Red Canary's Community Engagement team and others would support collaboration on better Atomic Red Team integration, if it is within SRA's interests.
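
For illustration only, here is a rough sketch of the kind of ART-to-VECTR mapping involved (this is not my actual script; the YAML keys follow the public atomics/<TID>/<TID>.yaml layout and the VECTR fields mirror the payloads above, so treat the details as assumptions):

# Rough sketch, not the actual import script: convert one Atomic Red Team
# technique YAML into the testCaseTemplateData shape used in the payloads above.
# The ART keys (attack_technique, atomic_tests, executor, input_arguments) and
# the VECTR field mapping are assumptions based on the public atomics layout
# and the example requests in this issue.
import yaml  # pip install pyyaml

def atomic_to_template_data(path, phase="Credential Access"):
    with open(path) as f:
        doc = yaml.safe_load(f)

    tid = doc["attack_technique"]  # e.g. "T1003.002"
    templates = []
    for test in doc.get("atomic_tests", []):
        executor = test.get("executor", {})
        input_args = test.get("input_arguments", {})
        templates.append({
            "name": f"{tid} - {test['name']}",
            "description": test.get("description", ""),
            "phase": phase,                       # assumption: tactic looked up per technique
            "technique": tid,                     # work-around from this thread: TID only
            "organization": "ART",
            "attackAutomation": {
                "executor": "INLINE_POWERSHELL",  # assumption: map ART executor names to VECTR enums
                "command": executor.get("command", ""),
                "cleanupExecutor": "INLINE_POWERSHELL",
                "cleanupCommand": executor.get("cleanup_command", ""),
                "attackVariables": [
                    {"inputName": arg_name,
                     "inputValue": str(arg.get("default", "")),
                     "type": "STRING"}
                    for arg_name, arg in input_args.items()
                ],
            },
            "references": [f"https://attack.mitre.org/techniques/{tid.replace('.', '/')}/"],
            # A stable templateId per test (e.g., derived from the ART test GUID)
            # would be needed for overwrite to target the same template on re-runs.
        })
    return templates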

thebleucheese commented 2 years ago

@ForrestYockey Just a note: VECTR directly supports importing the index.yaml file from ART in the UI, under the Administration -> Import Data section. You can drag/drop the latest ART index there, and it will let you import what you want, including importing and updating previous imports of the data, adding automation variables, etc.

If there are any additions or changes to that process you'd like to see, we can try to address them. We haven't had a chance to update what's shipped with VECTR yet. We started on a database migration to address this, but it's complicated to keep historical mappings for existing data from before ART had GUIDs, so it will likely be a while before we get to it.

Of course, you're welcome to use the API too, it just may be easier to use the UI.

ForrestYockey commented 2 years ago

Oh yeah, importing the index.yaml works great for getting the base information in there. There are other pieces of data (e.g., Detections, Preventions, Attacker Tools, other References, enhanced Operator Notes, threat-actor-based Tags, additional insights) that I wanted to automatically generate and add to that base data, to help out some of my compatriots who aren't so Red Team focused. No rush on anything! Migrating a backend is a huge pain.

thebleucheese commented 2 years ago

Sounds great. I have a fix for this in testing right now. We'll soon have update() commands in place as well, so you won't have to overwrite existing content.

thebleucheese commented 2 years ago

Fixed underlying bugs in 8.3.1