intrigueio / intrigue-core

Discover Your Attack Surface!
https://core.intrigue.io

Amazon S3 Task Rework #354

Closed m-q-t closed 3 years ago

m-q-t commented 3 years ago

Hi team,

Please find in this PR a complete rework of the S3 tasks. Most of the tasks are new, while some older ones have been rewritten to support the new changes.

lib/entities/aws_s3_bucket.rb

lib/tasks/helpers/aws.rb

lib/tasks/enrich/aws_s3_bucket.rb

lib/tasks/aws_s3_find_listable_objects.rb

lib/tasks/aws_s3_bruteforce_objects.rb

lib/tasks/aws_s3_gather_buckets.rb

lib/tasks/aws_s3_put_file.rb

lib/tasks/aws_s3_bruteforce_buckets.rb

The following workflow has been updated to utilize the new tasks:

Finally, the following existing tasks that created S3 entities were modified to support the new changes. Note: this was very seamless thanks to the extract_bucket_name_from_uri helper method.
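For illustration, here is a minimal sketch of what a helper like extract_bucket_name_from_uri might do; the method name comes from this PR, but the body below is an assumption, not the actual code in lib/tasks/helpers/aws.rb:

```ruby
# Hypothetical sketch only: pull the bucket name out of the common S3 URI shapes.
def extract_bucket_name_from_uri(uri)
  case uri
  when %r{\Ahttps?://([^/]+?)\.s3[.-][^/]*amazonaws\.com}i
    Regexp.last_match(1) # virtual-hosted style: https://bucket.s3.amazonaws.com/key
  when %r{\Ahttps?://s3[.-][^/]*amazonaws\.com/([^/?]+)}i
    Regexp.last_match(1) # path style: https://s3.amazonaws.com/bucket/key
  when %r{\As3://([^/]+)}i
    Regexp.last_match(1) # s3:// scheme: s3://bucket/key
  end
end

extract_bucket_name_from_uri("https://my-bucket.s3.amazonaws.com/backup.sql") # => "my-bucket"
```

A helper like this would let any task that builds an AwsS3Bucket entity from a URI do so with one call, which is presumably why the existing tasks were easy to update.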

Best regards, Maxim

m-q-t commented 3 years ago

A few screenshots:

(screenshots attached)

jcran commented 3 years ago

re: breaking hosted service - check the hosted-only workflow files for any mention of these tasks. That's the only interaction I'm aware of that could cause issues.

m-q-t commented 3 years ago

@shpendk

  1. I see you deleted the task search_wayback_machine.rb. Is that on purpose and if so, why?

Sorry, that was an accident on my end. I accidentally started working on another set of tasks without realizing I was in the wrong branch. I'll add back the original search_wayback_machine.rb task so nothing actually gets deleted. Thanks for catching that.

  2. Why did you not include aws_s3_bruteforce_objects in the list of tasks for AwsS3Bucket entities in the workflow file?

The reason is that the aws_s3_find_listable_objects task calls the aws_s3_bruteforce_objects task at the end if any listable objects are found, passing in those objects. I figured this would be better than having the workflow call it an additional time with the "100 Common Objects" wordlist. If you feel it should also be called an additional time, please let me know and I'll change it.
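Purely as a sketch of that hand-off (the accessor, helper, and task-invocation names below are assumptions for illustration, not the actual intrigue-core API):

```ruby
# Illustrative sketch of the chaining described above; method names are assumed.
def run
  bucket_entity = _get_entity                      # assumed accessor for the AwsS3Bucket entity
  listable = find_listable_objects(bucket_entity)  # assumed helper returning listable object keys

  return if listable.empty?

  # Hand the discovered keys straight to the bruteforce task rather than
  # having the workflow fire it again with the "100 Common Objects" wordlist.
  start_task("aws_s3_bruteforce_objects", bucket_entity, "objects" => listable)
end
```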

re: breaking hosted service - check the hosted-only workflow files for any mention of these tasks. That's the only interaction I'm aware of that could cause issues.

I'll go through the hosted-only workflows and update them as well; however, they are not in this repository, so that will be a separate PR.

Thanks for reviewing.

shpendk commented 3 years ago

@m-q-t regarding running aws_s3_bruteforce_objects, yes, let's run it for every AWS S3 bucket. If aws_s3_find_listable_objects doesn't find anything, we still want to bruteforce files.

I think after this final change we're good to merge. Really nice work man.

m-q-t commented 3 years ago

Appreciate the kind words @shpendk

The workflow has been updated.
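For reference, the resulting mapping might look roughly like this; a sketch assuming the workflow simply maps entity types to task names, which may not match the actual workflow file format in the repo:

```ruby
# Sketch only; the real workflow definition format may differ.
WORKFLOW_TASKS = {
  "AwsS3Bucket" => [
    "aws_s3_find_listable_objects",
    "aws_s3_bruteforce_objects" # now runs unconditionally, so files are bruteforced even when nothing is listable
  ]
}
```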