aws-samples / devsecops-cicd

LambdaFunSecurityHubImport, PipelineKMSKey failed #3

Open usuryadevara opened 3 years ago

usuryadevara commented 3 years ago

Any help with this error? I have uploaded the lambda-functions/ folder to S3.

I get this error: Resource handler returned message: "Error occurred while GetObject. S3 Error Code: NoSuchKey. S3 Error Message: The specified key does not exist. (Service: Lambda, Status Code: 400, Request ID: d9cdc8c8-0555-40a4-8176-d39671a3de13, Extended Request ID: null)" (RequestToken: 81731847-62db-6927-59ba-1b63c17614ca, HandlerErrorCode: InvalidRequest) in LambdaFunSecurityHubImport

```
LambdaHandlerName    import_findings_security_hub.lambda_handler
LambdaPackageLoc     bucket-devsecops-poc
LambdaPackageS3Key   lambda-functions/import_findings_security_hub.zip
```

The following resource(s) failed to create: [LambdaFunSecurityHubImport, TrailBucketPolicy, PipelineKMSKey, CloudWatchPipelineEventRule]. Rollback requested by user.

Any help here?

usuryadevara commented 3 years ago

@manepals can you please help with this issue? We are trying to implement this end to end from the blog.

Issue 1: LambdaHandlerName = ? I entered import_findings_security_hub.lambda_handler; is that valid? LambdaPackageLoc = the S3 bucket name, where we have a lambda-functions/ folder containing the two files import_findings_security_hub.py and securityhub.py. LambdaPackageS3Key = lambda-functions/import_findings_security_hub.zip; does this file need to exist already? I assumed it would be generated under the lambda-functions folder in the bucket.

[screenshot]

Issue 2: I am running OWASP ZAP and SonarQube in containers and exposed their URLs with a load balancer:

```sh
docker pull owasp/zap2docker-stable
docker run -d -u zap -p 8080:8080 -i owasp/zap2docker-stable zap.sh -daemon -host 0.0.0.0 -port 8080 \
  -config api.addrs.addr.name=".*" -config api.addrs.addr.regex=true -config api.key="password123"
```

How can I test this API with curl (or something similar) from inside the container?
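(For reference, a minimal sketch of such a check, assuming the api.key value set above and whatever host/port the load balancer exposes, here shown as localhost:8080:)

```sh
# Ask ZAP for its version through the JSON API; a valid JSON response means the
# API key and the network path to the daemon are both working
curl "http://localhost:8080/JSON/core/view/version/?apikey=password123"
```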

SonarQube is running in Docker with an API token generated for the admin account (default login admin/admin):

```sh
docker run -d --name sonarqube -e SONAR_ES_BOOTSTRAP_CHECKS_DISABLE=true -p 9000:9000 sonarqube:latest
```

Test:

```sh
curl -u <token>: http://localhost:9000/api/components/search_projects
```
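(In case it is useful: the token can also be generated through the web API. A minimal sketch, assuming the default admin credentials and an arbitrary token name:)

```sh
# Create a user token named "ci-token" (the name is arbitrary); the response JSON
# contains the token value to pass to curl as "-u <token>:"
curl -u admin:admin -X POST "http://localhost:9000/api/user_tokens/generate?name=ci-token"
```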

Issue 3: Application URL for ApplicationURLForDASTScan: can this URL be any application, such as an Elastic Beanstalk dev or prod environment URL?

Please help us here with this POC.

manepals commented 3 years ago

Hi, sorry for the delay in responding. For the first question, where it is failing on the LambdaPackageS3Key location: did you create another key (folder) inside your S3 bucket named "lambda-functions"? It looks like you are hosting "import_findings_security_hub.zip" directly under "bucket-devsecops-poc", without a "lambda-functions" folder. If there is no folder, just adjust LambdaPackageS3Key to drop it.
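(A quick way to confirm which key actually exists, sketched with the bucket name from the parameters above:)

```sh
# List everything under the expected prefix; the value of LambdaPackageS3Key
# must match one of the listed keys exactly
aws s3 ls s3://bucket-devsecops-poc/lambda-functions/ --recursive
```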

One way you can access the OWASP ZAP URL is to put a proxy in front of it. You can use the same proxy for both OWASP ZAP and SonarQube for external access.

For the DAST application URL, you can use any externally available URL; it doesn't matter whether it is dev or prod. But you may want to avoid running these tests against a prod environment.

Also, to install SonarQube and OWASP ZAP on an EC2 instance, you can use the CloudFormation template https://github.com/aws-samples/devsecops-cicd/blob/main/workshop/templates/ec2-sonarqube-zap.yaml. I recently added this template, and it gives you the SonarQube and ZAP URL endpoints as outputs.
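(If it helps, a rough sketch of launching that template with the CLI; the stack name is arbitrary, CAPABILITY_IAM is assumed in case the template creates IAM resources, and any parameters the template requires, such as a key pair or VPC, would be passed with --parameter-overrides:)

```sh
# Deploy the workshop EC2 template that installs SonarQube and OWASP ZAP;
# the URL endpoints appear in the stack outputs once it completes
aws cloudformation deploy \
  --stack-name sonarqube-zap \
  --template-file ec2-sonarqube-zap.yaml \
  --capabilities CAPABILITY_IAM
```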

Hope this helps. Good luck with your POC.

usuryadevara commented 3 years ago

Thanks @manepals

Yeah, I created an S3 bucket folder lambda-functions, but I have those two files instead of a .zip: import_findings_security_hub.py and securityhub.py. If I have to zip them, are those the two files that need to be inside import_findings_security_hub.zip?

I will try that CF template and provision ZAP and SonarQube as well.

manepals commented 3 years ago

All it needs is the .zip file. I just kept the .py files there for reference; maybe I should delete them.
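(A minimal sketch of packaging and uploading, assuming the two files above and the bucket/key names from the pipeline parameters earlier in this thread:)

```sh
# Zip the two handler modules at the archive root so Lambda can resolve
# import_findings_security_hub.lambda_handler
zip import_findings_security_hub.zip import_findings_security_hub.py securityhub.py

# Upload to the key that the template's LambdaPackageS3Key parameter points at
aws s3 cp import_findings_security_hub.zip \
  s3://bucket-devsecops-poc/lambda-functions/import_findings_security_hub.zip
```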

usuryadevara commented 3 years ago

@manepals, I was able to deploy the pipeline, but I got a failure in the DAST analysis.

Here is what I have in my code repo:

[screenshot]

This was running for hours and eventually failed. I also noticed a 502 Bad Gateway on the Elastic Beanstalk URL, and the environment degraded after deploying the code. If I revert the application, it works again from the Beanstalk page and I can see the Welcome page.

[screenshot]

usuryadevara commented 3 years ago

@manepals, sorry to reach out again, but I wanted to check with you on the other limitations and issues I am facing with the Elastic Beanstalk PHP 8 application.

I committed these files to the CodeCommit repo:

[screenshot]

And I am facing issues with SCA and SAST:

[screenshot] [screenshot]

Any help would be appreciated; I have spent hours working through both the Elastic Beanstalk applications (PHP and Corretto).

manepals commented 3 years ago

What is the issue you are facing? Based on the screenshot above, you are receiving findings for both SCA and SAST in Security Hub, which tells me your SCA/SAST analysis succeeded and is working as expected. Is there anything specific you are looking for? Also, we released a workshop based on this solution: https://devsecops-cicd.workshop.aws/en/. You can explore it for additional instructions. The workshop is based on a Java app, not PHP.

usuryadevara commented 3 years ago

I was able to deploy the code just fine. I also tore down the old setup and followed https://devsecops-cicd.workshop.aws/en/. After I initially committed the code to CodeCommit, it got stuck at DASTAnalysis, now saying:

```
OWASP ZAP scan status is null
```

[screenshot]

manepals commented 3 years ago

Can you share a screenshot of the parameters you are passing to the pipeline?

usuryadevara commented 3 years ago

Here are the details, @manepals:

```
[Container] 2021/08/20 17:31:35 Running command stat=50;
while [ "$stat" != 100 ]; do
  stat=$(curl "$OwaspZapURL/JSON/ascan/view/status/?apikey=$OwaspZapApiKey&scanId=$scanid" | jq -r '.status');
  echo "OWASP ZAP scan status is $stat"
  echo "OWASP Zap analysis status is in progress...";
  sleep 5;
done
echo "OWASP Zap analysis status is completed...";

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100    52  100    52    0     0   5777      0 --:--:-- --:--:-- --:--:--  5777
OWASP ZAP scan status is null
```

[screenshot]

The Application URL is the same as the dev Elastic Beanstalk URL, http://***-env.eba-uhyvz2ph.us-east-1.elasticbeanstalk.com, and I am using the same default password/token from the EC2 OWASP ZAP and SonarQube CFT.

I also tried this from my local computer and I get this:

```
$ scanid=$(curl "http://ec2-3-84-xxx.xxx.compute-1.amazonaws.com/JSON/ascan/action/scan/?apikey=workshopzapkey&recurse=true&inScopeOnly=&scanPolicyName=&method=&postData=&contextId=&url=http://test-env.eba-uhyvz2ph.us-east-1.elasticbeanstalk.com/" | jq -r '.scan')
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100    67  100    67    0     0    514      0 --:--:-- --:--:-- --:--:--   515
$ echo "scan id is " $scanid
scan id is null
```
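(A further check, sketched with the same host and API key as above: run the request without the jq filter so ZAP's raw JSON response is visible; when ZAP rejects a request it typically returns an error body rather than a "scan" field.)

```sh
# Show ZAP's raw response instead of extracting ".scan"; an error such as a bad
# API key or a target URL that is not in ZAP's site tree will be visible here
curl -s "http://ec2-3-84-xxx.xxx.compute-1.amazonaws.com/JSON/ascan/action/scan/?apikey=workshopzapkey&recurse=true&url=http://test-env.eba-uhyvz2ph.us-east-1.elasticbeanstalk.com/"
```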

manepals commented 3 years ago

@usuryadevara, the Application URL looks good. The output of that command tells me that OWASP ZAP is not accepting the scan requests properly; that needs to be fixed. I am assuming you ran the OWASP ZAP spider scan first and used the same API token that you are using in the curl command. You can also try to initiate the scan from the OWASP ZAP UI directly and see whether that works as expected. ZAP also has APIs to view the findings; try those options and make sure ZAP is working fine before triggering the pipeline DAST scan.
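(A rough sketch of those checks, assuming the same API key and target used in the curl commands above; replace the placeholder <zap-host> with the ZAP endpoint:)

```sh
# Spider the target first so the URL is in ZAP's site tree (returns a scan id)
curl "http://<zap-host>/JSON/spider/action/scan/?apikey=workshopzapkey&url=http://test-env.eba-uhyvz2ph.us-east-1.elasticbeanstalk.com/"

# Poll the spider status (100 means finished)
curl "http://<zap-host>/JSON/spider/view/status/?apikey=workshopzapkey&scanId=0"

# List the alerts ZAP has recorded for the target so far
curl "http://<zap-host>/JSON/core/view/alerts/?apikey=workshopzapkey&baseurl=http://test-env.eba-uhyvz2ph.us-east-1.elasticbeanstalk.com/"
```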

sean-digitalfix commented 3 years ago

> All it needs is the .zip file. I just kept the .py files there for reference; maybe I should delete them.

Hi @manepals, I'm encountering the same problem. Perhaps you could help by answering a few questions about LambdaPackageS3Key:

  1. What is LambdaPackageS3Key? (please don't say "S3 Key for Lambda package object"; I have no idea what that means)
  2. Is it a CMK or AWS-managed KMS key?
  3. It looks like lambda-functions/import_findings_security_hub.zip is required to be in the S3 bucket. What is supposed to be in lambda-functions/import_findings_security_hub.zip?
  4. Are we meant to zip up an encryption key and put it in that location?
  5. If so, what kind of key do we use, and what is it for?
  6. Why would you delete those .py files, since they appear to be the Lambda scripts?

Any clarity would be appreciated, thanks!

sean-digitalfix commented 3 years ago

OK, I think I have figured it out. Please correct me if I'm wrong!

The term "Key" in "LambdaPackageS3Key" refers to the location of the object in the S3 bucket that contains the Lambda function code. The placeholder lambda-functions/import_findings_security_hub.zip should be a .zip file containing the two .py files.

As of this writing, the file does not exist in the repository at lambda-functions/import_findings_security_hub.zip; however, it does exist at workshop/lambda-functions/import_findings_security_hub.zip (and, as stated, contains the two .py files).

I had entirely missed the workshop directory.
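(For anyone else who misses it, a minimal sketch of uploading that prepackaged archive to the key the template expects; the bucket name is whatever you passed as LambdaPackageLoc, the one from earlier in this thread is used here as an example:)

```sh
# Copy the archive that ships in the workshop directory of the cloned repo
# to the LambdaPackageS3Key location in your bucket
aws s3 cp workshop/lambda-functions/import_findings_security_hub.zip \
  s3://bucket-devsecops-poc/lambda-functions/import_findings_security_hub.zip
```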

Side note: running codepipeline-template.yaml failed on the next step, PipelineKMSKey. But after discovering the workshop directory, I realised it also contains workshop/templates/devsecops-codepipeline.yaml, which differs from the codepipeline-template.yaml in the repository's root, notably in PipelineKMSKey (among other important statements). I will try that template next and see how it goes.

manepals commented 3 years ago

Hi @sean-digitalfix, thanks for the feedback. Yes, the Key in LambdaPackageS3Key refers to the object's location inside the S3 bucket, and it should point to the zip file.

There is a hands-on workshop based on this blog, which gives additional details and you could follow that if you prefer. Here is the link for the workshop. Feel free to use those templates.

https://devsecops-cicd.workshop.aws/en/