SCRT-HQ / VaporShell

A PowerShell module for building, packaging and deploying AWS CloudFormation templates
https://vaporshell.io
Apache License 2.0

Add-VSLambdaFunctionCode issue #30

Closed RegEM closed 6 years ago

RegEM commented 6 years ago

Hi Nate, hope things are going good!

I am having an issue with vsl or Add-VSLambdaFunctionCode. I get the following error:

```
VERBOSE: Getting TemplateBody from TemplateFile path
VERBOSE: Checking 'AWS::Lambda::Function' resource --- property 'Code'
vsl : Method invocation failed because [System.Management.Automation.PSCustomObject] does not contain a method named
'TrimStart'.
At line:1 char:1
+ vsl vaporize ...
```
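The same error can be reproduced in isolation; a hypothetical snippet just to show the failing call (not VaporShell code):

```powershell
# Hypothetical repro: a PSCustomObject does not have a TrimStart() method,
# so treating the Lambda 'Code' object like a string property throws.
$code = [PSCustomObject]@{ S3Bucket = 'thedeployment'; S3Key = 'myfunction.zip' }
$code.TrimStart('.')   # -> does not contain a method named 'TrimStart'
```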
I am generating the 'code' like this:
```powershell
$the_template = Initialize-VaporShell -Description "the template"

$assumetheRole = `
'{
    "Version":"2012-10-17",
    "Statement":[{
        "Effect": "Allow",
        "Principal": {
            "Service": "iot.amazonaws.com"
        },
        "Action": "sts:AssumeRole"
    }]
}'

$funcCode = Add-VSLambdaFunctionCode -S3Bucket (Add-FnRef "the_deployment") -S3Key "myfunction.zip"

$the_template.AddResource( `
   ( New-VSIAMRole -LogicalId "testRole" -AssumeRolePolicyDocument $assumetheRole ), `
   ( New-VSLambdaFunction -LogicalId "myfunction" `
                          -FunctionName "myfunction" `
                          -Code $funcCode `
                          -Handler "handler.auth" `
                          -Runtime "nodejs6.10" `
                          -Role (Add-FnRef "testRole") `
   ) `
)

$TemplateFile = ".\the-template.yml"
$the_template.ToYAML($TemplateFile)
vsl vaporize --tf $TemplateFile --sn the_deployment --caps iam,named --v --f --w
```

I tried passing in raw JSON for the 'code', but got the same error.

Hope you have a minute to have a look.

RegEM commented 6 years ago

I think I tracked it down to Invoke-VSPackage. I believe it is not handling the 'Code' property from a Lambda function declaration properly. A `$Resource.Properties.$propName` holding the following should be handled:

```
S3Bucket             S3Key
--------             -----
@{Ref=thedeployment} myfunction.zip
```

I could get it to generate a different error by assigning an "s3://..." string to 'code', but as far as I can see it should handle the case above.
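Something along these lines is what I mean; a rough sketch only, not the actual module code:

```powershell
# Rough sketch: string-valued properties can be trimmed directly, but the
# Lambda 'Code' property is an object carrying S3Bucket/S3Key members.
$propValue = $Resource.Properties.$propName
if ($propValue -is [string]) {
    $path = $propValue.TrimStart('.')   # safe: [string] has TrimStart()
}
elseif ($propName -eq 'Code') {
    $key = $propValue.S3Key             # handle the S3Key member instead
}
```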

scrthq commented 6 years ago

Apologies on the lag in response to this!!! I've been tied up with moving into a new house ☹️/😁... if you have a fix, please send over a PR with it added so I can check it out! Otherwise, I'll get on it as soon as I'm back in action!


RegEM commented 6 years ago

No worries. Congrats on the new digs. Hope the move goes smoothly.

Unfortunately the PowerShell in your modules is several levels ahead of my understanding.

I'm a bit surprised that section can handle all the different resource types. But then my knowledge of the AWS spec is pretty weak too.

Definitely would pass along any fixes I could come up with.

RegEM commented 6 years ago

Well, I managed to get my test template deployed. Far from an elegant solution though: I just added an extra block in the code to test for 'Code'. I don't imagine it will help you too much, but maybe seeing the changes I made will help ever so little.

I ran into plenty of my own declaration issues, so here too is my new PS to generate the template.

```powershell
$env:AWS_PROFILE = "PS1"
$the_template = Initialize-VaporShell -Description "the template"

$assumetheRole = `
'{
    "Version":"2012-10-17",
    "Statement":[{
        "Effect": "Allow",
        "Principal": {
            "Service": "lambda.amazonaws.com"
        },
        "Action": "sts:AssumeRole"
    }]
}'

#$funcCode = Add-VSLambdaFunctionCode -S3Bucket 'thedeployment' -S3Key 'myfunction.zip'

$the_template.AddResource( `
   ( New-VSIAMRole -LogicalId "testRole" -AssumeRolePolicyDocument $assumetheRole ), `
   ( New-VSLambdaFunction -LogicalId "myfunction" `
                          -FunctionName "myfunction" `
                          -Code (Add-VSLambdaFunctionCode -S3Bucket 'thedeployment' -S3Key 'myfunction.zip') `
                          -Handler "handler.auth" `
                          -Runtime "nodejs6.10" `
                          -Role (Add-FnGetAtt 'testRole' -AttributeName Arn) `
   ) `
)

$TemplateFile = ".\the-template.yml"

$the_template.ToYAML($TemplateFile)

vsl vaporize --tf $TemplateFile --sn thedeployment --caps iam,named --v --f --w
```
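(Relative to the first script, the declaration fixes were: the assume-role principal is now lambda.amazonaws.com rather than iot.amazonaws.com, and -Role is now passed the role ARN via Add-FnGetAtt instead of a plain Ref.)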

The change to 'Invoke-VSPackage.ps1':

```powershell
if ($propName -eq 'Code') {
    if (($Resource.Properties.$propName.S3Key -notlike "s3://*") -and ($Resource.Properties.$propName.S3Key -notlike "http*")) {
        $found = $true
        if (Test-Path ("$tempParent\$($Resource.Properties.$propName.S3Key.TrimStart('.'))")) {
            $filePath = (Resolve-Path "$tempParent\$($Resource.Properties.$propName.S3Key.TrimStart('.'))").Path
            Write-Verbose "File found in template directory at: $filePath"
        }
        elseif (Test-Path $Resource.Properties.$propName.S3Key) {
            $filePath = (Resolve-Path $Resource.Properties.$propName.S3Key).Path
            if ($filePath -like "$($pwd.Path)*") {
                Write-Verbose "File found in current working directory at: $filePath"
            }
            else {
                Write-Verbose "File found at: $filePath"
            }
        }
        else {
            $found = $false
        }
        if ($found) {
            $fileInfo = Get-Item $filePath
            if ($fileInfo.PSIsContainer) {
                if ($S3Prefix) {
                    $key = "$($S3Prefix)/$($fileInfo.BaseName).zip"
                }
                else {
                    $key = "$($fileInfo.BaseName).zip"
                }
                $outFile = Join-Path $fileInfo.Parent.FullName $key
                if (Test-Path $outFile) {
                    Remove-Item $outFile -Force
                }
                [System.IO.Compression.ZipFile]::CreateFromDirectory($filePath,$outFile)
            }
            else {
                if ($S3Prefix) {
                    $key = "$($S3Prefix)/$($fileInfo.Name)"
                }
                else {
                    $key = "$($fileInfo.Name)"
                }
                $outFile = $filePath
            }
            if ($Force) {
                Write-Verbose "Uploading object!"
                $obj = New-VSS3Object -Key $key -FilePath $outFile @s3Params @prof -Verbose:$false
            }
            else {
                Write-Verbose "Checking if object exists in bucket"
                $existsMeta = Get-VSS3ObjectMetadata -BucketName $baseUrl -Key $key -ErrorAction SilentlyContinue -Verbose:$false
                if (!$existsMeta) {
                    Write-Verbose "Object not found -- uploading!"
                    $obj = New-VSS3Object -Key $key -FilePath $outFile @s3Params @prof -Verbose:$false
                }
                elseif ($existsMeta.ContentLength -eq (Get-Item $outFile).Length) {
                    Write-Warning "Object '$key' already exists in bucket and is the same size. No action apparently necessary -- If this file needs to be reuploaded, re-run this command with the Force parameter included."
                    return
                }
                else {
                    Write-Warning "Object already exists at this location and Force parameter not used. No action taken to prevent accidental overwrites. -- If this object needs to be overwritten, re-run this command with the Force parameter included."
                    return
                }
            }
            #$Resource.Properties.$propName = (Add-VSLambdaFunctionCode -S3Bucket "s3://$baseUrl/$Resource.Properties.$propName.S3Bucket" -S3Key $Resource.Properties.$propName.S3Key)
            # "s3://$baseUrl/$key"
            # S3Bucket already ref'd
        }
        else {
            Write-Warning "$propName value '$($Resource.Properties.$propName)' does not appear to be an S3 URL but is also not locatable in the current working directory or the directory of the template (if provided). Please specify the full path of the local $propName to upload to the S3 bucket '$baseUrl'"
        }
    }
}
else {
    if (($Resource.Properties.$propName -notlike "s3://*") -and ($Resource.Properties.$propName -notlike "http*")) {
    # ...existing code unchanged from here
```
RegEM commented 6 years ago

Sorry for the formatting. Don't know how to deal with it.

scrthq commented 6 years ago

Hey @RegEM - Thanks for the spot! It looks like it was failing when using the AWS::Lambda::Function resource type, which indicates that you either have the code uploaded to S3 (if using the S3Bucket and S3Key params with Add-VSLambdaFunctionCode) or are providing the code directly on the CloudFormation template (if using the ZipFile param with Add-VSLambdaFunctionCode instead). That being said, the entire block that's causing you issues can be skipped if $propName -eq 'Code', I believe.

Can you try updating line 128 to the following and testing your deployment to confirm it works? If so, I'll push out a hotfix for this to account for it.

```powershell
if ($propName -and $Resource.Properties.$propName -and $propName -ne 'Code') {
```
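(Presumably the existing line lacks the `-and $propName -ne 'Code'` clause; adding it skips the local-artifact upload block entirely when the property is a Lambda Code object.)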
RegEM commented 6 years ago

Yes that seems to work fine. Thanks. I tried two templates.

I do have issues with the code not being assigned properly in the Lambda. As in, my handler is not found in the console when the first test is run. I then re-upload with a third-party utility (lambundaler) and the Lambda then works. I will make sure I am not causing the issue, but it may take a while. Your code is uploading OK.

This could easily be unrelated, but I was starting to look at the artifact uploading your code does, as I thought it also wasn't updating the Lambda like I would want. I will open a new issue if that also proves to be one.

fwiw, I was looking at this page for some guidance on the updating, https://aws.amazon.com/blogs/compute/new-deployment-options-for-aws-lambda/

RegEM commented 6 years ago

I see lambundaler uses lambda.createFunction. The zip file should be the same one whether it's used with lambundaler or with vsl.

scrthq commented 6 years ago

If you have your code locally, you could use New-SAMFunction instead and create a Serverless CloudFormation template using the AWS Serverless Application Model. That's mainly what the artifact uploading is for: to replicate the AWS CLI's package command: https://docs.aws.amazon.com/cli/latest/reference/cloudformation/package.html

A quick example on it for reference: https://vaporshell.io/docs/examples#api-backend

Another example from the Readme of this repo showing it in use with a local code folder as the CodeURI parameter which triggers the upload process and rebuilds the template with the actual S3 paths: https://github.com/scrthq/VaporShell#examples
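A minimal sketch of that pattern, with the logical ID and local path assumed for illustration (see the linked examples for authoritative usage):

```powershell
$template = Initialize-VaporShell -Description "SAM example"

# CodeUri points at a local folder; the package phase zips and uploads it,
# then rewrites the template with the resulting S3 location.
$template.AddResource( `
    ( New-SAMFunction -LogicalId "myfunction" `
                      -Handler "handler.auth" `
                      -Runtime "nodejs6.10" `
                      -CodeUri ".\src" ) `
)

$template.ToYAML(".\the-template.yml")
```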

RegEM commented 6 years ago

Thanks for the suggestions / tips. I have used the SAM functions too. I was just trying to replicate my original serverless template, and/or maybe I thought the CodeUri needed to come from a bucket, not a local path.

I am using New-SAMApi with a swagger file too. It generates an extra stage and I don't know why, but I'll get to that in due time.

Please close this out when ready. Cheers, Richard

scrthq commented 6 years ago

Gotcha! And that totally makes sense, I definitely want to ensure that all existing templates are also able to be recreated with VaporShell; if they aren't, that's a bug to me and not something that should constitute a behavior change on your part 😄

Keeping this open until I can push the fix out to the PS Gallery!

scrthq commented 6 years ago

@RegEM - Considering you don't technically have to "package" any local artifacts with your template example and are looking to simply deploy your template, you can use `vsl deploy ...` instead of `vsl vaporize ...`, i.e. `vsl deploy --tf $TemplateFile --sn the_deployment --caps iam,named --v --f --w`. This will skip the package phase, which is primarily for zipping and uploading local code artifacts and spitting out an updated template with S3 locations instead of local paths. Could you let me know if `vsl deploy ...` works for your use case as well?


I don't think skipping the block by adding `-and $propName -ne 'Code'` is the end-all solution, considering `aws package` itself is supposed to support AWS Lambda function Code with local artifacts as well. Personally, I'd rather work on understanding how `aws package` is expected to work with that resource type and reach feature parity than exclude the resource type from being able to upload local artifacts.

Again, keeping this open as it is a bug IMO that should be fixed, but I need to figure out how to fix it appropriately without breaking expected functionality.

RegEM commented 6 years ago

Sounds good. I had a suspicion it required some work. That's why I was looking at the 'deployment' link I posted earlier. I will likely change to New-SAMFunction for the time being.

I tried deploy as asked. It worked fine. Funny, I ran both my templates to the same stack with no errors; previously I couldn't. Not that big of a surprise really, but still nice to see.

It failed the first time I ran it, as there was no zip file in my bucket. As expected.

I see now my lambundaler doesn't quite work as I thought either. I tried setting it to load the zip file specifically to my folder, but ran into an AWS bug, so I need to revisit the uploading. I like the way it bundles, so I think I will continue to use it for that. Uploading is trivial anyway with Write-S3Object; I just need to make sure the code actually updates.
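For reference, that upload one-liner would look something like this (bucket and key names assumed from the template above):

```powershell
# Assumes the AWS Tools for PowerShell module and configured credentials.
Write-S3Object -BucketName 'thedeployment' -Key 'myfunction.zip' -File '.\myfunction.zip'
```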

I was going to check my swagger file for the API, but I see it is a 'DefinitionUri' with the s3: location.

scrthq commented 6 years ago

`vsl deploy`/`vsl vaporize` use Change Sets to do stack deployments, so if the stack already exists, it updates only what changed instead of attempting to deploy a full new stack (which would typically error because the stack already exists). That's why you weren't getting errors even though you were deploying to the same stack name 😃. If you open CloudFormation in the AWS Console and look at the stack, you should be able to see both change sets there as well.
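You can also list them from PowerShell; a quick sketch, assuming the AWS Tools for PowerShell are installed and the stack name matches your deployment:

```powershell
# Lists the change sets recorded against the stack.
Get-CFNChangeSetList -StackName 'thedeployment' |
    Select-Object ChangeSetName, ExecutionStatus, Status, CreationTime
```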

RegEM commented 6 years ago

Thanks again for the tips. I ran through it again.

I see now that the change set removed my function and a role from the first set. So not what I originally thought.

I see a 'DELETE_IN_PROGRESS' shown in the deploy output for the function, but not for the role. It might be nice if it was a different color, and/or there was some indication that a change set was applied. But it sure helps, now that I know what to look for.

RegEM commented 6 years ago

I see the 'DELETE_IN_PROGRESS' is actually a different color. Yellow I believe.

Just didn't register on my colorblind eyes. I use this utility called Visolve to help me with these things.

scrthq commented 6 years ago

@RegEM - yup! All of the "In Progress" events are typically yellow, failures red, and completions should be green if I remember correctly.

I noticed that you were using `--w` with vsl; do you find that the event coloring in the VaporShell Watch-Stack window is hard to distinguish? I'd be interested in adding a configuration piece to change output coloring for Watch-Stack if it's something of value!

RegEM commented 6 years ago

Thanks for the offer, but probably not too high on the priority list. I actually struggle to configure the colors effectively when I have the opportunity. I've found Visolve to be good, as I can just toggle the colors to an extreme, and then things jump out for me.

I see maybe 8% of men are colorblind in one form or another. Mine is fairly complicated. I used to say I'm not color blind, I'm just color ignorant. Ignorant as in indifferent.

For the current colors, I see the yellow & green to be pretty close to the same. The 'CREATE_COMPLETE' has a solid green background. All that green possibly over-saturates my cones, making it harder to see the yellow right next to it.

Usually, when possible, I set the greens to be darker and the yellows to be brighter. It's not perfect, but it can help. The next guy would be different again. Often I struggle with reds & greens too; that pair is usually not an issue for me on the console though. I struggle to find strawberries, etc. I probably run a few more red lights than some people.

I just love watching the deploy succeed.

I see there is some discussion about addressing this issue with the PS console. I also just ran into what may be quite a useful program for me: https://developer.paciellogroup.com/resources/contrastanalyser/. I was noticing how good the contrast for the labels is on GitHub. Maybe they follow guidelines similar to those suggested by the contrast analyser.

Thanks again for the offer. Happy to test anything you come up with. Back to my project... :-)

RegEM commented 6 years ago

So I've converted to using New-SAMFunction OK.

I struggled because the handler string was different than I expected. There was no error on the stack formation (there probably isn't one available), but of course I couldn't see the code in my Lambda console. I figured it out eventually.

Now just have to get a better handle on my permissions.