aws / aws-extensions-for-dotnet-cli

Extensions to the dotnet CLI to simplify the process of building and publishing .NET Core applications to AWS services
Apache License 2.0

3.1.0 of Amazon.Lambda.Tools with package-ci no longer publishes build output to S3 folder #41

Closed ThomasJaeger closed 5 years ago

ThomasJaeger commented 5 years ago

I'm trying to get my Lambda function published after a successful build using the latest Amazon.Lambda.Tools v3.1.0.

Running the command dotnet lambda package-ci --s3-bucket altosignal-lambda-functions --template pipeline.yml --output-template serverless.yaml locally or in my buildspec.yml file (for CodeBuild) only produces the serverless.yaml file, and its S3Key is not updated; it stays the same as in the original template.

The S3 bucket altosignal-lambda-functions is empty; I would have assumed that package-ci would run the publish step. Instead, all I get is this:

 Processing CloudFormation resource lamCreateAccountFromApi
 Writing updated template: serverless.yaml

So it never published anything to the S3 bucket altosignal-lambda-functions. My Lambda CloudFormation template looks like this:

lamCreateAccountFromApi:
    Type: AWS::Lambda::Function
    Properties:
        FunctionName: !Ref RepoName
        Handler: !Sub ${RepoName}::${RepoName}.Function::FunctionHandler
        Role:
            Fn::ImportValue:
                !Sub ${Env}-${AWS::Region}-global-iamprocesscommandsfromapigateway-arn
        Code:
            S3Bucket: altosignal-lambda-functions
            # S3Key: !Sub ${RepoName}.zip
            S3Key: CreateAccountFromApi.zip
            # ZipFile: CreateAccountFromApi.zip
        Runtime: dotnetcore2.1
        MemorySize: 128
        Timeout: 30

When I comment out S3Bucket and S3Key and use ZipFile instead, package-ci does enter publish mode and uploads the build to the S3 bucket. However, this does not work with CloudFormation change set detection, and CloudFormation stops the CodePipeline with the error:

 Exactly one of S3Bucket and ZipFile must be specified

Based on the CloudFormation documentation here, you cannot specify only a ZipFile for dotnetcore2.1, so you must specify an S3Bucket and S3Key for dotnetcore2.1 projects. Also, based on the CloudFormation documentation here, the Code property is required.

As it stands, I can no longer publish using package-ci. Please help!

ThomasJaeger commented 5 years ago

I can't be the only one experiencing this, can I? Any help would be appreciated.

normj commented 5 years ago

Taking a look

normj commented 5 years ago

The tool decides not to package and upload the code to S3 if a bucket is already specified for the Lambda function. The reasoning is that it assumes you want to create the Lambda function from code that has already been uploaded.

If your zip file is local and you want it uploaded to S3, then set only the S3Key property to the local path and leave S3Bucket blank or omit the field. When the tool runs it will see the file is local, upload it, and then update the S3Bucket and S3Key properties appropriately.
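For example, roughly (a sketch with placeholder resource and bucket names; the second block is what the written-out template would contain after the tool uploads the zip):

    # Input template: only S3Key is set, pointing at a zip file on disk
    MyFunction:
        Type: AWS::Lambda::Function
        Properties:
            Code:
                S3Key: MyFunction.zip

    # Template written out by the tool: bucket and key filled in after the upload
    MyFunction:
        Type: AWS::Lambda::Function
        Properties:
            Code:
                S3Bucket: my-deployment-bucket
                S3Key: <key generated by the tool for the uploaded zip>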

I have never heard of using the ZipFile property, and it is something this tool would ignore. Is this something you read about in another tool?

I'm working on a blog post to better explain the new 3.1.0 feature, to go out with a new version of the VS Toolkit that will have this new code.

ThomasJaeger commented 5 years ago

Thank you for looking into this. I'm trying to make this work with CodePipeline and so far, it's not possible.

My CodePipeline consists of Source, Build, and Deploy stages: Source is CodeCommit, Build is CodeBuild, and Deploy is CreateChangeSet and ExecuteChangeSet. Since CodeBuild succeeds even though Amazon.Lambda.Tools never created the updated, compiled Lambda zip artifact, CodePipeline continues with CreateChangeSet. Of course, that change set has no changes in it, since the serverless.yaml file still has the old S3Key in it.
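For reference, the Deploy stage in the pipeline definition looks roughly like this (a sketch with placeholder stack and artifact names; RoleArn and other required settings are omitted). The important part is that TemplatePath points at the serverless.yaml produced by the Build stage:

    - Name: Deploy
      Actions:
          - Name: CreateChangeSet
            ActionTypeId:
                Category: Deploy
                Owner: AWS
                Provider: CloudFormation
                Version: "1"
            InputArtifacts:
                - Name: BuildOutput
            Configuration:
                ActionMode: CHANGE_SET_REPLACE
                StackName: create-account-from-api
                ChangeSetName: pipeline-changes
                TemplatePath: BuildOutput::serverless.yaml
            RunOrder: 1
          # ExecuteChangeSet is a second CloudFormation deploy action with
          # ActionMode: CHANGE_SET_EXECUTE and the same stack and change set names.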

As I mentioned in my post, you cannot leave S3Bucket empty; that will fail the CloudFormation change set creation. CloudFormation will error out because it requires that both S3Bucket and S3Key be set. Keep in mind that this is all executed during the CodePipeline flow; nothing is local here on my dev machine.

ZipFile is part of CloudFormation's Code property. The CloudFormation documentation mentions ZipFile.

normj commented 5 years ago

In the input template S3Bucket should be empty, and the template written out by Amazon.Lambda.Tools will have S3Bucket filled in. The template that Amazon.Lambda.Tools writes is the one you send to CloudFormation.

Looking at your command and the names of your files, I wonder if you have your parameters backwards. I assume serverless.yaml is the file you wrote and pipeline.yml is the one you want to send down the pipeline. If that is the case then you are using them for the wrong parameters: serverless.yaml should be for --template and pipeline.yml should be for --output-template.
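To be explicit about which file goes with which option (placeholder names):

    # --template:        the template you author and keep in source control
    # --output-template: the copy the tool writes, with the code uploaded and the
    #                    S3 fields filled in; this is the file you hand to CloudFormation
    dotnet lambda package-ci --s3-bucket <bucket> --template <authored>.yml --output-template <generated>.yaml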

ThomasJaeger commented 5 years ago

pipeline.yml is the file I wrote and serverless.yaml is the output template generated by Amazon.Lambda.Tools.

I will remove the S3Bucket in pipeline.yml. This file is part of the lambda project and committed to CodeCommit. I will go ahead and try this right now. Will report back. Thank you for the tip.

ThomasJaeger commented 5 years ago

Unfortunately, it still does not work. Amazon.Lambda.Tools exits with error code 255 and CodeBuild fails. I tried both an empty S3Bucket field and commenting out the S3Bucket line in the Code property.

I'm getting the following error now:

Initiate packaging of CreateAccountFromApi.zip for resource lamCreateAccountFromApi 
Directory that the field lamCreateAccountFromApi/Code is pointing doesn't exist 
[Container] 2018/10/26 17:14:45 Command did not exit successfully dotnet lambda package-ci --s3-bucket altosignal-lambda-functions --template pipeline.yml --output-template serverless.yaml exit status 255 
[Container] 2018/10/26 17:14:45 Phase complete: BUILD Success: false 
[Container] 2018/10/26 17:14:45 Phase context status code: COMMAND_EXECUTION_ERROR Message: Error while executing command: dotnet lambda package-ci --s3-bucket altosignal-lambda-functions --template pipeline.yml --output-template serverless.yaml. Reason: exit status 255 

pipeline.yml included the following (among other resources):

    lamCreateAccountFromApi:
        Type: AWS::Lambda::Function
        Properties:
            FunctionName: CreateAccountFromApi
            Handler: !Sub ${RepoName}::${RepoName}.Function::FunctionHandler
            Role:
                Fn::ImportValue:
                    !Sub ${Env}-${AWS::Region}-global-iamprocesscommandsfromapigateway-arn
            Code:
                # S3Bucket: altosignal-lambda-functions
                S3Bucket:
                S3Key: CreateAccountFromApi.zip
            Runtime: dotnetcore2.1
            MemorySize: 128
            Timeout: 30

Here is my buildspec.yml file, just in case:

# Used by CodeBuild

version: 0.2

phases:
    install:
        commands:
            # - pip install --upgrade awscli
            - export PATH="$PATH:/root/.dotnet/tools"
            - dotnet tool install -g Amazon.Lambda.Tools
            # - dotnet tool update -g Amazon.Lambda.Tools   # Update the tools to the latest verion first
    pre_build:
        commands:
            - dotnet restore
    build:
        commands:
            - cd CreateAccountFromApi
            - dotnet lambda package-ci --s3-bucket altosignal-lambda-functions --template pipeline.yml --output-template serverless.yaml

artifacts:
    discard-paths: yes # zip files won't have any folders included
    files:
        # - 'CreateAccountFromApi/bin/Release/netcoreapp2.1/publish/*'
        - ./CreateAccountFromApi/serverless.yaml
    name: CreateAccountFromApi.zip

normj commented 5 years ago

That error is complaining about the tool's inability to find CreateAccountFromApi.zip. Does that file exist in the same directory as pipeline.yml? If you specify a zip file, the assumption is that you have already created it before running Amazon.Lambda.Tools and you just want the tool to upload it. If you are expecting the tool to build the current directory for this Lambda function, then leave S3Key blank.
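Roughly, the two cases look like this (trimmed to just the Code property):

    # Case 1: you already built CreateAccountFromApi.zip yourself and just want the
    # tool to upload it; the zip must exist next to pipeline.yml
    Code:
        S3Bucket:
        S3Key: CreateAccountFromApi.zip

    # Case 2: you want the tool to build and package the current project itself
    Code:
        S3Bucket:
        S3Key: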

ThomasJaeger commented 5 years ago

Thank you!!!

It finally works. I got a fully working CodePipeline, very cool. I should create a course and teach others; there are so many things one has to watch out for to make this work, but once it's working, it's super cool.

Thank you again for helping and all your efforts.

normj commented 5 years ago

No problem. Can I ask what the final solution was?

ThomasJaeger commented 5 years ago

Sorry, I got so excited! :-)

I made the S3Key empty as well, so both S3Bucket and S3Key are empty.
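So the Code section in pipeline.yml now just looks like this:

            Code:
                S3Bucket:
                S3Key: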

I will still have to figure out how to divide the pipeline.yml file so that it only contains the AWS::Lambda::Function portion. This file also contains AWS::ApiGateway::Resource and other resources pertaining to this single Lambda. I already have a separate infrastructure.yml CloudFormation file that exports many shared resources like IAM roles etc.

The problem is that when I create the stack with CloudFormation for this single Lambda function, I no longer have S3Bucket and S3Key filled in. I may end up using two separate yml files, one for the stack and one for Lambda function updates.