GoogleCloudPlatform / deploymentmanager-samples

Deployment Manager samples and templates.
Apache License 2.0

Cloud functions python template error uploading functions #370

Open dinigo opened 5 years ago

dinigo commented 5 years ago

When using any of the provided examples to deploy functions, the Python template inlines all of the code (compressed and base64-encoded) into a single command, as shown here:

https://github.com/GoogleCloudPlatform/deploymentmanager-samples/blob/bc2979c9a55c1d65b2626248426a5d29f243d5b4/examples/v2/cloud_functions/python/cloud_function.py#L38

And also here

https://github.com/GoogleCloudPlatform/deploymentmanager-samples/blob/bc2979c9a55c1d65b2626248426a5d29f243d5b4/community/cloud-foundation/templates/cloud_function/upload.py#L79-L80
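
For reference, the inlining in those templates boils down to something like this (a simplified sketch, not the exact code from the linked files; the function name and variables are illustrative):

    # Simplified sketch: zip the function sources in memory, base64-encode the
    # archive, and inline the whole payload into one shell command that a
    # Cloud Build step will run.
    import base64
    import io
    import zipfile

    def build_upload_cmd(files, archive_path='/function/function.zip'):
        """files: dict mapping file name -> file content (str)."""
        buf = io.BytesIO()
        with zipfile.ZipFile(buf, 'w', zipfile.ZIP_DEFLATED) as archive:
            for name, content in files.items():
                archive.writestr(name, content)
        encoded = base64.b64encode(buf.getvalue()).decode('ascii')
        # The entire payload ends up in a single `args` entry of the build step,
        # which is what hits the length limit described below.
        return "echo '%s' | base64 -d > %s" % (encoded, archive_path)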

This works for the tiny three-line functions provided, but for anything larger (even a main.py of fewer than 100 LOC) it produces the following error:

- code: RESOURCE_ERROR
  location: /deployments/my-deployment/resources/upload-function-code
  message: '{"ResourceType":"gcp-types/cloudbuild-v1:cloudbuild.projects.builds.create","ResourceErrorCode":"400","ResourceErrorMessage":{"code":400,"message":"invalid
    build: invalid .steps field: build step 0 arg 2 too long (max: 4000)","status":"INVALID_ARGUMENT","statusMessage":"Bad
    Request","requestPath":"https://cloudbuild.googleapis.com/v1/projects/my-project/builds","httpMethod":"POST"}}'

Which means that the cmd variable linked above ends up longer than 4,000 characters.
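
Until there is a better mechanism, the template could at least fail fast with a clearer message. A minimal sketch, assuming `cmd` is the command string the template assembles (the 4000-character figure comes straight from the error above):

    MAX_BUILD_ARG_LEN = 4000  # limit reported by the Cloud Build error above

    # `cmd` is a hypothetical name for the inlined shell command built by the template.
    if len(cmd) > MAX_BUILD_ARG_LEN:
        raise ValueError(
            'Inlined function code is %d characters; Cloud Build limits a step '
            'argument to %d characters, so this function is too large to inline.'
            % (len(cmd), MAX_BUILD_ARG_LEN))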

Deploying extra infrastructure to copy files, as suggested here https://github.com/GoogleCloudPlatform/deploymentmanager-samples/issues/40#issuecomment-339759328, adds a lot of development overhead and makes the deployment unnecessarily complex. You would have to write:

  1. a Python template to deploy small functions (as mentioned above, big functions won't fit),
  2. a Cloud Function to copy the files,
  3. a function-call resource to invoke that function with the inline text for each file (this can turn into a composite type or another template),
  4. a Cloud Build pipeline resource to zip the files in the bucket,
  5. and finally a Cloud Function resource to deploy the compressed code.

On the other hand, I could copy the files by downloading them from the repository itself, as suggested here https://stackoverflow.com/a/49751362, but that makes the deployment GitLab-dependent. I would need a different template for each type of repository (plus a Docker image providing the zip utility, yet another dependency).

resources:
  - name: my-build
    action: gcp-types/cloudbuild-v1:cloudbuild.projects.builds.create
    metadata:
      runtimePolicy:
      - CREATE
    properties:
      steps:
      - name: gcr.io/cloud-builders/git
        args: ['clone', 'https://gitlab-ci-token:{{ properties["GITLAB_CI_TOKEN"] }}@gitlab.com/myuser/myrepo']
      - name: gcr.io/{{ env['project'] }}/zip
        args: ['-r', 'function.zip', 'function_folder']
      - name: gcr.io/cloud-builders/gsutil
        args: ['cp', 'function.zip', 'gs://my-bucket/']
      timeout: 120s

Could there be an alternative for moving files with no extra template/config files?

vendablefall commented 5 years ago

+1 would be nice to be able to upload complex functions without having to craft my own build process :-)

dinigo commented 5 years ago

I know why this is designed the way it is: every deploy should be easy to reproduce without depending on third-party code, and you don't want to fail because of someone else's bug. The configuration should be as static as possible.

It's just not useful when the whole push of the cloud is to migrate towards serverless / microservice-based solutions.

The simplest and coolest option would be to provide an "archive_file" resource, just as Terraform does (see the sketch below).

(BTW, I find DM super useful. I just want it to cover the standard needs on its own, without depending on any other deployment/orchestration software.)
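
To make the archive_file idea concrete, here is a purely hypothetical sketch of how it could look from a DM Python template; the archive-file type below does not exist today, it just mirrors what Terraform's archive_file data source provides:

    # Hypothetical sketch only: 'archive-file' is NOT an existing Deployment
    # Manager type; it illustrates the Terraform-like resource suggested above.
    def generate_config(context):
        return {
            'resources': [{
                'name': 'function-code',
                'type': 'archive-file',  # made-up type name
                'properties': {
                    'sourceDirectory': 'function_folder',
                    'destination': 'gs://my-bucket/function.zip',
                },
            }]
        }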

dinigo commented 5 years ago

Right now I'm using GitLab: bundling the function folder, providing it as an artifact, and copying it to a bucket with a Cloud Build pipeline.

bohdanyurov-gl commented 5 years ago

Please note that there are many other limitations, like text-only files, strict ASCII, etc. This is just a demo; you shouldn't use it as-is in your deployments.

ozydingo commented 4 years ago

To work around this, I split content into chunks:

(Note: Python 3 is assumed; see the `range` usage on the second line.)

    # `content` holds the base64-encoded bytes of the zipped function source.
    # 3500 is a multiple of 4, so each chunk is itself valid base64 and can be
    # decoded independently before being appended to the archive.
    chunk_length = 3500
    content_chunks = [content[ii:ii+chunk_length] for ii in range(0, len(content), chunk_length)]
    # use `>` first in case the file exists
    cmds = ["echo '%s' | base64 -d > /function/function.zip;" % (content_chunks[0].decode('ascii'))]
    # then use `>>` to append
    cmds += [
        "echo '%s' | base64 -d >> /function/function.zip;" % (chunk.decode('ascii'))
        for chunk in content_chunks[1:]
    ]

I then expanded the build steps by mapping cmds to an intermediate array:

    # `volumes` and `source_archive_url` are defined elsewhere in the template.
    zip_steps = [
        {
            'name': 'ubuntu',
            'args': ['bash', '-c', cmd],
            'volumes': volumes,
        } for cmd in cmds
    ]
    build_step = {
        'name': 'upload-function-code',
        'action': 'gcp-types/cloudbuild-v1:cloudbuild.projects.builds.create',
        'metadata': {
            'runtimePolicy': ['UPDATE_ON_CHANGE']
        },
        'properties': {
            'steps': zip_steps + [{
                'name': 'gcr.io/cloud-builders/gsutil',
                'args': ['cp', '/function/function.zip', source_archive_url],
                'volumes': volumes
            }],
            'timeout': '120s'
        }
    }

This modification allowed the deploy to succeed.

dinigo commented 4 years ago

Though my organization uses GCP extensively, we don't rely on DM anymore. Terraform it is.