Open dinigo opened 5 years ago
+1 would be nice to be able to upload complex functions without having to craft my own build process :-)
I know why this is designed the way it is. Every deploy should be easy to reproduce without depending on third-party code; you don't want a deploy to fail because of someone else's bug. The configuration should be as static as possible.
It's just not useful when the whole effort of the cloud is to migrate towards serverless / microservice-based solutions.
The simplest and coolest option would be to provide an "archive_file" resource, just as Terraform does.
(BTW, I find DM super useful. I just want it to cover the standard needs on its own, without depending on any other deployment/orchestration software.)
Right now I'm using GitLab: bundling the function folder, providing it as an artifact, and copying it to a bucket with a Cloud Build pipeline.
Please note that there are many other limitations, like text-only files, strict ASCII, etc. This is just a demo; you shouldn't use it as-is in your deployments.
To work around this, I split `content` into chunks (note the Python 3 expectation in `range` on the second line):
```python
# 3500 is a multiple of 4, so every chunk is independently valid base64
# and each resulting shell command stays safely under the 4k limit
chunk_length = 3500
content_chunks = [content[ii:ii + chunk_length]
                  for ii in range(0, len(content), chunk_length)]

# use `>` first to truncate the file in case it exists
cmds = ["echo '%s' | base64 -d > /function/function.zip;"
        % content_chunks[0].decode('ascii')]
# then use `>>` to append the remaining chunks
cmds += [
    "echo '%s' | base64 -d >> /function/function.zip;" % chunk.decode('ascii')
    for chunk in content_chunks[1:]
]
```
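For context, the snippet above assumes `content` already holds the base64-encoded bytes of the zipped function source (which is what the linked templates pipe through `base64 -d`). A minimal sketch of how that might be produced, with a hypothetical file name:

```python
import base64

# assumption: function.zip is the archived function source; `content`
# ends up as base64-encoded bytes, which is why each chunk is decoded
# with .decode('ascii') before being interpolated into a shell command
with open('function.zip', 'rb') as f:
    content = base64.b64encode(f.read())
```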
I then expanded the build steps by mapping `cmds` to an intermediate array:
```python
# one Cloud Build step per chunk command, all sharing the same volume
zip_steps = [
    {
        'name': 'ubuntu',
        'args': ['bash', '-c', cmd],
        'volumes': volumes,
    }
    for cmd in cmds
]
```
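Both the chunk steps and the upload step below reference a shared `volumes` list that isn't shown in the snippets. A plausible definition (the volume name is arbitrary) would be:

```python
# a named volume shared by all build steps, so the zip assembled by the
# `ubuntu` steps is still there when the gsutil step runs
volumes = [{'name': 'function-code', 'path': '/function'}]
```

Cloud Build mounts a named volume at the same path in every step that declares it, which is what lets the file survive across steps.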
```python
# a single Deployment Manager action that runs the chunk steps and then
# uploads the assembled zip to the bucket with gsutil
build_step = {
    'name': 'upload-function-code',
    'action': 'gcp-types/cloudbuild-v1:cloudbuild.projects.builds.create',
    'metadata': {
        'runtimePolicy': ['UPDATE_ON_CHANGE']
    },
    'properties': {
        'steps': zip_steps + [{
            'name': 'gcr.io/cloud-builders/gsutil',
            'args': ['cp', '/function/function.zip', source_archive_url],
            'volumes': volumes
        }],
        'timeout': '120s'
    }
}
```
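For context, here is roughly where all of this lands in a Deployment Manager Python template; `generate_config` is the standard entry point, while the surrounding wiring (and the elided function resource) is an assumption:

```python
def generate_config(context):
    # ... content chunking, volumes, zip_steps and build_step as above ...

    # the action uploads the archive; a cloudfunctions function resource
    # (elided here) would then reference the same source_archive_url
    return {'resources': [build_step]}
```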
This modification allowed the deploy to succeed.
Though my organization uses GCP extensively, we don't rely on DM anymore. Terraform it is.
When using any of the examples provided to deploy functions, the Python template inlines all the code (compressed and encoded) in a single command, as shown here:
https://github.com/GoogleCloudPlatform/deploymentmanager-samples/blob/bc2979c9a55c1d65b2626248426a5d29f243d5b4/examples/v2/cloud_functions/python/cloud_function.py#L38
And also here:
https://github.com/GoogleCloudPlatform/deploymentmanager-samples/blob/bc2979c9a55c1d65b2626248426a5d29f243d5b4/community/cloud-foundation/templates/cloud_function/upload.py#L79-L80
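Paraphrasing those templates (variable names assumed), the whole encoded archive ends up interpolated into a single shell command:

```python
# the whole base64-encoded zip is inlined into one command, so the
# command length grows linearly with the size of the function source
cmd = "echo '%s' | base64 -d > /function/function.zip" % content.decode('ascii')
```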
This works for the small three-line functions provided, but with anything bigger (a `main.py` file of less than 100 LOC) it produces an error, which means that the `cmd` variable shown above is longer than 4k characters. Deploying extra infrastructure to copy files, as suggested here https://github.com/GoogleCloudPlatform/deploymentmanager-samples/issues/40#issuecomment-339759328, produces a big overhead in development time and makes the deployment unnecessarily complex. You would have to write a:
On the other hand, I could copy the files by downloading them from the repository itself, as suggested here https://stackoverflow.com/a/49751362, but this would make the deployment GitLab-dependent. I would need a different template for each type of repository (and a Docker image providing the zip utility, yet another dependency).
Could there be an alternative for moving files with no extra template/config files?