scaleway / serverless-scaleway-functions

Plugin for Serverless Framework to allow users to deploy their serverless applications on Scaleway Functions
MIT License

Error while deploying container #94

Open milanjandric opened 2 years ago

milanjandric commented 2 years ago

I'm trying to run my first container. I first tried to do it manually, by pushing the image and creating the Container using the web console. After deploying, I only get "Container failed with: ." when hovering over the red circle.

Then I tried to deploy with the Serverless Framework, using just your container example: serverless create --path my-func --template-url https://github.com/scaleway/serverless-scaleway-functions/tree/master/examples/container

service: my-func
configValidationMode: off
useDotenv: true

provider:
  name: scaleway
  # Global environment variables - used in every function
  env:
    test: test
  # the path to the credentials file needs to be absolute
  scwToken: ${env:SCW_SECRET_KEY}
  scwProject: ${env:SCW_DEFAULT_PROJECT_ID}
  scwRegion: ${env:SCW_REGION}

plugins:
  - serverless-scaleway-functions

package:
  patterns:
    - '!node_modules/**'
    - '!.gitignore'
    - '!.git/**'

custom:
  containers:
    first:
      directory: my-container
      minScale: 1
      memoryLimit: 256
      maxScale: 2
      port: 8080
      # Local environment variables - used only in given function
      env:
        local: local

And I get the same error, just in a different format:

serverless deploy
Using credentials from system environment
Using credentials from system environment
Using credentials from system environment
Using credentials from system environment
Using credentials from system environment
Using credentials from system environment
Creating container first...
Building and pushing container first to: rg.fr-par.scw.cloud/funcscwmyfuncjqb8s2em/first:latest ...
Deploying Containers...
Waiting for container deployments, this may take multiple minutes...
Environment: darwin, node 18.1.0, framework 3.19.0, plugin 6.2.2, SDK 4.3.2
Docs:        docs.serverless.com
Support:     forum.serverless.com
Bugs:        github.com/serverless/serverless/issues

Error:
Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Error: Container failed with: .
    at manageError (/Users/milan/Documents/Projects/pitchflow/test-serverless/my-func/node_modules/serverless-scaleway-functions/shared/api/utils.js:20:11)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)

What am I doing wrong, or is there some problem on your side?

Also, a strange thing: when I look at the images in the Container Registry, they all have a size of 0 bytes.

thomas-tacquet commented 2 years ago

Hello 👋

I inspected your image: it is built for arm64, but you need to build it for the amd64 platform 👍
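
As a quick check on your side, the platform a local image was built for can be read with the Docker CLI (using the image name from your deploy log as an example):

```sh
# Print the OS/architecture the local image targets; Serverless Containers need linux/amd64
docker image inspect --format '{{.Os}}/{{.Architecture}}' rg.fr-par.scw.cloud/funcscwmyfuncjqb8s2em/first:latest
```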

thomas-tacquet commented 2 years ago

There is an issue with the Docker package used for the build; it's currently not possible to define the target platform: https://github.com/apocas/dockerode/issues/648

For now, I recommend building and pushing your image to the Container Registry with the official Docker client, until we publish a fix for this issue.
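
A minimal sketch of that workaround, using the registry path from your deploy log and the my-container directory from your config (the "nologin" username and secret-key password are an assumption to check against the registry docs, and docker buildx must be available):

```sh
# Authenticate against the Scaleway Container Registry
docker login rg.fr-par.scw.cloud -u nologin -p "$SCW_SECRET_KEY"

# Build explicitly for linux/amd64 (even on an arm64 / Apple Silicon machine) and push
docker buildx build --platform linux/amd64 \
  -t rg.fr-par.scw.cloud/funcscwmyfuncjqb8s2em/first:latest \
  --push \
  my-container
```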

milanjandric commented 2 years ago

Thanks @thomas-tacquet, I managed to build with the default Docker client and push successfully :)

Maybe off topic, but I just started using Scaleway, so I have one more question.

My idea was to deploy a service that runs some long-running processes (encoding a video file using ffmpeg). It will not be used very often, so this seemed like a perfect use case since it can scale automatically. But it seems that they are all limited to 900s max. Is there an option to increase this time limit?

thomas-tacquet commented 2 years ago

900s may not be enough for your use case, but what I suggest is to use Serverless Containers to orchestrate your encoding workflow.

For example, you can fetch your media using a container, store it in an S3 bucket, then create instances on the fly (using the API) to process your media files faster (if you need to encode 8K video files, for example, you can create an instance with high-end encoding capacity).
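
As a rough sketch of the "create instances on the fly" step (the endpoint shape, field names, zone, commercial type and image ID below are assumptions to double-check against the Instances API reference; jq is used to extract the server ID):

```sh
# Create an instance from a pre-built image...
SERVER_ID=$(curl -s -X POST "https://api.scaleway.com/instance/v1/zones/fr-par-1/servers" \
  -H "X-Auth-Token: $SCW_SECRET_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "name": "encoder-worker",
        "commercial_type": "GP1-S",
        "image": "<your-image-id>",
        "project": "'"$SCW_DEFAULT_PROJECT_ID"'"
      }' | jq -r '.server.id')

# ...then power it on (creation alone does not start it)
curl -s -X POST "https://api.scaleway.com/instance/v1/zones/fr-par-1/servers/$SERVER_ID/action" \
  -H "X-Auth-Token: $SCW_SECRET_KEY" \
  -H "Content-Type: application/json" \
  -d '{"action": "poweron"}'
```

The server ID returned by the first call is also what you would use later to power the instance off or terminate it once the job is done.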

milanjandric commented 2 years ago

Thanks @thomas-tacquet, yes, that seems like a better approach. On AWS infrastructure, I have been solving similar issues with Lambda + Batch operations.

The instances you are referring to are https://console.scaleway.com/instance/servers ? They can't use Docker images from the registry directly. I would have to manually set up the OS: install Docker, pull the Docker image, run the image, etc., and then save it as an image.

Then over the API I can create instances using this image.

Is this your idea? Do you have some example code for this? How would I stop the instance... Can an instance stop itself? :)
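
One way to sketch the "can the instance stop itself" part: the job script running inside the instance can call the Instances API on its own server once the encoding finishes. SERVER_ID, SCW_ZONE and the token here are assumptions, for example injected through user data when the instance is created:

```sh
# Last step of the encoding job: ask the API to stop this instance
# (use "terminate" instead of "poweroff" to delete it entirely)
curl -s -X POST "https://api.scaleway.com/instance/v1/zones/$SCW_ZONE/servers/$SERVER_ID/action" \
  -H "X-Auth-Token: $SCW_SECRET_KEY" \
  -H "Content-Type: application/json" \
  -d '{"action": "poweroff"}'
```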