containers / podlet

Generate Podman Quadlet files from a Podman command, compose file, or existing object
https://crates.io/crates/podlet
Mozilla Public License 2.0

Error when converting docker compose file #74

Closed · alejones closed 5 months ago

alejones commented 5 months ago

Current Behavior:

I'm trying to convert a docker compose file provided by the Label Studio ML backend into quadlets. I can get it to work with podman-compose up, but it fails when running podlet compose docker-compose.yml.

Expected Behavior:

I'd like to get one or more .container files that allow me to use systemd to start everything in the compose file.

Steps To Reproduce:

I've downloaded and decompressed podlet from the latest release.

You can find the docker compose file I'm running here: https://github.com/HumanSignal/label-studio-ml-backend/blob/master/label_studio_ml/examples/grounding_dino/docker-compose.yml

I've updated it so it looks like this:

version: "3.11"

services:
  grounding_dino:
    container_name: grounding_dino
    image: docker.io/heartexlabs/label-studio-ml-backend:grnddino-master
    build:
      context: .
      args:
        TEST_ENV: ${TEST_ENV}
# Increase the memory limit if you USE_SAM=true on CPU machine
#    deploy:
#      resources:
#        reservations:
#          memory: 16G
    environment:
      - MODEL_DIR=/data/models
      - WORKERS=2
      - THREADS=4
      - LOG_LEVEL=DEBUG

      # Add these variables if you want to access the images stored in Label Studio
      - LABEL_STUDIO_HOST=10.0.0.10
      - LABEL_STUDIO_ACCESS_TOKEN=myaccesstoken
      - LABEL_STUDIO_LOCAL_FILES_DOCUMENT_ROOT=${LABEL_STUDIO_LOCAL_FILES_DOCUMENT_ROOT:-/label-studio/files/}

      # use these if you want to use segment anything instead of bounding box predictions from input text prompts
      - USE_SAM=false  # if you want to automatically generate segment anything model predictions
      - USE_MOBILE_SAM=false # whether you want to use a more efficient, yet a bit less accurate, version of the segment anything model

      - BOX_THRESHOLD=0.30
      - TEXT_THRESHOLD=0.25
 # Uncomment the following lines if you want to use GPU
      - NVIDIA_VISIBLE_DEVICES=all
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
    ports:
      - "9090:9090"
    volumes:
      - "./data/ml-backend:/data"
      - "./prompt.txt:/app/prompt.txt"
      - ./dino.py:/app/dino.py
      - ~/network-drive/my/folders:/label-studio/files

In Ubuntu server 24.04 I'm running this command:

./podlet compose docker-compose.yml

And getting this result

Error: 
   0: File `docker-compose.yml` is not a valid compose file
   1: services.grounding_dino.deploy.resources.reservations: unknown field `devices`, expected `cpus` or `memory` at line 40 column 11

Location:
   src/cli/compose.rs:61

  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ BACKTRACE ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
                                ⋮ 2 frames hidden ⋮                               
   3: podlet::cli::compose::from_file_or_stdin::h5561bfcdb898137c
      at <unknown source file>:<unknown line>
   4: podlet::cli::Cli::try_into_files::hbc275cc23fd0dec9
      at <unknown source file>:<unknown line>
   5: podlet::cli::Cli::print_or_write_files::hbe42a64ece29dadb
      at <unknown source file>:<unknown line>
   6: podlet::main::h15b2c6373553fb80
      at <unknown source file>:<unknown line>
   7: std::sys_common::backtrace::__rust_begin_short_backtrace::hd50a620736e2010b
      at <unknown source file>:<unknown line>
   8: std::rt::lang_start::{{closure}}::he29e0212e49dea27
      at <unknown source file>:<unknown line>
   9: core::ops::function::impls::<impl core::ops::function::FnOnce<A> for &F>::call_once::hd95060ecd5e1ca24
      at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/core/src/ops/function.rs:284
  10: std::panicking::try::do_call::h6e8cf51db32a6e4b
      at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/panicking.rs:552
  11: std::panicking::try::h3a52eefe24fe3c29
      at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/panicking.rs:516
  12: std::panic::catch_unwind::h24c28c23c02c3841
      at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/panic.rs:142
  13: std::rt::lang_start_internal::{{closure}}::h705d3c9cbc06ef47
      at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/rt.rs:148
  14: std::panicking::try::do_call::ha21f52ba13158470
      at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/panicking.rs:552
  15: std::panicking::try::h5581346bf6aeb1f8
      at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/panicking.rs:516
  16: std::panic::catch_unwind::h7919645a6b72e25b
      at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/panic.rs:142
  17: std::rt::lang_start_internal::h12de51168669836e
      at /rustc/82e1608dfa6e0b5569232559e3d385fea5a93112/library/std/src/rt.rs:148
  18: std::rt::lang_start::hca3f7ba2255c9d75
      at <unknown source file>:<unknown line>
  19: __libc_start_main<unknown>
      at <unknown source file>:<unknown line>
  20: _start<unknown>
      at <unknown source file>:<unknown line>

Run with COLORBT_SHOW_HIDDEN=1 environment variable to disable frame filtering.
Run with RUST_BACKTRACE=full to include source snippets.

Versions:

podman version: 4.9.3
Podlet version: 0.2.4 - 2024-01-30
OS: Ubuntu Server 24.04

Anything else:

I've used Podlet to create a service file from a podman run command. This is the first time I'm trying to convert a docker-compose file.

k9withabone commented 5 months ago

Thanks for the detailed write-up! This particular error is fixed by #73. However, you will still get an error as deploy is not supported by podlet. deploy is meant for use by Docker Swarm and other production deployments that use Compose. I believe docker compose up will silently ignore the deploy section of a service. I decided that podlet would be up front about what it doesn't support instead of silently ignoring things. That way the user can decide how to handle it, for example, by commenting out the offending section and possibly replacing it with something else that gets the behavior they want.
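
For illustration, commenting out the GPU reservation in the compose file above would look roughly like this (a sketch of that one edit only, not verified podlet input or output):

# deploy:
#   resources:
#     reservations:
#       devices:
#         - driver: nvidia
#           count: 1
#           capabilities: [gpu]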

alejones commented 5 months ago

Thanks for the quick response! I'll try commenting out the deploy section.

Do you know if there is anything else I need to do to get the gpu working?

k9withabone commented 5 months ago

> Do you know if there is anything else I need to do to get the gpu working?

I've never tried to use a GPU in a container before. I imagine you need to mount the device into the container; see podman run --device and the devices section of the Compose service specification. You should ask the people who created the image/software for help.
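
As a rough sketch, that device-mount approach could be expressed in the generated .container quadlet file like this (the /dev/nvidia* device nodes are an assumption about a typical NVIDIA driver install, not something podlet generates):

[Container]
# Pass the NVIDIA device nodes into the container (host-specific paths, adjust as needed)
AddDevice=/dev/nvidia0
AddDevice=/dev/nvidiactl
AddDevice=/dev/nvidia-uvm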

k9withabone commented 5 months ago

I just learned about podman run --gpus, added in Podman v5.0.0. So you could add PodmanArgs=--gpus all to the generated .container quadlet file.
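
For example, the relevant part of the generated .container file might then look roughly like this (a sketch based on the compose file above, not actual podlet output; --gpus requires Podman v5.0.0 or later):

[Container]
ContainerName=grounding_dino
Image=docker.io/heartexlabs/label-studio-ml-backend:grnddino-master
PublishPort=9090:9090
PodmanArgs=--gpus all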

alejones commented 5 months ago

Awesome, thank you for finding that!