containers / ai-lab-recipes

Examples for building and running LLM services and applications locally with Podman
Apache License 2.0

Add environment variables for ilab container and increase shm size for vllm #722

Closed relyt0925 closed 1 month ago

relyt0925 commented 1 month ago

Include ILAB_GLOBAL_CONFIG, VLLM_LOGGING_LEVEL, and NCCL_DEBUG as environment variables when starting the ilab container. Also set the shared memory size to 10G to enable vLLM execution. Resolves: https://github.com/containers/ai-lab-recipes/issues/721
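
In practice the change amounts to passing the extra environment variables and the larger shared memory size on the container invocation, roughly like the sketch below. The image name, config path, and log-level values are illustrative assumptions, not taken from the PR diff:

```bash
# Minimal sketch of starting the ilab container with the new settings.
# Image name, config path, and log levels below are placeholders.
podman run --rm -it \
  --device nvidia.com/gpu=all \          # expose GPUs via CDI (assumed setup)
  --shm-size 10g \                        # larger /dev/shm needed by vLLM
  -e ILAB_GLOBAL_CONFIG=/config/config.yaml \
  -e VLLM_LOGGING_LEVEL=DEBUG \
  -e NCCL_DEBUG=INFO \
  quay.io/example/instructlab:latest      # hypothetical image reference
```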

rhatdan commented 1 month ago

LGTM