Closed jueming0312 closed 1 month ago
[rank0]: Error while creating shared memory segment /dev/shm/nccl-TO0hFk (size 9637888)
You don't have enough shm in the container; see https://docs.vllm.ai/en/latest/serving/deploying_with_docker.html .
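For reference, the vLLM Docker docs address this by raising the container's shared-memory limit (Docker defaults to 64 MB of /dev/shm, which is too small for NCCL). A minimal sketch, where the image tag and the 10g size are illustrative choices, not values from this thread:

```shell
# Raise /dev/shm for the container; the default 64 MB is not enough
# for NCCL's inter-process communication buffers.
docker run --gpus all \
    --shm-size=10g \
    vllm/vllm-openai:latest \
    --model internlm/internlm2_5-7b-chat
```

Alternatively, `--ipc=host` shares the host's /dev/shm with the container, which avoids picking a size.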
Hello, I have a question about this issue. Will the engine still use shared memory even if my GPU memory is more than enough?
Yes. Shared memory (/dev/shm) is used for inter-process communication between workers; it is unrelated to your GPU memory, so having plenty of GPU memory does not help here.
Your current environment
How would you like to use vllm
I'm running the vllm image in Kubernetes, and this error message appears when loading the internlm/internlm2_5-7b-chat model.
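In Kubernetes there is no `--shm-size` flag; the usual workaround is to mount a memory-backed `emptyDir` volume at /dev/shm. A minimal sketch, where the pod/container names and the 10Gi size limit are illustrative assumptions:

```yaml
# Pod fragment: give the vLLM container a larger /dev/shm by
# mounting a memory-backed emptyDir volume over it.
apiVersion: v1
kind: Pod
metadata:
  name: vllm-server
spec:
  containers:
    - name: vllm
      image: vllm/vllm-openai:latest
      args: ["--model", "internlm/internlm2_5-7b-chat"]
      volumeMounts:
        - name: shm
          mountPath: /dev/shm
  volumes:
    - name: shm
      emptyDir:
        medium: Memory     # backed by RAM (tmpfs)
        sizeLimit: 10Gi    # illustrative; size to your workload
```

Note that the `Memory` medium counts against the container's memory limit, so budget for it in the pod's resource requests.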