RUCAIBox / LLMBox

A comprehensive library for implementing LLMs, including a unified training pipeline and thorough model evaluation.
MIT License

vllm_vllm_gpu_memory_utilization parameter has no effect #271

Closed xansar closed 1 month ago

xansar commented 1 month ago

When setting different values for the vllm_vllm_gpu_memory_utilization parameter, the amount of GPU memory pre-allocated by vLLM does not change. I found that at utilization/utils/arguments.py#L371, vllm_vllm_gpu_memory_utilization is hardcoded to 0.9. After commenting out that line and changing the parameter's default value from None to 0.9, the pre-allocated GPU memory can be controlled via vllm_vllm_gpu_memory_utilization.
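A minimal sketch of the bug pattern described above, not LLMBox's actual code: an argument dataclass whose post-processing step unconditionally overwrites the field, so any user-supplied value is silently discarded. The class and function names here are hypothetical; only the field name mirrors the report.

```python
from dataclasses import dataclass


@dataclass
class ModelArguments:
    # Hypothetical minimal reproduction of the reported setup.
    # The fix changes the default from None to 0.9 so the raw value
    # can be passed straight through to vLLM.
    vllm_gpu_memory_utilization: float = 0.9


def post_process_buggy(args: ModelArguments) -> ModelArguments:
    # Before the fix: the field is hardcoded here, so the value the
    # user passed on the command line is ignored.
    args.vllm_gpu_memory_utilization = 0.9
    return args


def post_process_fixed(args: ModelArguments) -> ModelArguments:
    # After the fix: the overwrite is removed and the user's value
    # is respected.
    return args


buggy = post_process_buggy(ModelArguments(vllm_gpu_memory_utilization=0.5))
fixed = post_process_fixed(ModelArguments(vllm_gpu_memory_utilization=0.5))
print(buggy.vllm_gpu_memory_utilization)  # 0.9 — user setting lost
print(fixed.vllm_gpu_memory_utilization)  # 0.5 — user setting kept
```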

huyiwen commented 1 month ago

This has already been fixed.