Open · thanhnguyentung95 opened 3 hours ago
Due to the following code in vLLM 0.6.2, we cannot serve LoRA adapters on a per-request basis, as it does not support LoRA and multimodal inputs simultaneously:

Do you have a plan to upgrade the vLLM version for Aria?
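For reference, this is roughly the per-request LoRA usage we would like to support for Aria. It is only a sketch based on vLLM's documented `LoRARequest` API; the adapter name and path, the prompt, and the engine flags below are placeholders, not a tested configuration.

```python
from PIL import Image

from vllm import LLM, SamplingParams
from vllm.lora.request import LoRARequest

# Load Aria with LoRA support enabled (flags here are illustrative placeholders).
llm = LLM(
    model="rhymes-ai/Aria",
    enable_lora=True,
    trust_remote_code=True,
)

sampling_params = SamplingParams(temperature=0.0, max_tokens=128)
image = Image.open("example.jpg")  # placeholder image

# Per-request LoRA: each request names its own adapter via LoRARequest.
# With vLLM 0.6.2 this combination is rejected, because LoRA and
# multimodal inputs cannot be used in the same request.
outputs = llm.generate(
    {
        # Placeholder prompt; a real request must follow Aria's chat
        # template and image-token conventions.
        "prompt": "Describe the image.",
        "multi_modal_data": {"image": image},
    },
    sampling_params,
    lora_request=LoRARequest("my_adapter", 1, "/path/to/lora_adapter"),
)
print(outputs[0].outputs[0].text)
```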
@thanhnguyentung95 Yes, we can upgrade vLLM. I'll take a look at this.

Hi @thanhnguyentung95, you can try upgrading vLLM in your local environment and continue development first. There are several indirect dependencies (such as PyTorch and transformers), so I'll need some extra time to test the upgrade thoroughly and make sure it doesn't break any existing Aria functionality.