OpenLLMAI/OpenRLHF
An Easy-to-use, Scalable and High-performance RLHF Framework (70B+ PPO Full Tuning & Iterative DPO & LoRA & Mixtral)
https://openrlhf.readthedocs.io/
Apache License 2.0
fix: adjust vllm monkey patch for vllm>=0.2.7 #215 (Closed)
wuxibin89 closed this 4 months ago
wuxibin89 commented 4 months ago:
All tests passed:
- vllm==0.2.3: vllm_tensor_parallel_size=1 / vllm_tensor_parallel_size=2
- vllm==0.3.1: vllm_tensor_parallel_size=1 / vllm_tensor_parallel_size=2
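
For context, below is a minimal sketch of how a monkey patch can be gated on the installed vllm version, which is the general pattern a fix like this implies. The patch function names (`patch_pre_027`, `patch_027_plus`) are hypothetical placeholders for illustration, not OpenRLHF's actual patch code:

```python
# Hypothetical sketch: selecting a monkey patch based on the installed
# vllm version. This is NOT the PR's actual code; the real patch targets
# live in OpenRLHF's vllm integration.
from packaging.version import parse as parse_version

import vllm


def patch_pre_027():
    """Placeholder for the patch variant used with vllm < 0.2.7 (hypothetical)."""


def patch_027_plus():
    """Placeholder for the patch variant used with vllm >= 0.2.7 (hypothetical)."""


def apply_vllm_patch():
    # The PR title indicates the patch had to be adjusted for vllm>=0.2.7,
    # presumably because vllm's internals changed at that version, so the
    # patch is chosen by comparing against the installed version string.
    if parse_version(vllm.__version__) >= parse_version("0.2.7"):
        patch_027_plus()
    else:
        patch_pre_027()


apply_vllm_patch()
```

Gating on `vllm.__version__` this way keeps a single code path working across both tested releases (0.2.3 and 0.3.1) without pinning the dependency.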