OpenLLMAI / OpenRLHF

An Easy-to-use, Scalable and High-performance RLHF Framework (70B+ PPO Full Tuning & Iterative DPO & LoRA & Mixtral)
https://openrlhf.readthedocs.io/
Apache License 2.0

Will 2 x GPU setups be supported? #307

Open llmlocal opened 1 month ago

llmlocal commented 1 month ago

I appreciate the team's hard work and understand the design decision to separate the 4 models onto separate GPUs. For small lab experiments, is it possible to leverage a 2 x 4090 configuration?

hijkzzz commented 1 month ago

You could try QLoRA + 8B models.
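A rough back-of-envelope check of why this fits: QLoRA quantizes the frozen base model to 4-bit, so an 8B-parameter model needs only about 4 GB for its weights, leaving room on a 24 GB RTX 4090 for LoRA adapters, activations, and optimizer state. The helper below is a minimal sketch of that arithmetic (the function name and constants are illustrative, not part of OpenRLHF):

```python
def quantized_weight_memory_gb(n_params_billions: float, bits: int = 4) -> float:
    """Rough memory estimate (GB) for a base model's weights after quantization.

    Counts weight storage only; LoRA adapters, activations, and optimizer
    state add overhead on top of this figure.
    """
    return n_params_billions * 1e9 * bits / 8 / 1e9


# An 8B model at 4-bit precision: ~4 GB of weights,
# comfortably within a single 24 GB RTX 4090.
print(quantized_weight_memory_gb(8))      # 4.0
# The same model at full fp16 precision would need ~16 GB for weights alone.
print(quantized_weight_memory_gb(8, 16))  # 16.0
```

This is why full PPO tuning (which keeps four separate full-precision models resident) needs more GPUs, while QLoRA on a small model stays within a two-card budget.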