OpenLLMAI / OpenRLHF

An Easy-to-use, Scalable and High-performance RLHF Framework (70B+ PPO Full Tuning & Iterative DPO & LoRA & Mixtral)
https://openrlhf.readthedocs.io/
Apache License 2.0

Added GPU memory specs and clarifications, fixed typo. #298

Closed KT313 closed 1 month ago

KT313 commented 1 month ago

Fixed "NVIDIA A800" to "NVIDIA A100".
Added GPU memory clarification. I assumed you meant 80GB, since that is mentioned in README.md line 34.

KT313 commented 1 month ago

Actually, I didn't know the A800 GPU existed, so I reverted that change, but I added the 40GB memory info since it's not a well-known GPU. Since an 80GB version of the card is also available, if you were using the 80GB version instead, please change it in the commit I made.
I assumed the A800s are 40GB because you trained the 70B model on 32 of them, whereas you mentioned it being trainable on 16x A100 (presumably 80GB), so this made the most sense.

hijkzzz commented 1 month ago

The A800 is a custom version of the A100 made for the Chinese market.