Closed: THINK2TRY closed this issue 4 months ago
I suggest you build a new container image that includes vLLM and torch 2.1.2. Contributions to OpenRLHF are also welcome.
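As a minimal sketch of that suggestion: the base image tag and version pins below are assumptions, not a tested recipe; adjust them to your CUDA setup and the vLLM release you target.

```dockerfile
# Hedged sketch: a container image with the torch version vLLM requires.
# The base image tag and pins are assumptions; adjust to your environment.
FROM nvcr.io/nvidia/cuda:12.1.1-devel-ubuntu22.04

RUN apt-get update && \
    apt-get install -y python3 python3-pip && \
    rm -rf /var/lib/apt/lists/*

# Install the torch version vLLM pins first, then vLLM itself,
# so pip does not pull in a conflicting torch build.
RUN pip3 install torch==2.1.2
RUN pip3 install vllm
```

Building from a plain CUDA image sidesteps the NGC PyTorch containers, whose preinstalled torch (e.g. a 2.2.x nightly in 23.12 / 24.01) conflicts with vLLM's pin.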
related issue: https://github.com/vllm-project/vllm/pull/2804
@hijkzzz Thanks for your reply. I will try vLLM with the upgraded torch 2.2.0. By the way, have you ever compared the performance of NeMo-Aligner and OpenRLHF? It seems NeMo-Aligner has no inference optimization, but I'm still curious about the efficiency comparison.
Inference with NeMo Megatron is very slow.
We added Dockerfile in https://github.com/OpenLLMAI/OpenRLHF/commit/8773cd137a8146e0312bf5543577deaa16cd14b7
Hi, thanks for your wonderful work! Is there a recommended NGC version for running this framework? I tried to build vLLM in NGC 23.12 / 24.01, but the build reports that vLLM is incompatible because it requires torch 2.1.2. Is there any recommendation or workaround for this problem?
Many thanks!
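To confirm that the failure above is a torch-version mismatch rather than something else, one can compare the container's installed torch against vLLM's pin before attempting the build. This is a hedged sketch; the `REQUIRED` pin (2.1.2, as reported in this thread) and the helper name `torch_matches` are illustrative, not part of vLLM's API.

```python
# Hedged sketch: check whether the installed torch matches the version
# vLLM pins (torch 2.1.2 at the time of this thread).
from importlib.metadata import version, PackageNotFoundError

REQUIRED = "2.1.2"  # assumption: vLLM's torch pin from this thread


def torch_matches(required: str = REQUIRED) -> bool:
    """Return True if the installed torch version starts with `required`."""
    try:
        return version("torch").startswith(required)
    except PackageNotFoundError:
        # torch is not installed at all
        return False


if __name__ == "__main__":
    if torch_matches():
        print("torch matches vLLM's pin; pip install vllm should succeed")
    else:
        print("torch mismatch or missing; expect the incompatibility error")
```

Running this inside NGC 23.12 / 24.01 would show a non-matching torch, which is consistent with the build error reported above.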