allenai / open-instruct

Apache License 2.0

Non-determinism in evaluate.predict when using vllm #145

Closed · jacob-morrison closed this 3 days ago

jacob-morrison commented 2 months ago

Conversation in slack: https://allenai.slack.com/archives/C06GS4HAWJV/p1712773278326539

Similar issue: https://github.com/vllm-project/vllm/issues/966

We saw non-deterministic model responses when doing greedy decoding with evaluate/predict.py, and disabling vLLM fixed the issue. Two potential fixes: upcast the models to float32, or upgrade vLLM to a later version.
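A hedged sketch of why upcasting to float32 can help: half-precision floating-point addition is not associative, so if a serving engine's reduction order varies between runs (e.g. due to batching or kernel scheduling), fp16 logits can differ even under greedy decoding. The NumPy snippet below is an illustration of that rounding effect, not a reproduction of the vLLM bug; the specific values are chosen only to make the rounding visible.

```python
import numpy as np

# Summing the same values in two orders. In fp16, integers above 2048
# are spaced 2 apart, so 2048 + 1 rounds back down to 2048.
vals = [2048.0, 1.0, 1.0]

def ordered_sum(xs, dtype):
    total = dtype(0.0)
    for x in xs:
        total = dtype(total + dtype(x))  # round after every addition
    return float(total)

fp16_fwd = ordered_sum(vals, np.float16)        # (2048 + 1) + 1 -> 2048
fp16_rev = ordered_sum(vals[::-1], np.float16)  # (1 + 1) + 2048 -> 2050
fp32_fwd = ordered_sum(vals, np.float32)
fp32_rev = ordered_sum(vals[::-1], np.float32)

print(fp16_fwd, fp16_rev)  # order changes the fp16 result
print(fp32_fwd, fp32_rev)  # fp32 is exact here in both orders
```

If two near-tied logits straddle a rounding boundary like this, argmax (greedy) decoding can flip between tokens across runs, which matches the symptom in this issue.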