EleutherAI / gpt-neox

An implementation of model-parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries.
https://www.eleuther.ai/
Apache License 2.0

Add KTO training #1244

Closed dmahan93 closed 2 months ago

dmahan93 commented 4 months ago

Marked as a draft because any comments/requests on https://github.com/EleutherAI/gpt-neox/pull/1242 will likely apply here as well.
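
For context, KTO (Kahneman-Tversky Optimization) trains on per-example desirable/undesirable labels rather than paired preferences. The sketch below is a minimal illustration of the KTO loss, not the implementation in this PR; the function and argument names (`kto_loss`, `policy_logps`, `ref_logps`, `is_desirable`) are illustrative assumptions, and the KL reference point is simplified to a batch-level estimate.

```python
import torch

def kto_loss(policy_logps, ref_logps, is_desirable, beta=0.1,
             lambda_d=1.0, lambda_u=1.0):
    """Simplified sketch of the KTO objective (Ethayarajh et al., 2024).

    policy_logps / ref_logps: per-example sequence log-probs under the
    policy and the frozen reference model.
    is_desirable: bool tensor marking which completions are desirable.
    """
    # Implied reward: log-ratio between policy and reference model.
    rewards = policy_logps - ref_logps

    # Reference point: here a batch-level KL estimate, clamped at zero and
    # detached; the paper estimates it from mismatched prompt/completion pairs.
    kl = rewards.mean().clamp(min=0).detach()

    # Desirable examples are pushed above the reference point,
    # undesirable ones below it.
    desirable_loss = lambda_d * (1 - torch.sigmoid(beta * (rewards - kl)))
    undesirable_loss = lambda_u * (1 - torch.sigmoid(beta * (kl - rewards)))

    loss = torch.where(is_desirable, desirable_loss, undesirable_loss)
    return loss.mean()
```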