https://github.com/EleutherAI/gpt-neox/issues/1305 shows that you can set the bf16 config without setting the precision to bfloat16, which causes an obscure crash. Config validation should automatically set the precision when it is missing, or fail fast with an assertion when the two settings conflict.
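A minimal sketch of the proposed validation, assuming the config is a plain dict with `bf16` and `precision` keys as described in the issue; the helper name `validate_precision` is hypothetical, not part of the actual gpt-neox API:

```python
# Hypothetical validation helper: reconcile the bf16 block with the
# precision field instead of letting a mismatch crash obscurely later.
def validate_precision(config: dict) -> dict:
    if config.get("bf16") is not None:
        precision = config.get("precision")
        if precision is None:
            # Precision is missing: set it automatically from the bf16 block.
            config["precision"] = "bfloat16"
        else:
            # Precision is present but conflicts: fail fast with an assert.
            assert precision == "bfloat16", (
                f"bf16 config requires precision='bfloat16', got {precision!r}"
            )
    return config
```

Running this at config-load time surfaces the conflict immediately with a clear message, rather than deep inside the training loop.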