EleutherAI / gpt-neox

An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries
https://www.eleuther.ai/
Apache License 2.0

Make monitors consistent #1286

Closed Quentin-Anthony closed 2 months ago

Quentin-Anthony commented 2 months ago
  1. Separate the wandb config out of local_setup.yml into a new local_setup_wandb.yml, making local_setup.yml tensorboard-only.
  2. Update the wording throughout the README to reflect that we support more monitoring backends than just wandb, and to make the monitoring sections more consistent with one another.
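A split along these lines might look like the following sketch. The key names are illustrative, based on the monitor options GPT-NeoX exposes (`tensorboard_dir`, `use_wandb`, `wandb_host`, `wandb_project`); confirm the exact keys against the files in `configs/` before relying on them:

```yaml
# configs/local_setup.yml — tensorboard-only (illustrative sketch)
"log_dir": "logs"
"tensorboard_dir": "tensorboard"
```

```yaml
# configs/local_setup_wandb.yml — wandb settings split out (illustrative sketch)
"use_wandb": true
"wandb_host": "https://api.wandb.ai"
"wandb_project": "neox"
```

Since NeoX merges all config files passed on the command line, a run that wants wandb logging would simply pass both files, while tensorboard-only runs pass just local_setup.yml.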

@Lothiraldan -- FYI