jzhang38/TinyLlama
The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
Apache License 2.0
7.3k stars · 425 forks
TPU Pretraining #151
Closed — kathir-ks closed 5 days ago
kathir-ks commented 5 months ago
How can I train the model using TPUs?