jzhang38 / TinyLlama

The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
Apache License 2.0

How can we enable continuous learning with the TinyLlama model? #113

Open TapendraBaduwal opened 6 months ago

TapendraBaduwal commented 6 months ago

If we merge the adapters, will this work for continuous learning? I expect the newly trained model to also produce correct output for previously trained data.