togethercomputer / OpenChatKit


Example script for continued pre-training? #162

Open jphme opened 1 year ago

jphme commented 1 year ago

First of all, many thanks for the release of LLaMA-2-7B-32K and for your valuable contributions!

I appreciate that you provide example scripts for fine-tuning; however, the part most interesting to me, the continued pre-training mentioned in the blog post, is missing.

Would it be possible to provide that script as well? Many thanks in advance and all the best!
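
In the meantime, here is a minimal sketch of what continued pre-training (plain causal-LM training on raw text) could look like with the HuggingFace `Trainer`. This is not the official OpenChatKit recipe; the dataset, block size, and hyperparameters below are placeholders, and `trust_remote_code=True` is assumed to be needed for the Together model's custom attention code.

```python
# Sketch of continued pre-training on raw text (causal LM objective).
# Not the official OpenChatKit script; dataset and hyperparameters are placeholders.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "togethercomputer/LLaMA-2-7B-32K"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)

# Placeholder corpus; swap in your own long-document dataset.
raw = load_dataset("wikitext", "wikitext-103-raw-v1", split="train")

block_size = 4096  # assumption: shorter than 32k to fit in memory; adjust as needed

def tokenize(batch):
    return tokenizer(batch["text"])

def group_texts(examples):
    # Concatenate all tokens, then split into fixed-size blocks for causal LM training.
    concatenated = sum(examples["input_ids"], [])
    total = (len(concatenated) // block_size) * block_size
    return {"input_ids": [concatenated[i : i + block_size] for i in range(0, total, block_size)]}

tokenized = raw.map(tokenize, batched=True, remove_columns=raw.column_names)
lm_dataset = tokenized.map(group_texts, batched=True, remove_columns=tokenized.column_names)

args = TrainingArguments(
    output_dir="continued-pretrain",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=16,
    learning_rate=2e-5,
    num_train_epochs=1,
    bf16=True,
    logging_steps=10,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=lm_dataset,
    # mlm=False makes the collator copy input_ids into labels for next-token prediction.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

For real 32k-context continued pre-training you would likely need FlashAttention, gradient checkpointing, and model parallelism rather than this single-GPU setup, which is why an official script would still be very helpful.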

lllyyyqqq commented 1 year ago

Hi, I am also interested in the pre-training part. A script would be great. Also, a quick question: what was the sequence length setting during pre-training, 4k or 32k?
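
One way to at least check the configured context window of the released checkpoint (assuming `max_position_embeddings` reflects the training sequence length, which only the authors can confirm):

```python
# Inspect the published config of the released checkpoint.
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained("togethercomputer/LLaMA-2-7B-32K", trust_remote_code=True)
print(cfg.max_position_embeddings)  # expected to report 32768 for the 32K variant
```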

csris commented 1 year ago

@zhangce @LorrinWWW, can you please comment?