First of all, many thanks for the release of Llama 2 7B 32K and your valuable contributions!
It's appreciated that you provide example scripts for fine-tuning; however, the part most interesting to me, the continued pre-training mentioned in the blog post, is missing.
Would it be possible to provide that script as well? Many thanks in advance, and all the best!
Hi, I am also interested in the pre-training part; a script would be great. A quick question as well: what sequence length was used during pre-training, 4k or 32k?