ymcui / Chinese-LLaMA-Alpaca-2

Chinese LLaMA-2 & Alpaca-2 LLMs (phase-2 project) with 64K long-context models
Apache License 2.0

Questions about Secondary Pretraining strategy #425

Closed: staticpunch closed this issue 9 months ago

staticpunch commented 9 months ago

Check before submitting issues

Type of Issue

Model training and fine-tuning

Base Model

Others

Operating System

Linux

Describe your issue in detail

I'm planning to do secondary pretraining of Llama-2-7b on my language with a node of 8x H100 80GB GPUs. Given that I have enough time and resources, do you recommend:
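For context, below is a minimal sketch of what a secondary (continued) pretraining run for Llama-2-7b could look like on such a node. It uses plain HuggingFace Transformers rather than this project's own training scripts, and the corpus path, sequence length, batch size, and learning rate are illustrative placeholders, not recommendations from the maintainers:

```python
# Hedged sketch: continued (secondary) pretraining of Llama-2-7b with plain
# HuggingFace Transformers. Paths and hyperparameters are illustrative only.
import torch
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "meta-llama/Llama-2-7b-hf"
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # Llama has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16)

# Plain-text corpus in the target language, one document per line (placeholder path).
raw = load_dataset("text", data_files={"train": "corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=4096)

train_ds = raw["train"].map(tokenize, batched=True, remove_columns=["text"])

args = TrainingArguments(
    output_dir="llama2-7b-secondary-pt",
    per_device_train_batch_size=4,   # per GPU; 8 GPUs -> global batch of 32
    gradient_accumulation_steps=8,
    learning_rate=2e-5,
    num_train_epochs=1,
    bf16=True,
    logging_steps=50,
    save_steps=1000,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    # Causal LM objective: mlm=False makes the collator build next-token labels.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

Launched with `torchrun --nproc_per_node=8 train.py` (the file name `train.py` is hypothetical), the Trainer runs data-parallel across the 8 GPUs; DeepSpeed or FSDP can be enabled through `TrainingArguments` if memory becomes a constraint.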

Dependencies (must be provided for code-related issues)

# Please copy-and-paste your dependencies here.

Execution logs or screenshots

# Please copy-and-paste your logs here.
iMountTai commented 9 months ago
staticpunch commented 9 months ago

Got it, thank you.