cckuailong / SuperAdapters

Finetune ALL LLMs with ALL Adapters on ALL Platforms!
Apache License 2.0

LLAMA3.2 Compatibility #12

Open nyeeldzn opened 1 month ago

nyeeldzn commented 1 month ago

Is there any plan for compatibility with the newly released LLAMA3.2? As a developer, could I help with the project?

cckuailong commented 1 month ago

Thank you for the offer. Llama3.2 is similar to Llama3 and Llama3.1, so it is supported in the newest code. You can try it by following these steps.

  1. Upgrade your dependencies, or just use the newest `requirements.txt`: accelerate==0.34.1 safetensors==0.4.3 tokenizers==0.20.0 transformers==4.45.1
  2. Finetune with Llama3.2
    python finetune.py --model_type llama3 --data "data/train/alpaca_tiny.json" --model_path "LLMs/llama3/llama3.2-3B-Instruct" --adapter "lora" --output_dir "output/llama3" --disable_wandb