Ramyyang opened 2 months ago
Hi! Thank you for your interest in our work. This repo is a modified version of the peft library. We have not yet provided an installation script, but you can simply download the repo into your code directory and import it in your Python code:
```python
from peft import PeftModel, TaskType, GivensConfig, get_peft_model
```

Then define your Givens OFT configuration:

```python
givens_config = GivensConfig(
    task_type=TaskType.CAUSAL_LM,
    inference_mode=False,
    strict_oft=True,   # or False
    no_scaling=True,
    target_modules=[...],  # list of target module names to apply Givens OFT to,
                           # e.g. for LLaMA: ["q_proj", "k_proj", "v_proj", "o_proj", ...]
)
```

Finally, wrap your transformers model so the Givens trainable modules are applied:

```python
model = get_peft_model(model, givens_config)
```
You can refer to the `./tuners/givens` subdirectory for the implementation of our method.
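For background (this is an illustrative sketch, not part of the repo's API): Givens OFT constrains weight updates to (quasi-)orthogonal transforms composed of Givens rotations, each acting on one pair of coordinates. A minimal pure-Python example of a single Givens rotation shows why such a transform preserves the norm of the vectors it acts on; the function name here is hypothetical, not from this codebase:

```python
import math

def apply_givens(v, i, j, theta):
    """Apply a Givens rotation by angle theta in the (i, j) coordinate
    plane to vector v, returning a new vector. A Givens rotation is
    orthogonal, so it preserves the Euclidean norm of v."""
    c, s = math.cos(theta), math.sin(theta)
    out = list(v)
    out[i] = c * v[i] - s * v[j]
    out[j] = s * v[i] + c * v[j]
    return out

v = [3.0, 4.0, 0.0]          # norm 5
w = apply_givens(v, 0, 1, math.pi / 6)
norm = lambda x: math.sqrt(sum(t * t for t in x))
print(abs(norm(w) - norm(v)) < 1e-9)  # norm is preserved
```

Composing many such rotations (one per coordinate pair) yields a full orthogonal matrix while training only one angle per rotation, which is the parameter-efficiency idea behind this family of methods.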
Thanks for sharing your work on quasi-Givens Orthogonal Fine Tuning! I'm excited to try it out but couldn't find instructions on how to use the code. Could you please provide some guidance on: