smkim0220 opened 4 months ago
LoRA, bottleneck, and prefix tuning are all implemented in finetune.py, but p-tuning seems to have been removed. P-tuning is implemented in LLM-Adapters/tree/main/peft/src/peft/tuners, so is there a reason it is missing from finetune.py?