Closed shahules786 closed 1 year ago
What
Added support to configure PEFT from the config and to save WTE (word token embedding) weights with the adapter files, enabling easy loading of OA LoRA weights.
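A minimal sketch of what saving the embeddings next to the adapter files might look like, assuming Hugging Face transformers and peft; the model name, LoRA hyperparameters, and the `extra_embeddings.pt` file name are illustrative assumptions, not the PR's actual implementation:

```python
import torch
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# Wrap a base model (any architecture, not just llama) with a LoRA adapter.
model = AutoModelForCausalLM.from_pretrained("EleutherAI/pythia-70m")
peft_config = LoraConfig(r=16, lora_alpha=32, lora_dropout=0.05, task_type="CAUSAL_LM")
model = get_peft_model(model, peft_config)

save_dir = "oa-lora-adapter"
model.save_pretrained(save_dir)  # writes adapter_config.json + adapter weights

# Save the word token embeddings next to the adapter files so that weights
# for any added special tokens can be restored when the adapter is loaded.
# The file name here is a placeholder, not the one used by the PR.
torch.save(model.get_input_embeddings().state_dict(), f"{save_dir}/extra_embeddings.pt")
```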
Why
Earlier, the PEFT modules were hardcoded for the llama model only. This was an issue when training other models with PEFT, such as RWModel, GPTNeoX, etc.
How
Introduces an extra parameter, `peft_config`, in config.yml (a config example follows below).
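A sketch of how such a `peft_config` section could be parsed into a `peft.LoraConfig`, assuming PyYAML; the key names below are assumptions mirroring LoraConfig fields, not the exact schema this PR introduces:

```python
import yaml
from peft import LoraConfig

# Illustrative only: a peft_config block as it might appear in config.yml,
# parsed and unpacked into a LoraConfig. Key names are assumptions.
raw = yaml.safe_load("""
peft_config:
  r: 16
  lora_alpha: 32
  lora_dropout: 0.05
  target_modules: [query_key_value]
""")
lora_config = LoraConfig(task_type="CAUSAL_LM", **raw["peft_config"])
```

Driving the LoRA setup from config rather than hardcoded values is what lets the same training code target RWModel, GPTNeoX, and other architectures.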
:x: pre-commit failed. Please run `pre-commit run --all-files` locally and commit the changes. Find more information in the repository's CONTRIBUTING.md.