v0.7.0: Orthogonal Fine-Tuning, Megatron support, better initialization, safetensors, and more
Highlights
Orthogonal Fine-Tuning (OFT): A new adapter that is similar to LoRA and shows a lot of promise for Stable Diffusion, especially with regard to controllability and compositionality. Give it a try! By @okotaku in huggingface/peft#1160
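A minimal sketch of applying OFT to a Stable Diffusion UNet; the checkpoint id, `r` value, and `target_modules` names below are illustrative and depend on the diffusers version you use:

```python
from diffusers import StableDiffusionPipeline
from peft import OFTConfig, get_peft_model

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")

# OFT learns orthogonal transformations of the targeted weights instead of
# the low-rank additive updates that LoRA uses.
oft_config = OFTConfig(
    r=8,                                      # number of OFT blocks per injected layer
    target_modules=["to_q", "to_k", "to_v"],  # attention projections in the UNet (illustrative)
    module_dropout=0.0,
)

pipe.unet = get_peft_model(pipe.unet, oft_config)
pipe.unet.print_trainable_parameters()
```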
Support for parallel linear LoRA layers using Megatron. This should lead to a speed-up when using LoRA with Megatron. By @zhangsheng377 in huggingface/peft#1092
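A rough sketch of what this looks like, assuming an existing `megatron.core` model (`megatron_model` and `transformer_config` are placeholders for your own Megatron objects, and the target module name is illustrative):

```python
from peft import LoraConfig, get_peft_model

# transformer_config is the Megatron TransformerConfig used to build the model;
# PEFT uses it to create tensor-parallel LoRA layers (ColumnParallelLinear /
# RowParallelLinear) instead of plain nn.Linear ones.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["linear_qkv"],        # fused attention projection in megatron.core models
    megatron_config=transformer_config,   # placeholder: your Megatron TransformerConfig
    megatron_core="megatron.core",
)

peft_model = get_peft_model(megatron_model, lora_config)  # placeholder Megatron model
peft_model.print_trainable_parameters()
```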
It is now possible to choose which adapters are merged when calling `merge` (#1132)
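A short sketch, assuming a model with two LoRA adapters (the adapter names and the gpt2 base model are placeholders) and the `adapter_names` argument added in #1132:

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base_model = AutoModelForCausalLM.from_pretrained("gpt2")

model = get_peft_model(base_model, LoraConfig(r=8, target_modules=["c_attn"]), adapter_name="adapter_a")
model.add_adapter("adapter_b", LoraConfig(r=16, target_modules=["c_attn"]))

# Merge only "adapter_a" into the base weights when unloading;
# "adapter_b" is dropped without being merged.
merged_model = model.merge_and_unload(adapter_names=["adapter_a"])
```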
IA³ now supports adapter deletion, by @alexrs (#1153)
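For example (a minimal sketch; the adapter names and the gpt2 base model are placeholders):

```python
from transformers import AutoModelForCausalLM
from peft import IA3Config, get_peft_model

base_model = AutoModelForCausalLM.from_pretrained("gpt2")

model = get_peft_model(base_model, IA3Config(task_type="CAUSAL_LM"), adapter_name="default")
model.add_adapter("experiment", IA3Config(task_type="CAUSAL_LM"))

# IA³ adapters can now be removed again, just like LoRA adapters.
model.delete_adapter("experiment")
```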
A new initialization method for LoRA has been added, "gaussian" (#1189)
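It is selected through `init_lora_weights`; a minimal example:

```python
from peft import LoraConfig

# The default (init_lora_weights=True) initializes LoRA A with Kaiming-uniform
# and LoRA B with zeros; "gaussian" instead draws LoRA A from a normal
# distribution scaled by the rank, while B stays zero so training still starts
# from the unmodified base model.
config = LoraConfig(r=8, lora_alpha=16, init_lora_weights="gaussian")
```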
When new tokens are added to the embedding layer while training a PEFT model, the embedding layer is now saved by default alongside the adapter weights (#1147)
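A sketch of the workflow this targets (model name and token are placeholders; the `save_embedding_layers` argument mentioned in the comment is the mechanism described in #1147):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Add new tokens and resize the embedding matrix to match.
tokenizer.add_tokens(["<my-new-token>"])
model.resize_token_embeddings(len(tokenizer))

peft_model = get_peft_model(model, LoraConfig(task_type="CAUSAL_LM"))
# ... train ...

# save_pretrained now stores the embedding weights together with the adapter
# when new tokens were added (controlled by the save_embedding_layers argument).
peft_model.save_pretrained("my-adapter")
```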
It is now possible to mix certain adapter types, such as LoRA and LoKr, in the same model; see the docs (#1163)
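A minimal sketch of the mixed-adapter API (the gpt2 model, adapter names, and target module names are placeholders):

```python
from transformers import AutoModelForCausalLM
from peft import get_peft_model, LoraConfig, LoKrConfig

base_model = AutoModelForCausalLM.from_pretrained("gpt2")

lora_config = LoraConfig(r=8, target_modules=["c_attn"])
lokr_config = LoKrConfig(r=8, target_modules=["c_attn"])

# mixed=True returns a PeftMixedModel, which can stack different adapter types.
model = get_peft_model(base_model, lora_config, adapter_name="lora", mixed=True)
model.add_adapter("lokr", lokr_config)
model.set_adapter(["lora", "lokr"])  # activate both adapters for the forward pass
```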
We started an initiative to improve the documentation, some of which should already be reflected in the current docs. Still, help from the community is always welcome. Check out this issue to get going.
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
Bumps peft from 0.6.0 to 0.7.1.
Release notes
Sourced from peft's releases.
... (truncated)
Commits
- `67a0800` Release: 0.7.1 (#1257)
- `971dd6e` Fix: Multiple adapters with bnb layers (#1243)
- `ee6f6dc` FIX Issues with transformers 4.36 (#1252)
- `21c304f` FIX Truncate slack message to not exceed 3000 char (#1251)
- `e73967e` [docs] Quantization (#1236)
- `b08e6fa` TST: Add tests for 4bit LoftQ (#1208)
- `5c13ea3` FIX Use model argument consistently (#1198) (#1205)
- `00b8200` Revert "FIX Pin bitsandbytes to <0.41.3 temporarily (#1234)" (#1250)
- `504d3c8` [docs] PEFT integrations (#1224)
- `fc9f4b3` Bnb integration test tweaks (#1242)