adapter-hub/adapters
A Unified Library for Parameter-Efficient and Modular Transfer Learning
https://docs.adapterhub.ml
Apache License 2.0 · 2.54k stars · 339 forks
Issues (sorted by newest)
#741 · Adding a `ReFT` notebook to the tutorials section · julian-fong · opened 3 days ago · 0 comments
#740 · WIP: Generic Tests · TimoImhof · opened 2 weeks ago · 0 comments
#739 · No improvement in training loss using ReFT Methods · julian-fong · closed 4 days ago · 2 comments
#738 · Pluggable Model Integration Interface · calpt · opened 1 month ago · 0 comments
#737 · `T5ForConditionalGeneration`: After calling adapters.init() the data_collator input misses `attention_mask` · lenglaender · opened 1 month ago · 0 comments
#736 · Fix search on docs.adapterhub.ml · TheoWeih · closed 1 month ago · 0 comments
#734 · Upgrade Transformers to v4.44.x · calpt · closed 1 month ago · 0 comments
#732 · `ForwardContext` is `None` with gradient checkpointing enabled · calpt · opened 1 month ago · 0 comments
#731 · Mistral Adapters · dipankarsrirag · closed 1 month ago · 0 comments
#730 · Don't throw error if ForwardContext is not available · calpt · closed 1 month ago · 0 comments
#729 · Trainer evaluation loop can't be used with `predict_with_generate` · TimoImhof · opened 2 months ago · 0 comments
#727 · Upgrade Transformers to v4.43.x · calpt · closed 1 month ago · 0 comments
#726 · Support for AST Model · TheRealSal · opened 2 months ago · 0 comments
#724 · Remove loading adapters from 'ah' & source parameter · calpt · closed 2 months ago · 0 comments
#722 · Fix Llama sdpa/ flash attention + adapters · calpt · closed 2 months ago · 0 comments
#721 · Llama LoRA training not working with sdpa and flash attention · calpt · closed 2 months ago · 0 comments
#720 · Upgrade black version · calpt · closed 2 months ago · 0 comments
#719 · Upgrade Transformers to v4.42.x · calpt · closed 2 months ago · 0 comments
#718 · Error Loading Adapter Without PEFT Configuration in EncoderDecoderModel · leaBroe · closed 2 months ago · 2 comments
#717 · Adding a notebook for adapters `whisper` support · julian-fong · closed 1 month ago · 0 comments
#716 · Update training.md · Kotstantinovskiy · closed 2 months ago · 0 comments
#715 · [DOCS] Fixed typos inside notebook files and doc pages · julian-fong · closed 2 months ago · 0 comments
#714 · Remove deprecated functionality · calpt · closed 2 months ago · 0 comments
#713 · Looking forward to support for SAM · SHEN2BAIYI · closed 2 months ago · 1 comment
#712 · Upgrade Transformers to v4.41.x · calpt · closed 2 months ago · 0 comments
#711 · Add __init__.py to encoder-decoder · calpt · closed 3 months ago · 0 comments
#709 · PLbart support · FahadEbrahim · closed 2 months ago · 3 comments
#708 · Fix moving adapter head to device & Examples CI · calpt · closed 3 months ago · 0 comments
#707 · Using EncoderDecoderModel with Adapters / No module named 'adapters.models.encoder_decoder' · leaBroe · closed 3 months ago · 5 comments
#706 · Update Embedding Documentation · hSterz · closed 3 months ago · 0 comments
#705 · Add ReFT (LoReFT, NoReFT, DiReFT) · calpt · closed 3 months ago · 7 comments
#704 · Add download redirect for AH adapters to HF · calpt · closed 3 months ago · 1 comment
#700 · Move custom head dict out of config · hSterz · closed 3 months ago · 2 comments
#699 · Add `adapter_to()` method for moving & converting adapter weights · calpt · closed 4 months ago · 1 comment
#698 · Add support for Task Arithmetics · lenglaender · closed 2 months ago · 0 comments
#697 · Upgrade Transformers to v4.40.x · calpt · closed 3 months ago · 0 comments
#695 · Fix reading model info in `get_adapter_info()` for HF · calpt · closed 4 months ago · 0 comments
#694 · "`.to` is not supported for `4-bit` or `8-bit` bitsandbytes models" when i use load_best_model_at_end=True in QLoRa · mkgs210 · closed 4 months ago · 3 comments
#693 · Add support for Whisper · TimoImhof · closed 1 month ago · 0 comments
#692 · Support saving & loading via Safetensors · calpt · closed 4 months ago · 0 comments
#687 · QuestionAnsweringTrainer for adapter? · km5ar · closed 5 months ago · 1 comment
#686 · Upgrade to Transformers v4.39.x · calpt · closed 5 months ago · 0 comments
#685 · Use default head dropout prob if not provided by model · calpt · closed 5 months ago · 0 comments
#683 · UNIPELT, UNIPELT (AP) and UNIPELT (APL) · km5ar · closed 4 months ago · 0 comments
#682 · Fix Unipelt Lora default config · calpt · closed 5 months ago · 0 comments
#681 · Switch resolving order if source not specified in `load_adapter()` · calpt · closed 5 months ago · 0 comments
#680 · Custom Heads not working with adapters · san-deep-reddy · closed 3 months ago · 0 comments
#678 · Fix compatibility of adapters with HF Accelerate auto device-mapping · calpt · closed 5 months ago · 0 comments
#676 · Fix default cache path for adapters loaded from AH repo · calpt · closed 5 months ago · 0 comments
#675 · regression head? · km5ar · closed 5 months ago · 4 comments