chimezie opened 1 week ago
You can swap adapters like so:
model.load_weights("adapters1.safetensors"), strict=False)
# Swap to new adapters
model.load_weights("adapters2.safetensors"), strict=False)
For unloading weights.. it really depends on what you want to do. If you just want to delete the entire model, simply delete it:
del model
Though maybe you could say more about the use case for that..
Sweet. For unloading, what I had in mind was making the combination of model.load_weights(..) followed by model.unload_weights(..) an idempotent operation. That way I can run LoRA on a model for a bit (producing a new adapter), then, during loss calculations, run some evaluations comparing results from the original model vs. the adaptation without keeping a redundant copy of the original, and continue until the LoRA training is complete.
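In the meantime, here is roughly how I could approximate that today with only existing mlx.nn.Module methods (load_weights, parameters, update) and the mlx.utils tree helpers. This is just a sketch; the adapter path and the model variable are placeholders:

```python
import mlx.core as mx
from mlx.utils import tree_flatten, tree_unflatten

# Sketch only. `model` is assumed to be an mlx.nn.Module (e.g. an mlx-lm model
# whose target layers have already been converted to LoRA layers), and the
# adapter filename is a placeholder.
adapter_weights = mx.load("adapters1.safetensors")

# Snapshot just the parameters the adapter file will overwrite, so the
# original behaviour can be restored without keeping a full copy of the model.
snapshot = [
    (k, v)
    for k, v in tree_flatten(model.parameters())
    if k in adapter_weights
]

# Apply the adapter and evaluate with it.
model.load_weights("adapters1.safetensors", strict=False)
# ... run evaluations against the adapted model ...

# "Unload": restore the snapshotted parameters.
model.update(tree_unflatten(snapshot))
# ... run evaluations against the restored original model ...
```

A built-in unload_weights(..) would make this cleaner, since the caller wouldn't have to know which keys the adapter file touches.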
Currently, adapters can be loaded with:
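```python
# (the adapter filename here is just an example)
model.load_weights("adapters.safetensors", strict=False)
```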
However, there is no way to either unload the weights or to swap in a new one dynamically. Roughly the kind of API this is asking for (unload_weights is hypothetical; it does not exist today):
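```python
# Hypothetical; unload_weights does not exist today. The idea is to remove the
# previously applied adapter weights and restore the corresponding base parameters.
model.unload_weights("adapters1.safetensors")

# A different adapter could then be loaded cleanly in its place.
model.load_weights("adapters2.safetensors", strict=False)
```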
This is useful for several use cases, including DPO loss calculation and dynamically serving LoRA adapters.