So far, if a module was wrapped due to `modules_to_save`, we handled access to the `weight` and `bias` attributes via special properties (albeit incorrectly when adapters were disabled!). Accessing any other attribute, however, raised an error.

Instead of special properties, we now implement a generic `__getattr__` method that can deal with any attribute. The implementation is a bit involved because it has to take into account how `torch.nn.Module` itself handles `__getattr__`.
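To illustrate the difficulty being described (not the actual PEFT implementation): `torch.nn.Module` overrides `__getattr__` to resolve names from `_parameters`, `_buffers`, and `_modules`, so a wrapper's generic `__getattr__` must first defer to that lookup and only then fall back to the wrapped module, while guarding against recursion. A minimal sketch with an illustrative class name:

```python
import torch
from torch import nn


class AttributeForwardingWrapper(nn.Module):
    """Hypothetical sketch: forward arbitrary attribute access to the
    wrapped module (names here are illustrative, not PEFT's)."""

    def __init__(self, module_to_wrap: nn.Module):
        super().__init__()
        self.original_module = module_to_wrap

    def __getattr__(self, name: str):
        # nn.Module's own __getattr__ resolves _parameters, _buffers,
        # and _modules, so try that lookup first ...
        try:
            return super().__getattr__(name)
        except AttributeError:
            pass
        # ... then fall back to the wrapped module. Check __dict__ directly
        # to avoid infinite recursion before __init__ has run.
        if "_modules" not in self.__dict__ or "original_module" not in self._modules:
            raise AttributeError(name)
        return getattr(self._modules["original_module"], name)


linear = nn.Linear(4, 2)
wrapped = AttributeForwardingWrapper(linear)
# weight and bias, but also any other attribute, resolve through the wrapper:
assert wrapped.weight is linear.weight
assert wrapped.in_features == 4
```

Without the initial `super().__getattr__` call, attributes registered on the wrapper itself (its own submodules, parameters, or buffers) would be shadowed by the wrapped module; without the guard, attribute access during deserialization could recurse indefinitely.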
Resolves #2099