Adding a LoRAConv2D layer.

Also, slightly change how we do `layer.merge()` and `layer.unmerge()`. Adding and then subtracting the updates creates small numerical errors (you never get exactly back to the starting point), so instead we now save the previous weights and restore them. It does mean that one shouldn't change the layer while it is in the merged state, but since the merged state is meant only to transfer weights out of the layer, that should be fine.
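To illustrate the idea, here is a minimal NumPy sketch of the save-and-restore scheme for a linear LoRA layer. This is not the library's actual implementation; the class name, attribute names (`lora_a`, `lora_b`, `scale`), and shapes are placeholders, assuming the usual low-rank update `weight + scale * (B @ A)`.

```python
import numpy as np


class LoRALinearSketch:
    """Sketch: merge() saves the pre-merge weight, unmerge() restores it exactly."""

    def __init__(self, weight, lora_a, lora_b, scale=1.0):
        self.weight = weight              # base weight, shape (out, in)
        self.lora_a = lora_a              # low-rank factor A, shape (rank, in)
        self.lora_b = lora_b              # low-rank factor B, shape (out, rank)
        self.scale = scale
        self._weight_before_merge = None  # set while in the merged state

    def merge(self):
        if self._weight_before_merge is not None:
            return  # already merged
        # Keep an exact copy of the pre-merge weight so unmerge() can restore it
        # bit-for-bit instead of subtracting the update (which accumulates error).
        self._weight_before_merge = self.weight.copy()
        self.weight = self.weight + self.scale * (self.lora_b @ self.lora_a)

    def unmerge(self):
        if self._weight_before_merge is None:
            return  # nothing to restore
        # Restore the saved weights rather than subtracting the LoRA update.
        self.weight = self._weight_before_merge
        self._weight_before_merge = None


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    layer = LoRALinearSketch(
        weight=rng.standard_normal((4, 8)),
        lora_a=rng.standard_normal((2, 8)),
        lora_b=rng.standard_normal((4, 2)),
    )
    before = layer.weight.copy()
    layer.merge()
    layer.unmerge()
    assert np.array_equal(before, layer.weight)  # exact round-trip, no drift
```

Note that because `unmerge()` restores the saved copy, any modification made to the weights while merged is discarded, which is why the merged state should only be used to read weights out of the layer.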