zer0int opened 2 weeks ago
So IIUC, GeometricLinear replaces what would normally be an nn.Linear module, and your goal would be to apply LoRA to this layer. As you observed, this is unfortunately not possible: each layer type needs to be explicitly implemented in LoRA for it to be supported.
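For context, the per-layer-type support mentioned above generally follows the standard LoRA pattern: freeze the base layer and add a trainable low-rank update. A minimal illustrative sketch of that pattern for `nn.Linear` (this is my own generic reconstruction, not peft's actual implementation, and `LoRALinear`, `rank`, and `alpha` here are hypothetical names):

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wraps a frozen nn.Linear with trainable low-rank factors A and B,
    so the effective weight is W + (alpha / rank) * B @ A.
    Illustrative sketch of the general LoRA pattern only."""

    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        # Freeze the pretrained weights; only A and B are trained.
        for p in self.base.parameters():
            p.requires_grad = False
        self.lora_A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        # B starts at zero, so the wrapped layer initially equals the base layer.
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scaling = alpha / rank

    def forward(self, x):
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling
```

A GmP layer would need its own such wrapper, since its trainable parameters are not a single weight matrix.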
Feature request
Support for GmP (Geometric Parametrization), here as applied to the structure (naming) of the original OpenAI/CLIP model, i.e.:
Motivation
I have had excellent results with GmP and CLIP ViT-L/14 (full fine-tune) on COCO-40k with batch_size=36 (!!!), boosting ImageNet/ObjectNet accuracy from ~0.84 (original OpenAI/CLIP ViT-L/14) to, most recently, >0.90:
However, I am not sure whether this could even work without updating all weights during fine-tuning. I'd be delighted to know, so I can decide whether to pursue this project further - or whether I should rather "pursue" a cloud computing instance and just train the full model. ;-)
Your contribution
I have modified "laion/CLIP-ViT-bigG-14-laion2B-39B-b160k" with GmP, but got:
I have a working implementation of GmP for FULL model fine-tunes using the original OpenAI/CLIP. The entire code to reproduce my GmP modification + fine-tune / results is available here: https://github.com/zer0int/CLIP-fine-tune