moeputt opened this issue 3 months ago
I also encountered the same issue with `transformers==4.44.1`.
I suspect the problem arises because the newer version of `transformers` changed `CLIPAttention` to `CLIPSdpaAttention`, leaving LoRA unable to locate and modify this part.
There are two ways to resolve this:

1. Add `--lora_clip_target_modules="{'CLIPSdpaAttention'}"` to your training script.
2. Downgrade `transformers` to version 4.25.1.
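If you want to confirm which attention class your installed `transformers` actually registers before choosing between the two, here is a minimal sketch. The checkpoint name `openai/clip-vit-large-patch14` is an assumption on my part (the text encoder Stable Diffusion v1.x uses); substitute whatever model your script loads:

```python
# Sketch: list the attention class names inside the CLIP text encoder for the
# transformers version you have installed, so you know which name LoRA must target.
from transformers import CLIPTextModel

# Assumption: the SD v1.x text encoder; swap in your own checkpoint if it differs.
text_encoder = CLIPTextModel.from_pretrained("openai/clip-vit-large-patch14")

# Collect the class names of every module whose class name mentions "Attention".
attn_classes = {
    module.__class__.__name__
    for module in text_encoder.modules()
    if "Attention" in module.__class__.__name__
}
print(attn_classes)
# On ~4.25 this should print {'CLIPAttention'}; on 4.44.1 you should see
# {'CLIPSdpaAttention'} instead, which is why the default target no longer matches.
```

Whatever name this prints is presumably what `--lora_clip_target_modules` needs to contain.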
Hello, when I use the "default" LoRA-pti fine-tuning script from the repo home page, I keep getting the following error:
Does anybody know what could be causing this?
For reference, here is the training script I'm using:
I've also tried running this, plus reinstalling, on two completely separate machines, each with 24 GB of GPU RAM. Thanks!