GoombaProgrammer opened this issue 3 days ago
Ok, removing ["state_dict"] works (currently testing if training works).
I now get a different error. It might be because I changed the dataset a little.
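For reference, a minimal sketch of the kind of unwrapping that removing `["state_dict"]` amounts to: some checkpoints nest the weights under a `"state_dict"` key while others store them flat, so indexing unconditionally fails on the flat ones. The helper name below is hypothetical, and plain dicts stand in for a real torch checkpoint:

```python
def extract_state_dict(ckpt: dict) -> dict:
    # Hypothetical helper: return the nested "state_dict" if present,
    # otherwise assume the checkpoint dict itself holds the weights.
    return ckpt.get("state_dict", ckpt)

# Nested layout (e.g. a training checkpoint with optimizer state alongside):
nested = {"state_dict": {"lora_A": 1, "lora_B": 2}, "step": 100}
# Flat layout (weights only):
flat = {"lora_A": 1, "lora_B": 2}

print(extract_state_dict(nested))  # {'lora_A': 1, 'lora_B': 2}
print(extract_state_dict(flat))    # {'lora_A': 1, 'lora_B': 2}
```

With a guard like this, the same loading path handles both checkpoint layouts instead of hard-coding the `["state_dict"]` access.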
I can investigate the issue in more depth much later, but for now what is the new error you are getting?
```
norm_scale = self.dora_mag.weight.view(-1) / (torch.linalg.norm(new_weight_v, dim=1)).detach()
RuntimeError: The size of tensor a (3072) must match the size of tensor b (1536) at non-singleton dimension 0
```
That is the new error
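To illustrate what the error is saying: the flattened DoRA magnitude vector has 3072 entries, while the per-row norm of `new_weight_v` has only 1536, so the elementwise division cannot broadcast. The concrete shapes below are assumptions inferred from the error message, and NumPy stands in for torch:

```python
import numpy as np

hidden = 1536
# Assumed shapes, reconstructed from the error message:
# the magnitude vector flattens to 2 * hidden = 3072 entries, while
# new_weight_v has only hidden = 1536 rows.
dora_mag = np.ones(2 * hidden)        # shape (3072,), stands in for dora_mag.weight.view(-1)
new_weight_v = np.ones((hidden, 64))  # shape (1536, 64)

row_norm = np.linalg.norm(new_weight_v, axis=1)  # shape (1536,)
print(dora_mag.shape[0], row_norm.shape[0])      # 3072 1536 -> division would fail to broadcast
```

A mismatch like this usually means the magnitude parameter was built (or restored) for a layer of a different width than the weight it is being divided against, e.g. a checkpoint saved from a model with a different hidden size.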
Nope, even without changing the dataset between two runs, it still gives that error.
I trained a LoRA, stopped training, and I want to continue training from the same checkpoint. But using --lora-ckpt-path just errors with this traceback: