I am looking for a way to reset the optimizer state for ReLoRA in native PyTorch. I am using something like this to reset the state for the LoRA parameters:

    for name, param in model.named_parameters():
        if 'lora' in name:
            del optimizer.state[param]

but I can't do this with DeepSpeed's ZeRO optimizer. Is there any way to do this?
I believe you could; the ZeRO optimizer keeps the same param groups as the torch optimizer. Look in deepspeed/runtime/zero for the attributes you want to reset.
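For a concrete starting point, here is a rough sketch of what that reset could look like under ZeRO stage 1/2. It assumes the LoRA parameters were placed in their own optimizer param group(s) when the optimizer was built, that model_engine.optimizer is the DeepSpeedZeroOptimizer wrapper with the underlying torch optimizer at .optimizer, and that the flat fp32 partition for each group is stored in single_partition_of_fp32_groups (attribute name taken from reading deepspeed/runtime/zero/stage_1_and_2.py; it may differ between DeepSpeed versions, so check against your install):

    import torch

    def reset_lora_optimizer_state(model_engine, lora_group_ids):
        # Sketch only: the attribute names below come from reading
        # deepspeed/runtime/zero/stage_1_and_2.py and may differ by version.
        zero_opt = model_engine.optimizer   # DeepSpeedZeroOptimizer wrapper
        base_opt = zero_opt.optimizer       # the wrapped torch optimizer (e.g. Adam)

        for gid in lora_group_ids:
            # Under ZeRO 1/2 each param group is flattened into a single fp32
            # partition, and the wrapped optimizer's state is keyed by that
            # flat tensor rather than by the individual model parameters.
            flat_fp32 = zero_opt.single_partition_of_fp32_groups[gid]
            state = base_opt.state.get(flat_fp32, {})
            for key, value in state.items():
                if torch.is_tensor(value):
                    value.zero_()           # clear exp_avg / exp_avg_sq in place
                elif key == "step":
                    state[key] = 0          # restart bias correction

Because the ZeRO optimizer state is keyed by the flattened partitions, deleting entries per model parameter (as in the native PyTorch snippet above) doesn't map cleanly; keeping the LoRA weights in their own param group lets you reset whole groups instead. Run it on every rank, since each rank only holds its own slice of the partition. For ZeRO stage 3 the flat fp32 tensors live under a different attribute (something like fp32_partitioned_groups_flat), so the same idea applies but the names change.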