microsoft / DeepSpeed

DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
https://www.deepspeed.ai/
Apache License 2.0

[REQUEST] How to access to the gradients (to manipulate) while the model is training? #3310

Open BilgehanSel opened 1 year ago

BilgehanSel commented 1 year ago

Does DeepSpeed offer an API to access the gradients during training for any of the stages (1, 2, 3)? When I try to access the gradients, I only get None types. I'm mostly interested in stage 3. If there is no such API, is there any chance to delve into the DeepSpeed code to access them somehow? I would also be glad if someone just pointed me in the right direction in the source code. I'm not that concerned about slowdowns as long as I have access to the gradients. I also want to be able to manipulate/change the gradients before the optimizer step.

tjruwase commented 1 year ago

@BilgehanSel, please see https://deepspeed.readthedocs.io/en/latest/zero3.html#debugging

BilgehanSel commented 1 year ago

Thank you for the response. I can confirm that I am able to access the gradients with the deepspeed.utils.safe_get_full_grad() function. But, as also stated in my original question, how can I assign new values to the gradients?
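For context, the reason `.grad` is None under ZeRO-3 is that gradients are partitioned across ranks, which is why the `safe_get_full_grad` helper is needed to gather a full copy. Outside of ZeRO partitioning, the generic PyTorch pattern of reading and overwriting gradients between `backward()` and `optimizer.step()` looks like the sketch below; this is a plain-PyTorch illustration of the pattern being asked about, not DeepSpeed's internal mechanism:

```python
import torch

# A toy model; under plain PyTorch (no ZeRO partitioning),
# gradients live directly on each parameter's .grad attribute.
model = torch.nn.Linear(4, 2, bias=False)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(8, 4)
loss = model(x).sum()
loss.backward()

# Read the gradient (analogous to safe_get_full_grad under ZeRO).
grad_before = model.weight.grad.clone()

# Manipulate it in place before the optimizer step,
# e.g. scale it down by half.
with torch.no_grad():
    model.weight.grad.mul_(0.5)

optimizer.step()
optimizer.zero_grad()
```

Note that under ZeRO-3 an in-place write like this is not enough, because each rank only owns a shard of the gradient; a dedicated setter API on the DeepSpeed side is what the later comments in this thread are asking for.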

Luodian commented 1 year ago

I was also wondering how to set the gradients. Thank you in advance if someone could provide a solution.

This is pretty helpful when partially updating token embeddings: in that case, only a few indices are updated, while the other rows' gradients are set to zero before optimizer.step().

pangjh3 commented 10 months ago

Any solutions?

tjruwase commented 10 months ago

@Luodian, @pangjh3, apologies for the delay. We will add this feature soon to complement the others.

song-wx commented 8 months ago

> @Luodian, @pangjh3, apologies for the delay. We will add this feature soon to complement the others.

Hi there, have you implemented this feature? I also need this feature in my case.