Optimization-AI / LibAUC

LibAUC: A Deep Learning Library for X-Risk Optimization
https://libauc.org/
MIT License

Updating a and b in the PESG optimizer #24

Closed: rohan1561 closed this issue 1 year ago

rohan1561 commented 2 years ago

Hello, I see that you update the model's parameters using the usual regime (momentum, weight decay, etc.) and that you update alpha according to the update equations. I wanted to know why you don't explicitly update the primal variables 'a' and 'b' in the PESG code. Is that happening internally in the autograd engine? Thanks!

yzhuoning commented 2 years ago

Hello, a and b are integrated into the model's parameters (since they follow gradient descent updates) after you initialize the PESG optimizer, which means a and b are updated whenever you update the model's parameters. However, alpha is slightly different: its update follows gradient ascent, and there is no v_ref for alpha in the current optimization algorithm. Thus, alpha is not included in the model's parameters, and it is much easier to update alpha directly via the update equations.
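For anyone who lands here later, below is a minimal PyTorch sketch of the pattern described above. It is not LibAUC's actual implementation: the toy min-max loss, the variable names, the learning rate, and the non-negativity clamp on alpha are all illustrative assumptions. It shows the two-track update: the primal variables a and b are ordinary learnable tensors appended to the same parameter list as the model weights, so a plain descent optimizer updates them automatically, while the dual variable alpha stays outside the optimizer and is stepped by hand with gradient ascent.

```python
import torch

# Toy model and the primal variables of an AUC-style min-max objective.
# (Names a, b, alpha follow the discussion above; everything else is assumed.)
model = torch.nn.Linear(10, 1)
a = torch.zeros(1, requires_grad=True)   # primal: follows gradient DESCENT
b = torch.zeros(1, requires_grad=True)   # primal: follows gradient DESCENT
alpha = torch.ones(1)                    # dual: updated manually by ASCENT

lr = 0.1
# a and b ride along with the model weights inside one descent optimizer,
# so optimizer.step() updates them "for free", as the reply above explains.
optimizer = torch.optim.SGD(list(model.parameters()) + [a, b], lr=lr)

def min_max_loss(scores, targets, a, b, alpha, margin=1.0):
    """Toy stand-in for an AUC-margin-style min-max objective (illustrative only)."""
    pos, neg = scores[targets == 1], scores[targets == 0]
    return ((pos - a) ** 2).mean() + ((neg - b) ** 2).mean() \
        + 2 * alpha * (margin + neg.mean() - pos.mean()) - alpha ** 2

x = torch.randn(8, 10)
y = torch.tensor([1, 0, 1, 0, 1, 0, 1, 0])

loss = min_max_loss(model(x).squeeze(), y, a, b, alpha)
optimizer.zero_grad()
loss.backward()
optimizer.step()   # one descent step updates the model weights AND a, b together

# Ascent step on alpha, applied directly via its closed-form gradient for the
# toy loss above: d(loss)/d(alpha) = 2*(margin + E[neg] - E[pos]) - 2*alpha.
with torch.no_grad():
    scores = model(x).squeeze()
    grad_alpha = 2 * (1.0 + scores[y == 0].mean() - scores[y == 1].mean()) - 2 * alpha
    alpha.add_(lr * grad_alpha)   # ascent: move ALONG the gradient
    alpha.clamp_(min=0.0)         # keep the dual variable non-negative (assumed constraint)
```

The design point from the reply above is that any variable following plain gradient descent can simply be appended to the optimizer's parameter list, whereas the ascent variable alpha, which also has no v_ref in the algorithm, is simpler to step directly from its update equation.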