google-deepmind / optax

Optax is a gradient processing and optimization library for JAX.
https://optax.readthedocs.io
Apache License 2.0

Add SPSA optimization method #357

ankit27kh commented 2 years ago

The Simultaneous Perturbation Stochastic Approximation (SPSA) method is a stochastic optimisation method whose per-iteration cost does not grow with the number of parameters.

If the number of parameters being optimised is p, the finite-difference method takes 2p measurements of the objective function at each iteration (to form one gradient approximation), while SPSA takes only two.

It is also naturally suited to noisy measurements, so it will be useful when simulating noisy systems.
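As a rough sketch of the estimator described above (not a proposed optax API; `spsa_grad`, the step sizes, and the update loop are illustrative choices): every coordinate is perturbed simultaneously by a random ±1 direction, so a single scalar difference of two objective evaluations yields an unbiased gradient estimate in any dimension.

```python
import jax
import jax.numpy as jnp


def spsa_grad(f, params, key, c=0.1):
    """One SPSA gradient estimate using exactly two evaluations of f."""
    # Rademacher (+/-1) perturbation applied to all coordinates at once.
    delta = jax.random.rademacher(key, shape=params.shape).astype(params.dtype)
    # Two measurements of the objective, regardless of the dimension of params.
    f_plus = f(params + c * delta)
    f_minus = f(params - c * delta)
    # Simultaneous-perturbation estimate: the same scalar difference for every
    # coordinate, divided elementwise by that coordinate's perturbation.
    return (f_plus - f_minus) / (2.0 * c * delta)


# Toy usage: plain SGD-style updates on a quadratic objective.
f = lambda x: jnp.sum(x ** 2)
params = jnp.array([1.0, -2.0, 3.0])
key = jax.random.PRNGKey(0)
for _ in range(200):
    key, sub = jax.random.split(key)
    params = params - 0.1 * spsa_grad(f, params, sub)
```

In a full implementation the perturbation size c and the step size would typically decay over iterations (as in Spall's original schedules); fixed values are used here only to keep the sketch short.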

The theory behind SPSA, and pointers to existing implementations, are available at: https://www.jhuapl.edu/SPSA/

mtthss commented 2 years ago

Sounds like a nice contribution, do you want to take a stab at it?

ankit27kh commented 1 year ago

@mtthss, if no one is working on it, I would like to try. Could you give me some general pointers before I start (e.g. which files to update and what to watch out for)? I haven't contributed to optax before.

lockwo commented 1 year ago

Are there any updates on this? I have previously worked on SPSA in TF (https://github.com/tensorflow/quantum/pull/653) and would be interested in working on this, but I don't want to do redundant labor.

ankit27kh commented 7 months ago

Hi @lockwo, are you still interested? If you can implement SPSA, it'll be of great help!

fabianp commented 7 months ago

@ankit27kh: since there hasn't been any activity on this in a year, I think it's safe for you to take over.

If you end up contributing this, please do so in the contrib/ directory. Thanks!