qthequartermasterman / torch_pso

Particle Swarm Optimization implemented using PyTorch Optimizer API
MIT License

Unit tests to ensure that calculating gradients does not affect Particle Swarm Optimization #18

Open qthequartermasterman opened 2 years ago

qthequartermasterman commented 2 years ago

Is your feature request related to a problem? Please describe.
PSO algorithms are generally gradient-free, so performing backpropagation and zeroing gradients should have no effect on PSO steps. We need unit tests to verify this behavior.

Describe the solution you'd like
Such a unit test should set the seed for torch's random number generator, clone the parameters to be trained, and run the optimizer twice: once without any gradient computation, and once after calling the backward method on the loss tensor fed to the optimizer. The resulting parameters should be identical in both runs.
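A minimal sketch of what such a test might look like, assuming the ParticleSwarmOptimizer class and its closure-based step() shown in the torch_pso README; the helper run_pso, the particle count, the toy network, and the number of steps are all arbitrary choices for illustration, not part of the existing test suite:

```python
import copy

import torch
from torch_pso import ParticleSwarmOptimizer  # class name taken from the torch_pso README


def run_pso(module: torch.nn.Module, do_backward: bool, seed: int = 0) -> torch.nn.Module:
    """Run a few PSO steps on `module`, optionally calling backward() on the loss."""
    torch.manual_seed(seed)  # identical seeds -> identical particle initialization
    optimizer = ParticleSwarmOptimizer(module.parameters(), num_particles=10)
    target = torch.zeros(2)

    def closure():
        optimizer.zero_grad()
        loss = (module(torch.ones(2)) - target).pow(2).sum()
        if do_backward:
            loss.backward()  # gradients should be irrelevant to the PSO update
        return loss

    for _ in range(5):
        optimizer.step(closure)
    return module


def test_backward_does_not_change_pso_trajectory():
    net_no_grad = torch.nn.Linear(2, 2)
    net_with_grad = copy.deepcopy(net_no_grad)  # identical initial parameters

    run_pso(net_no_grad, do_backward=False)
    run_pso(net_with_grad, do_backward=True)

    # If PSO is truly gradient-free, both runs must end at identical parameters.
    for p1, p2 in zip(net_no_grad.parameters(), net_with_grad.parameters()):
        assert torch.allclose(p1, p2)
```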

Additional context
Related to #1: Most gradient-based optimizers decide which parameters to update by checking whether a gradient has been computed at the time the optimizer's step function is called. That behavior (I believe) directly contradicts the gradient-free behavior described here. An alternative would be to simply check whether the parameter requires grad. More research is needed. A small sketch contrasting the two checks follows.
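The snippet below is illustrative only (not torch_pso code): it contrasts the "gradient has been computed" check used by gradient-based optimizers with the "parameter requires grad" check suggested above.

```python
import torch

params = [
    torch.nn.Parameter(torch.zeros(3)),                       # trainable, but no backward() yet
    torch.nn.Parameter(torch.zeros(3), requires_grad=False),  # explicitly frozen by the user
]

# Check used by most gradient-based optimizers (e.g. torch.optim.SGD):
# a parameter is skipped whenever no gradient has been accumulated.
skipped_by_grad_check = [p.grad is None for p in params]
print(skipped_by_grad_check)      # [True, True] -- both skipped before any backward()

# Alternative check for a gradient-free optimizer:
# skip only parameters the user explicitly marked as frozen.
skipped_by_requires_grad = [not p.requires_grad for p in params]
print(skipped_by_requires_grad)   # [False, True] -- only the frozen parameter is skipped
```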

Related to #17: that issue prompted the need for this unit test.