ShkalikovOleh / OptAlg

Reimplementation of optimization algorithms.
MIT License

Move from Autograd to Jax #25

Closed by ShkalikovOleh 4 years ago

ShkalikovOleh commented 4 years ago
  1. Rewrote the gradient methods to use JAX instead of Autograd (a minimal sketch is shown after this list).
  2. Added a unified history structure.
  3. The starting point now has shape `(n_var,)` instead of `(n_var, 1)`. For example: `x_0 = np.array([x_1, x_2, ..., x_n])`
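
A minimal sketch of what items 1 and 3 look like together after the migration. The objective and variable names here are illustrative, not taken from the repo: `jax.grad` stands in for `autograd.grad`, and the starting point is a plain 1-D array of shape `(n_var,)`:

```python
import jax
import jax.numpy as jnp

def objective(x):
    # Illustrative quadratic objective; the real objectives live in OptAlg.
    return jnp.sum(x ** 2)

# jax.grad builds a function that returns the gradient of a scalar-valued
# function, much like autograd.grad did before the migration.
grad_fn = jax.grad(objective)

# Starting point with shape (n_var,), not (n_var, 1).
x_0 = jnp.array([1.0, -2.0, 3.0])
print(grad_fn(x_0))  # -> [ 2. -4.  6.]
```

One practical difference to keep in mind: JAX arrays default to float32 (unlike Autograd's NumPy-backed float64), so gradient values may differ slightly in precision after the switch.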