ShkalikovOleh/OptAlg: Reimplementation of optimization algorithms.
Move from Autograd to Jax #25 (closed 4 years ago)

ShkalikovOleh commented 4 years ago:
- Rewrote gradient methods using jax instead of autograd (a minimal sketch follows this list).
- Added a unified story structure.
- The starting point now has shape (n_var,) instead of (n_var, 1). For example: x_0 = np.array([x_1, x_2, ..., x_n]).
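To make the API and shape changes concrete, here is a minimal sketch (the objective f and the values in x_0 are hypothetical illustrations, not OptAlg's actual code): jax.grad is called the same way autograd.grad was, and the starting point is passed as a flat 1-D array.

```python
# Minimal sketch, not OptAlg's actual API: the objective f and the values
# below are assumptions for illustration. jax.grad replaces autograd.grad
# with the same calling convention, and x_0 is a flat array of shape (n_var,).
import jax
import jax.numpy as jnp

def f(x):
    # Hypothetical objective: a simple quadratic, f(x) = sum(x_i^2)
    return jnp.sum(x ** 2)

grad_f = jax.grad(f)  # previously: autograd.grad(f)

x_0 = jnp.array([1.0, -2.0, 3.0])  # shape (3,), i.e. (n_var,), not (n_var, 1)
print(grad_f(x_0))  # -> [ 2. -4.  6.]
```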