ctongfei / nexus

Experimental tensor-typed deep learning
https://tongfei.me/nexus/
MIT License

Customizable gradient computation #34

Open ctongfei opened 5 years ago

ctongfei commented 5 years ago

For certain 2nd-order optimization algorithms, e.g. (Martens & Grosse, 2015, JMLR), the backward computation of each operator should be customizable. The idea is to generalize the backward signature to:

  // dy: upstream gradient in container G; returns the gradient w.r.t. x, also in G
  def backward[G[_]: Algebra, X, Y](dy: G[Y], y: Y, x: X): G[X]

where G[_] encapsulates the backward computation. The trivial case G = Id recovers the ordinary first-order gradient.
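
For concreteness, here is a minimal sketch of what such a type class and an op generalized over it might look like. All names here (`Algebra`, `Id`, `Op1`, `Scale`) are hypothetical illustrations, not existing nexus APIs:

```scala
// Hypothetical sketch only, not nexus API.
object CustomBackwardSketch {

  // What a backward pass needs from its gradient container G[_].
  trait Algebra[G[_]] {
    def map[A, B](ga: G[A])(f: A => B): G[B]
  }

  // The trivial container: a bare value. G = Id recovers ordinary backprop.
  type Id[A] = A
  implicit object IdAlgebra extends Algebra[Id] {
    def map[A, B](a: A)(f: A => B): B = f(a)
  }

  // A differentiable unary op whose backward is generalized over G.
  trait Op1[X, Y] {
    def forward(x: X): Y
    def backward[G[_]: Algebra](dy: G[Y], y: Y, x: X): G[X]
  }

  // Example: scaling by a constant c (scalars instead of tensors for brevity).
  case class Scale(c: Float) extends Op1[Float, Float] {
    def forward(x: Float): Float = c * x
    // d(c*x)/dx = c, so the upstream gradient is mapped through (_ * c),
    // whatever container G it lives in.
    def backward[G[_]: Algebra](dy: G[Float], y: Float, x: Float): G[Float] =
      implicitly[Algebra[G]].map(dy)(_ * c)
  }
}
```

With G = Id this reduces to the ordinary backward pass; other instantiations could thread extra structure (e.g. curvature information) through the same code path.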

ctongfei commented 5 years ago

Instantiating G[_] = Batched[_] naturally leads to the Jacobian (#33).
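
A sketch of that instantiation, continuing the hypothetical names above (`Batched` here is an illustrative stand-in, not the actual nexus type): presumably, running backward with a batch of one-hot upstream gradients, one per output component, stacks the per-component gradients into the rows of the Jacobian.

```scala
// Continues the hypothetical CustomBackwardSketch above; not nexus API.
object JacobianSketch {
  import CustomBackwardSketch._

  // A container holding one backward value per "row".
  case class Batched[A](rows: Vector[A])

  implicit object BatchedAlgebra extends Algebra[Batched] {
    def map[A, B](ga: Batched[A])(f: A => B): Batched[B] =
      Batched(ga.rows.map(f))
  }

  // Seeding dy with one one-hot upstream gradient per output component makes
  // each row of the result one row of the Jacobian dy/dx. For the scalar
  // Scale example the Jacobian is 1x1:
  //   Scale(2.0f).backward[Batched](Batched(Vector(1.0f)), 3.0f, 1.5f)
  //   // => Batched(Vector(2.0f))
}
```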