Mikolaj / horde-ad

Higher-Order Reverse Derivatives Efficiently: an automatic differentiation library based on the paper "Provably correct, asymptotically efficient, higher-order reverse-mode automatic differentiation"
BSD 3-Clause "New" or "Revised" License

Investigate HaskTorch #55

Open Mikolaj opened 2 years ago

Mikolaj commented 2 years ago

I assumed @hasktorch wraps the extensive PyTorch API in some very smart Haskell type-level programming.

However, @arkadiuszbicz hypothesises that HaskTorch uses not PyTorch itself but the underlying C library Torch, which does not implement backpropagation, only more basic building blocks such as tensor multiplication. If so, we could try using Torch instead of hmatrix for the basic operations. It would also mean HaskTorch implements backpropagation itself, and, given that we do this in an original and probably efficient way, it would be great to compare both the method and the performance. I'm sure we could fruitfully share other kinds of experience as well.
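For context, here is a toy sketch of what "implementing backpropagation on top of basic building blocks" means. This is plain illustrative Haskell over scalars, not horde-ad's actual delta-expression method and not anything from Torch; all names are made up for the example:

```haskell
-- A tiny expression language standing in for the "basic building blocks".
data Expr = Var Int | Const Double | Add Expr Expr | Mul Expr Expr

-- Forward evaluation against an environment of variable values.
eval :: [Double] -> Expr -> Double
eval env (Var i)   = env !! i
eval _   (Const c) = c
eval env (Add a b) = eval env a + eval env b
eval env (Mul a b) = eval env a * eval env b

-- Backpropagation: push a cotangent down the expression, collecting
-- per-variable gradient contributions (a naive, exponential-time toy;
-- a real implementation shares subcomputations via a tape).
backprop :: [Double] -> Expr -> Double -> [(Int, Double)]
backprop _   (Var i)   ct = [(i, ct)]
backprop _   (Const _) _  = []
backprop env (Add a b) ct = backprop env a ct ++ backprop env b ct
backprop env (Mul a b) ct =
  backprop env a (ct * eval env b) ++ backprop env b (ct * eval env a)

-- Sum the contributions into a gradient over the first n variables.
grad :: Int -> [Double] -> Expr -> [Double]
grad n env e =
  let contribs = backprop env e 1
  in [ sum [d | (j, d) <- contribs, j == i] | i <- [0 .. n - 1] ]

-- f(x, y) = x*y + x; its gradient at (3, 4) is (y + 1, x) = (5, 3).
example :: Expr
example = Add (Mul (Var 0) (Var 1)) (Var 0)

main :: IO ()
main = print (grad 2 [3, 4] example)  -- prints [5.0,3.0]
```

The point of the comparison above is exactly the layer boundary: Torch (in this hypothesis) supplies only the `eval`-style primitives, and the differentiation layer, here `backprop`, is what HaskTorch or horde-ad would each build on top, in their own way.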

Mikolaj commented 2 years ago

Today we looked at some HaskTorch examples and they look cool: they come in an untyped flavour as well as a shape-typed one using KnownNat, etc. However, it seems they use autograd from PyTorch for handling the tape and backpropagation, which means they don't implement their own, so we can't compare notes there. Still, as long as our benchmarks are in good shape, let's compare performance numbers with theirs and get in touch to chat and share. The exchange is sure to be fruitful on both the ML-implementation and Haskell-magic (type-level programming in particular) fronts.
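To illustrate the shape-typed, KnownNat-indexed style mentioned above, here is a minimal self-contained sketch. The types and names are invented for the example (this is not HaskTorch's actual API), but the mechanism is the same: sizes live in the types, so shape mismatches are rejected by the compiler:

```haskell
{-# LANGUAGE DataKinds, KindSignatures, ScopedTypeVariables, TypeApplications #-}
import GHC.TypeLits (Nat, KnownNat, natVal)
import Data.Proxy (Proxy (..))

-- A length-indexed vector: the size n is a type-level natural.
newtype Vec (n :: Nat) a = Vec [a]

-- Smart constructor: checks at runtime that the list length matches
-- the type-level length, the usual boundary between untyped data and
-- shape-typed code.
vec :: forall n a. KnownNat n => [a] -> Maybe (Vec n a)
vec xs
  | length xs == fromIntegral (natVal (Proxy @n)) = Just (Vec xs)
  | otherwise = Nothing

-- Dot product: both arguments share the same n, so combining a Vec 3
-- with a Vec 4 is a compile-time type error, not a runtime crash.
dot :: Num a => Vec n a -> Vec n a -> a
dot (Vec xs) (Vec ys) = sum (zipWith (*) xs ys)

main :: IO ()
main = case (vec @3 [1, 2, 3], vec @3 [4, 5, 6]) of
  (Just u, Just v) -> print (dot u v :: Int)  -- prints 32
  _                -> putStrLn "length mismatch"
```

The appeal, in HaskTorch's shape-typed flavour as in this toy, is that tensor-shape bugs surface during type checking rather than mid-training.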