raskr / rust-autograd

Tensors and differentiable operations (like TensorFlow) in Rust
MIT License

Upgrade to ndarray 0.15 #61

Open · elidupree opened this issue 1 year ago

elidupree commented 1 year ago

I'm trying to use autograd in conjunction with another library (nshare) which only supports ndarray 0.15. I recognize that this incompatibility isn't autograd's fault, but it seems like the simplest solution would be for autograd to support the latest version of ndarray.
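The situation can be sketched with a manifest like the one below. The version numbers are illustrative of the conflict described here, not copied from either crate's actual `Cargo.toml`:

```toml
[dependencies]
# autograd (at the time of this issue) depends on ndarray 0.14 internally.
autograd = "1"
# nshare's conversion traits are written against ndarray 0.15.
nshare = "*"

# Cargo will build both ndarray versions side by side, but they are
# compiled as two separate crates: an ndarray 0.14 ArrayBase and an
# ndarray 0.15 ArrayBase are unrelated types, so arrays produced by
# autograd cannot be handed to nshare.
```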

I forked autograd and made a quick attempt to implement this myself; most of the changes look like straightforward renamings. The only real sticking point was the slicing code in array_ops, which is unfortunately a bit beyond my current understanding of ndarray.
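For reference, the slicing breakage is most likely ndarray 0.15's redesign of the slicing types: if I recall the release correctly, `SliceOrIndex` was renamed to `SliceInfoElem`, and `SliceInfo` gained separate input/output dimension parameters. A hedged before/after sketch (not code from autograd itself; `slice_view` is a hypothetical function name):

```rust
// ndarray 0.14-era signature (sketch):
// fn slice_view<'a>(
//     a: &'a Array2<f64>,
//     info: &SliceInfo<[SliceOrIndex; 2], Ix2>,
// ) -> ArrayView2<'a, f64>
//
// ndarray 0.15: SliceInfo takes the element storage plus two dimension
// parameters, Din (input) and Dout (output, after indexing drops axes):
// fn slice_view<'a>(
//     a: &'a Array2<f64>,
//     info: &SliceInfo<[SliceInfoElem; 2], Ix2, Ix2>,
// ) -> ArrayView2<'a, f64>

use ndarray::{array, s};

fn main() {
    let a = array![[1., 2., 3.], [4., 5., 6.]];
    // The s![] macro itself still works in 0.15, but the type it
    // produces changed, so code that stored or named SliceInfo
    // values (as array_ops apparently does) needs updating.
    let v = a.slice(s![.., 1..]);
    assert_eq!(v.shape(), &[2, 2]);
}
```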

raskr commented 1 year ago

The ndarray used internally is re-exported as `autograd::ndarray`. Will that resolve this issue?
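For anyone who lands here with a plain version mismatch in their own code (rather than one forced by a third crate like nshare), using the re-export looks roughly like this; a minimal sketch, assuming the crate is added as `autograd`:

```rust
// Use the ndarray that autograd itself was compiled against, instead of
// declaring a separate (possibly different-versioned) ndarray dependency
// in Cargo.toml.
use autograd::ndarray;

fn main() {
    // This array has exactly the ArrayBase type autograd expects.
    let x = ndarray::arr2(&[[1.0_f64, 2.0], [3.0, 4.0]]);
    println!("{}", x.sum());
}
```

This only works when every crate in the dependency graph agrees on the same ndarray, which is exactly what fails in this issue.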

elidupree commented 1 year ago

A good suggestion, but that's actually the first thing I tried! Since `autograd::ndarray` is ndarray 0.14, its arrays can't be passed to nshare's functions, which only accept ndarray 0.15 types. I also tried an older version of nshare that depended on ndarray 0.14, but it was too old to have the features I needed.