regexident opened this issue 7 years ago
That's certainly a good question!
I'm really happy to see more work on pure rust linear algebra algorithms. Rulinalg has been rather dormant for some time now, so it's good to see things are happening in the ecosystem.
As for making use of each other's effort, and assuming the continued co-existence of the two libraries, I'm sure we can learn a lot from each other. Beyond that, did you have anything particular in mind?
Both projects could benefit from each other, like GCC and LLVM do, but different projects have different objectives and different leadership. As for myself, I think optimized linear algebra is so huge and complex that a better, faster, stronger, and unified project would be an awesome standard reference for the Rust ecosystem.
> Beyond that, did you have anything particular in mind?
I basically stumbled upon the announcement and thought "wait a second, lots of these have open issues on rulinalg, maybe there is a chance for symbiosis here". :wink:
I have only had time to loosely follow what has been happening with nalgebra but I agree that it is good to see things moving in the ecosystem.
Unfortunately I just haven't found the time to pick up my own slack and get things moving with rulinalg again. I would be more than happy to see if there is a way we can work together towards some greater good. I am also a little unsure about exactly how this relationship would work - especially given the lack of activity on my end. But I'm very open to any ideas about how we could make a meaningful proposal.
FYI: implementing matrix decompositions (Cholesky, LQ, symmetric eigen) as differentiable operators: https://arxiv.org/abs/1710.08717
Abstract:
Development systems for deep learning, such as Theano, Torch, TensorFlow, or MXNet, are easy-to-use tools for creating complex neural network models. Since gradient computations are automatically baked in, and execution is mapped to high performance hardware, these models can be trained end-to-end on large amounts of data. However, it is currently not easy to implement many basic machine learning primitives in these systems (such as Gaussian processes, least squares estimation, principal components analysis, Kalman smoothing), mainly because they lack efficient support of linear algebra primitives as differentiable operators. We detail how a number of matrix decompositions (Cholesky, LQ, symmetric eigen) can be implemented as differentiable operators. We have implemented these primitives in MXNet, running on CPU and GPU in single and double precision. We sketch use cases of these new operators, learning Gaussian process and Bayesian linear regression models. Our implementation is based on BLAS/LAPACK APIs, for which highly tuned implementations are available on all major CPUs and GPUs.
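To make the "differentiable operators" idea concrete (this particular rule is not taken from the paper; it is the standard reverse-mode result for a linear solve from the matrix-calculus literature): if a node in the computation graph computes $x = A^{-1} b$, the adjoints it must propagate back are

$$
\bar{b} = A^{-\mathsf{T}} \bar{x}, \qquad \bar{A} = -\,\bar{b}\, x^{\mathsf{T}},
$$

where $\bar{(\cdot)}$ denotes the gradient of the scalar loss with respect to that quantity. The paper derives analogous (more involved) rules for the Cholesky, LQ, and symmetric eigen decompositions.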
With the latest release (v0.13) nalgebra now supports Rust-native implementations of the following matrix factorizations:
This made me wonder: instead of reinventing the wheel in rulinalg, maybe rulinalg and nalgebra should join forces or make use of each other's efforts for these operations?
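For anyone skimming this thread, here is a minimal sketch of what calling a few of those factorizations looks like on nalgebra's dynamically-sized `DMatrix` type. The method names (`cholesky`, `qr`, `symmetric_eigen`) are taken from nalgebra's current documentation and are my assumption for how the v0.13 API reads, so treat it as illustrative rather than definitive:

```rust
// Sketch against a recent nalgebra release; the exact v0.13 method names may differ.
use nalgebra::{DMatrix, DVector};

fn main() {
    // A small symmetric positive-definite matrix.
    let a = DMatrix::<f64>::from_row_slice(3, 3, &[
        4.0, 1.0, 0.0,
        1.0, 3.0, 1.0,
        0.0, 1.0, 2.0,
    ]);
    let b = DVector::<f64>::from_element(3, 1.0);

    // Cholesky only succeeds for symmetric positive-definite input, hence the Option.
    if let Some(chol) = a.clone().cholesky() {
        let x = chol.solve(&b);
        println!("Cholesky solve: {}", x);
    }

    // QR factorization: a = q * r.
    let qr = a.clone().qr();
    println!("R factor: {}", qr.r());

    // Symmetric eigendecomposition (consumes `a`).
    let eig = a.symmetric_eigen();
    println!("eigenvalues: {}", eig.eigenvalues);
}
```

In current nalgebra the decomposition methods take the matrix by value, hence the `clone()` calls above when the matrix is reused.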
cc @sebcrozet
Related issues (to varying degrees), including but not limited to: