sparsemat / sprs

sparse linear algebra library for rust

Arbitrary tensor products #287

Open experiment9123 opened 3 years ago

experiment9123 commented 3 years ago

Ideas/discussion: "sparsity-aware element feedback". The use case would be learning rules for AI using sparse matrices, e.g. a backprop step modifying a matrix of weights, or whatever else AI researchers may imagine (for backprop, `vec_a` would be the previous layer's activations and `vec_b` the error values). It would also further bridge the gap between a "sparse matrix lib" and graph processing.

This is trivial for dense matrices, fairly trivial for COO against dense vectors, trickier for any compressed sparse format against sparse vectors, and where it would get extremely useful (and difficult) is threaded implementations of these.
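For the "fairly trivial" COO-plus-dense-vectors case, a minimal sketch in plain Rust over a raw triplet list (not tied to sprs's types; the function name, the `(row, col, value)` tuple layout, and the `lr` learning-rate parameter are illustrative assumptions):

```rust
/// Apply an outer-product style update w[i][j] += lr * a[i] * b[j]
/// to every stored (occupied) entry of a COO matrix.
/// `triplets` holds (row, col, value) entries; `a` and `b` are dense.
fn coo_element_feedback(
    triplets: &mut [(usize, usize, f64)],
    a: &[f64],
    b: &[f64],
    lr: f64,
) {
    for (i, j, w) in triplets.iter_mut() {
        *w += lr * a[*i] * b[*j];
    }
}
```

Since COO stores explicit `(row, col)` pairs, the update touches each stored entry exactly once and never has to discover new nonzeros, which is why this case stays simple.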

One may also want to consider different permutations of what to do with empty elements: e.g. would we want to apply the update only where `a[i]`, `b[j]`, and `m[i][j]` are all occupied, or at any occupied `m[i][j]` where either `a[i]` or `b[j]` is occupied?

(attached image: IMG_3832)
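One way to keep that choice out of the traversal is to pass the occupancy policy in as a closure. A sketch over raw CSR arrays, with the sparse vectors held as `Option`-per-slot purely for cheap lookup (all names here are hypothetical, not sprs API):

```rust
/// Update stored entries of a CSR matrix from two sparse vectors `a` and `b`,
/// kept here as dense Vec<Option<f64>> so occupancy checks are O(1).
/// `update` receives the current weight plus the (possibly missing) a[i] and
/// b[j] and returns the new weight, so the "all occupied" vs "either occupied"
/// policy lives entirely in the caller's closure.
fn csr_element_feedback<F>(
    indptr: &[usize],
    indices: &[usize],
    data: &mut [f64],
    a: &[Option<f64>],
    b: &[Option<f64>],
    mut update: F,
) where
    F: FnMut(f64, Option<f64>, Option<f64>) -> f64,
{
    for i in 0..indptr.len() - 1 {
        for k in indptr[i]..indptr[i + 1] {
            let j = indices[k];
            data[k] = update(data[k], a[i], b[j]);
        }
    }
}
```

An "all occupied" rule would then be something like `|w, ai, bj| match (ai, bj) { (Some(x), Some(y)) => w + x * y, _ => w }`, while an "either occupied" rule would fire whenever at least one of the two is `Some`. Note this only ever visits entries already stored in `m`; policies that create new nonzeros would need a different (and more expensive) approach.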

mulimoen commented 3 years ago

The Hadamard product is rather simple, but requires left and right to have the same sparsity structure, in which case you can iterate over `vec.data()` of the lhs and rhs.
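A sketch of that same-pattern Hadamard case, assuming the usual sprs owned vector type and accessors (`CsVec::new`, `dim()`, `indices()`, `data()`); the pattern check is a plain assert for illustration:

```rust
use sprs::CsVec;

/// Elementwise (Hadamard) product of two sparse vectors that share the
/// exact same sparsity structure: just multiply the data slices pairwise.
fn hadamard_same_pattern(lhs: &CsVec<f64>, rhs: &CsVec<f64>) -> CsVec<f64> {
    assert_eq!(lhs.indices(), rhs.indices(), "sparsity patterns must match");
    let data: Vec<f64> = lhs
        .data()
        .iter()
        .zip(rhs.data())
        .map(|(x, y)| x * y)
        .collect();
    CsVec::new(lhs.dim(), lhs.indices().to_vec(), data)
}
```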

For the tensor product we have some code, as this is simply an N×1 by 1×M matrix product. You can have a look at the `smmp` part of the code to get an idea of how to parallelize this. The serial part of this requires figuring out the sparsity pattern of the output, but this section could be optimized in the case of vectors.
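For the vector case specifically, the output pattern is just the Cartesian product of the two index sets, so no separate symbolic pass is needed. A serial sketch using sprs's triplet builder (assuming `TriMat::new`, `add_triplet`, `to_csr`, and `CsVec::iter` behave as in the sprs docs; no smmp-style threading here):

```rust
use sprs::{CsMat, CsVec, TriMat};

/// Outer (tensor) product of two sparse vectors, viewed as an Nx1 by 1xM
/// matrix product: every nonzero of `a` pairs with every nonzero of `b`,
/// so the output sparsity pattern is known up front.
fn outer_product(a: &CsVec<f64>, b: &CsVec<f64>) -> CsMat<f64> {
    let mut triplets = TriMat::new((a.dim(), b.dim()));
    for (i, &x) in a.iter() {
        for (j, &y) in b.iter() {
            triplets.add_triplet(i, j, x * y);
        }
    }
    triplets.to_csr()
}
```

A threaded version could split the rows (the nonzeros of `a`) across workers, since each row of the result depends only on one `a[i]` and the whole of `b`.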