waylonflinn / weblas

GPU Powered BLAS for Browsers :gem:

Sparse matrices #35

Open goodmansasha opened 7 years ago

goodmansasha commented 7 years ago

Given the limited memory and bandwidth available in a browser, sparse matrices could speed up transmission and lower memory usage.

Some example use cases:

1. If a model inefficiently creates many zero weights (e.g. after a ReLU nonlinearity), then a sparse format could be useful. I speak hypothetically here, not from experience.
2. Representing graphs of nodes and edges, as in a network, would benefit from a formal sparse format, since adjacency matrices often contain mostly zeros. Analyzing reachability in a graph requires multiplying a matrix by itself.
3. In language models there can be very large one-hot vectors representing input data, which could be stored far more compactly. If a matrix multiply could handle a sparse input vector against a dense weight matrix, that would be cool (see the sketch after this list).
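To illustrate (3), here is a rough plain-JavaScript sketch (nothing weblas-specific, the function name is made up): a sparse vector stored as parallel index/value arrays multiplied against a dense row-major weight matrix. With a one-hot input this reduces to selecting a single row of the matrix.

```javascript
// y = x * W, where x is 1 x K (sparse) and W is K x N (dense, row-major).
// The sparse vector is stored as parallel index/value arrays.
function sparseVecDenseMatMul(xIndices, xValues, W, K, N) {
  const y = new Float32Array(N);
  for (let i = 0; i < xIndices.length; i++) {
    const row = xIndices[i]; // position of a nonzero entry in x
    const v = xValues[i];    // its value
    for (let j = 0; j < N; j++) {
      y[j] += v * W[row * N + j]; // accumulate v * (row of W)
    }
  }
  return y;
}

// One-hot example: selects a single row of W.
const W = new Float32Array([
  1, 2, 3,
  4, 5, 6,
  7, 8, 9,
  10, 11, 12
]); // K = 4, N = 3
const y = sparseVecDenseMatMul([2], [1.0], W, 4, 3); // -> [7, 8, 9]
```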

Examples of existing JavaScript implementations:

- http://mathjs.org/examples/sparse_matrices.js.html
- http://numericjs.com/wordpress/?p=26
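For context, the math.js sparse API (first link) looks roughly like this, going from their docs:

```javascript
// Rough sketch of the math.js sparse matrix API (see the first link above).
const math = require('mathjs');

const A = math.sparse([[1, 0, 0], [0, 0, 2], [0, 3, 0]]); // mostly zeros
const x = math.matrix([[1], [2], [3]]);                   // dense column vector
const y = math.multiply(A, x);                            // sparse * dense

console.log(y.toArray()); // [[1], [6], [6]]
```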

Existing codebases for reference:

- https://www.tensorflow.org/api_docs/python/sparse_ops/
- http://faculty.cse.tamu.edu/davis/suitesparse.html
- https://cran.r-project.org/web/packages/Matrix/Matrix.pdf

waylonflinn commented 7 years ago

I really want this too. I'm still trying to figure out how much needs to be in this library and how much should be implemented in a layer on top of it.
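For example, a layer on top might keep data in a compact sparse format for storage and transfer, and only densify right before handing it to the GPU path. A very rough sketch (helper name made up):

```javascript
// Keep the matrix in COO form (parallel row/col/value arrays) and expand it
// to a dense row-major Float32Array only at multiply time.
function cooToDense(rows, cols, values, M, N) {
  const dense = new Float32Array(M * N);
  for (let i = 0; i < values.length; i++) {
    dense[rows[i] * N + cols[i]] = values[i];
  }
  return dense;
}

// The densified operand could then go through the existing dense path,
// e.g. weblas.sgemm(M, N, K, alpha, denseA, B, beta, C) -- I'm going from
// memory on the argument order, so check it against the README.
```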

Thanks for adding a place to discuss it!