Changes to examples

The total least squares examples now run the classical algorithm in addition to the randomized algorithm. We compare runtime and solution quality for the two methods.
I've added three examples for low-rank approximation of sparse matrices:

1. QB-based SVD of randomly generated synthetic test matrices (rank-one plus noise).
2. QB-based SVD of any sparse matrix in Matrix Market format.
3. Low-rank QRCP of a sparse matrix in Matrix Market format.
Changes to sparse matrix functionality
I've added two kernel implementations for SPMM with row-major dense data. These were needed to address cache inefficiencies in the existing SPMM kernels that surfaced while benchmarking low-rank QRCP.
I've removed the requirement that `A.reserve((int64_t) 10)` be executable for a matrix `A` that's compatible with the `SpMatrix` concept. Apparently this requirement didn't work with sparse matrices marked as `const`.
I've also added two files of developer notes: `RandBLAS/DevNotes.md` and `RandBLAS/sparse_data/DevNotes.md`.