Computational Mathematics for Learning and Data Analysis project for the a.y. 2021/2022.
(P) is the linear least squares problem $$\displaystyle \min_{w} \lVert \hat{X}w-\hat{y} \rVert$$ where
$$\hat{X}= \begin{bmatrix} X^T \newline \lambda I \end{bmatrix}, \hat{y} = \begin{bmatrix} y \newline 0 \end{bmatrix},$$
with $X$ the (tall, thin) matrix from the ML-CUP dataset provided by prof. Micheli, and $y$ a random vector.
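For reference, the construction above can be sketched in a few lines of MATLAB. This is only an illustrative sketch (random stand-in data and an example value of $\lambda$); the matrices actually used in the project are built by `utilities/build_matrices.m`.

```matlab
% Minimal sketch of the construction of X_hat and y_hat described above.
% A random X stands in for the ML-CUP data matrix, and lambda is just an
% example value, not the one used in the experiments.
m = 1000; n = 10;                 % tall thin: many rows, few columns
X = randn(m, n);                  % stand-in for the ML-CUP matrix
lambda = 1e-2;                    % example regularization parameter
y = randn(n, 1);                  % random right-hand side
X_hat = [X'; lambda * eye(m)];    % (n + m) x m
y_hat = [y; zeros(m, 1)];         % (n + m) x 1
% (P) is then min_w norm(X_hat * w - y_hat), e.g. w = X_hat \ y_hat in MATLAB.
```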
ComputationalMathematics
├── 1_LBFGS                         # Limited-memory quasi-Newton method
│   ├── LBFGS.m                     # implementation of limited-memory BFGS
│   ├── run_lbfgs.m                 # choose the hyper-parameters and run L-BFGS
│   └── ...
├── 2_QR                            # Thin QR factorization with Householder reflectors
│   ├── check_accuracy_thinqr.m     # computes the accuracy of our implementation
│   ├── householder_vector.m        # builds the Householder reflectors (see the sketch below)
│   ├── thinqr.m                    # implementation of the thin QR factorization
│   ├── run_qr.m                    # choose the hyper-parameters and run thin QR
│   └── ...
├── 3_CG                            # Conjugate gradient method
│   ├── cg.m                        # non-optimized version of conjugate gradient
│   ├── cg_opt.m                    # optimized implementation of conjugate gradient
│   ├── run_cg.m                    # choose the hyper-parameters and run conjugate gradient
│   └── ...
├── 4_SMD                           # Standard momentum descent (heavy ball)
│   ├── smd.m                       # implementation of standard momentum descent
│   ├── run_smd.m                   # choose the hyper-parameters and run standard momentum descent
│   └── ...
├── datasets                        # Datasets used by the project
│   └── ML-CUP21-TR.csv
├── utilities                       # Methods for building the matrices, functions and gradients
│   ├── build_lls.m                 # builds the function and gradient of the LLS problem
│   ├── build_matrices.m            # builds the required matrices
│   ├── callback.m                  # computes the metrics
│   └── compare_scalability         # compares the scalability of the methods
└── README.md
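As a quick illustration of the building block behind `2_QR`, the sketch below shows one common way to compute a Householder vector that zeroes a column below its first entry. It is only a hedged example and not the repository's `householder_vector.m`.

```matlab
function [v, s] = householder_sketch(x)
% Illustrative Householder vector (sketch, not the project's householder_vector.m).
% Returns a unit vector v and a scalar s such that (I - 2*v*v') * x = s * e1,
% i.e. the reflector zeroes every entry of x except the first one.
s = -norm(x);
if x(1) < 0
    s = -s;              % pick the sign that avoids cancellation in x(1) - s
end
v = x;
v(1) = v(1) - s;
v = v / norm(v);         % normalized, so the reflector is simply I - 2*v*v'
end
```

Applying reflectors of this form column by column to $\hat{X}$ is what yields the thin QR factorization used to solve (P).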