peekxc opened this issue 1 year ago
Thanks for reaching out and sharing your interests. Originally, we had plans to include additional functions, including arbitrary matrix functions. In response to your request, I have added the following functions to version 0.20.0: `trexp`, `trlinfrac`, `eigencount`, and `density`. I hope you find them useful for your applications. If you run into any issues or have more feedback, please feel free to share.
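Roughly, usage looks like the following; the keyword arguments in this sketch are illustrative rather than authoritative, so please check the 0.20.0 documentation for the exact signatures and options:

```python
# Rough usage sketch -- keyword names below are illustrative assumptions,
# not the authoritative signatures; consult the imate 0.20.0 docs.
import numpy as np
import imate

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
A = X.T @ X                                   # symmetric positive semi-definite test matrix

print(imate.trexp(A))                         # estimate of tr(exp(A))
print(imate.trlinfrac(A))                     # trace of a linear-fractional matrix function
print(imate.eigencount(A, interval=[0, 10]))  # eigenvalue count in an interval (keyword assumed)
print(imate.density(A, mu=5.0))               # spectral density near a point (keyword assumed)
```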
Additionally, if you could point to some references or resources related to the $\ell_1$/rank minimization task and PDE kernels associated with the functions you mentioned, it would be a great help to better understand the context.
Wrote a response but forgot to hit send:
> In response to your request, I have added the following functions to version 0.20.0
Thank you, these have helped a bunch!
> Additionally, if you could point to some references or resources related to the $\ell_1$/rank minimization task and PDE kernels associated with the functions you mentioned, it would be a great help to better understand the context.
I've been working off a couple of papers, the main one being [1], which is a bit dense but quite general, as it encapsulates the constructions from other papers, like that of [2].
The gist is that integrating smoothed variations of the Dirac measure yields matrix functions with "nice" properties, including closed-form [sub]differentiability, certain monotonicity properties, and guarantees related to rank minimization.
For example, for a positive semi-definite $A \in \mathbb{R}^{n \times n}$, the spectral sum:
$$ \mathrm{tr}(f(\Lambda)) = \sum\limits_{i=1}^n \frac{\lambda_i}{\lambda_i + \epsilon}, \quad \epsilon > 0 $$
ends up corresponding to this trace, for $A = X^T X$:
$$ \mathrm{tr}(f(\Lambda)) = \mathrm{tr}((X^T X + \epsilon I)^{-1} X^T X)$$
Notice that part of that expression resembles Tikhonov $\epsilon$-regularization applied to the columns of $X$.
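A quick sanity check of that correspondence in plain numpy (illustrative only), which also shows the equivalent form $n - \epsilon \, \mathrm{tr}((X^\top X + \epsilon I)^{-1})$, needing nothing beyond a `traceinv`-style primitive:

```python
# Numerical check of the identity above (illustrative, plain numpy).
import numpy as np

rng = np.random.default_rng(0)
m, n, eps = 200, 50, 1e-1
X = rng.standard_normal((m, n))
A = X.T @ X                                          # PSD, n x n

# Left-hand side: spectral sum of f(lambda) = lambda / (lambda + eps)
lam = np.linalg.eigvalsh(A)
lhs = np.sum(lam / (lam + eps))

# Right-hand side: tr((X^T X + eps*I)^{-1} X^T X)
rhs = np.trace(np.linalg.solve(A + eps * np.eye(n), A))

# Equivalent reduction that only needs a traceinv-style primitive:
# tr(A (A + eps*I)^{-1}) = n - eps * tr((A + eps*I)^{-1})
red = n - eps * np.trace(np.linalg.inv(A + eps * np.eye(n)))

print(lhs, rhs, red)   # all three agree up to floating-point error
```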
It looks like one could perhaps express this via a composition of `AffineMatrixFunction`s, and then interpolate it similarly to what you propose in your Statistics and Computing paper.
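Alternatively, for a fixed $\epsilon$, I suspect the quantity is already reachable through the existing `traceinv` entry point via the reduction above, sidestepping the `AffineMatrixFunction` composition; a rough sketch (the `method` keyword is a guess on my part):

```python
# Sketch only: tr(A (A + eps*I)^{-1}) = n - eps * tr((A + eps*I)^{-1}),
# so the quantity reduces to a single traceinv call for a fixed eps.
import numpy as np
import imate

rng = np.random.default_rng(0)
m, n, eps = 500, 100, 1e-1
X = rng.standard_normal((m, n))
A = X.T @ X

ti = imate.traceinv(A + eps * np.eye(n), method='slq')   # stochastic estimate of tr((A + eps*I)^{-1})
print(n - eps * ti)                                       # approx tr((X^T X + eps*I)^{-1} X^T X)
```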
Another smoothing variation leads to $f(\lambda) = 1 - \mathrm{exp}(-\frac{\lambda}{\epsilon})$, which itself is related to the trace of the heat kernel of graph Laplacians. I'm aware of this from its use in diffusion modeling, and also in generating spectral signatures for 3D meshes, but honestly PDEs are not really my area.
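Concretely, for a graph Laplacian $L$ on $n$ vertices this gives $\mathrm{tr}(f(L)) = n - \mathrm{tr}(\exp(-L/\epsilon))$, i.e. $n$ minus the heat-kernel trace at $t = 1/\epsilon$; a small scipy sketch:

```python
# Small scipy sketch: tr(f(L)) with f(lambda) = 1 - exp(-lambda/eps) equals
# n minus the heat-kernel trace tr(exp(-t L)) at t = 1/eps.
import numpy as np
import scipy.sparse as sp
from scipy.sparse import csgraph

n, eps = 100, 0.5

# Adjacency and combinatorial Laplacian of a path graph on n vertices
adj = sp.diags([np.ones(n - 1), np.ones(n - 1)], offsets=[-1, 1])
L = csgraph.laplacian(adj).toarray()

lam = np.linalg.eigvalsh(L)
tr_f = np.sum(1.0 - np.exp(-lam / eps))   # tr(f(L)), the smoothed rank surrogate
heat = np.sum(np.exp(-lam / eps))         # tr(exp(-L/eps)), heat-kernel trace at t = 1/eps
print(tr_f, n - heat)                     # agree up to floating-point error
```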
For $A \in \mathbb{R}^{n \times n}$ admitting an eigenvalue decomposition $A = U \Lambda U^\top$ and a scalar function $f : \mathbb{R} \to \mathbb{R}$, according to the docs this package helps estimate the quantity:
$$ \mathrm{tr}(f(A)) = \mathrm{tr}(U f(\Lambda) U^\top)$$
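For concreteness, a dense reference computation of this quantity via the eigendecomposition (illustrative only, not the package's estimator):

```python
# Illustrative dense reference for tr(f(A)); the package's purpose is to estimate
# this without forming the full eigendecomposition.
import numpy as np

def trace_fn(A, f):
    """tr(f(A)) = sum_i f(lambda_i) for symmetric A and a scalar function f."""
    lam = np.linalg.eigvalsh(A)
    return np.sum(f(lam))

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
A = X.T @ X

print(trace_fn(A, np.log1p))                        # tr(log(I + A))
print(trace_fn(A, lambda lam: lam / (lam + 0.1)))   # epsilon-smoothed rank surrogate, eps = 0.1
```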
A couple of quantities I'm interested in computing:
Though some of these are mentioned, the only ones I see in the API docs are `logdet`, `trace`, `traceinv`, and `schatten`. Does this package support (1) and (2), and does it support arbitrary matrix functions like (3) [if not from Python, what about on the C++ side]?
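For reference, roughly what I can do with those documented entry points today (the keyword argument for `schatten` is a guess on my part):

```python
# Sketch of the documented entry points; the `p` keyword for schatten is an
# assumption -- see the API docs for the actual options.
import numpy as np
import imate

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
A = X.T @ X + 1e-2 * np.eye(50)     # small shift to keep A positive definite

print(imate.logdet(A))              # log(det(A))
print(imate.trace(A))               # tr(A)
print(imate.traceinv(A))            # tr(A^{-1})
print(imate.schatten(A, p=2))       # Schatten 2-norm (keyword assumed)
```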