Open svaiter opened 2 years ago
thanks @svaiter!
would you have some code to share to start adding some of these solvers?
No. I always use (*) either FB on the dual for denoising, or CP without preconditioning for inverse problems/regression (I believe both are implemented). But I will try to give a hand ASAP to implement some of them.
(*) with no good reason other than the simplicity-to-implement / performance ratio being in their favour.
We consider TV-like problems of the form

$$\min_x f(Ax - y) + g(Dx) \tag{eq:tv-gen}$$

where $f, g$ are convex, lsc, closed, $A$ is a linear operator, and $D$ is a finite difference operator, either in 1D or 2D. We let $\ell(x) = f(Ax - y)$.
Depending on $f, g, A$, several algorithms can be implemented; below is a subset of them.
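To fix notation, here is a minimal numpy sketch of the generic problem above, assuming for illustration that $f$ is the MSE and $g = \lambda \|\cdot\|_1$ (the function names `finite_diff_1d` and `tv_objective` are mine, not from any package):

```python
import numpy as np

def finite_diff_1d(n):
    """Forward-difference operator D of shape (n-1, n): (D x)_i = x_{i+1} - x_i."""
    D = np.zeros((n - 1, n))
    for i in range(n - 1):
        D[i, i] = -1.0
        D[i, i + 1] = 1.0
    return D

def tv_objective(x, A, y, D, lam):
    """ell(x) + g(D x), with ell(x) = 0.5 * ||A x - y||^2 (MSE) and g = lam * ||.||_1."""
    return 0.5 * np.sum((A @ x - y) ** 2) + lam * np.sum(np.abs(D @ x))
```

Other choices of $f$ and $g$ below only change the two terms of `tv_objective`.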
Data fidelity loss and metric on $Dx$
Huber: $h_\mu(z) = \sum_i z_i^2 / (2\mu)$ if $|z_i| \le \mu$, $|z_i| - \mu/2$ otherwise
L1: $\|z\|_1 = \sum_i |z_i|$
L1-L2: $\sum_i \|z_i\|_2$ (isotropic; $z_i$ groups the components of the discrete gradient at point $i$)
MSE: $\frac{1}{2}\|z\|_2^2$
Note: Huber and L1 can be used either as a data fidelity metric or as a way to measure the gradient ($|\cdot|^{2}$ can also be used for $g$, but then the energy $\Gamma$-converges towards the Sobolev energy).
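For concreteness, a sketch of two of the losses above, under one common parameterisation of Huber (conventions for the $\mu$ scaling vary):

```python
import numpy as np

def huber(z, mu):
    """Huber penalty: quadratic for |z_i| <= mu, linear beyond, summed elementwise."""
    a = np.abs(z)
    return np.sum(np.where(a <= mu, 0.5 * a**2 / mu, a - 0.5 * mu))

def l12(Z):
    """L1-L2 (group) norm: sum over rows of the Euclidean norm of each row.
    For 2D TV each row would hold the discrete gradient at one pixel."""
    return np.sum(np.sqrt(np.sum(Z**2, axis=-1)))
```

As $\mu \to 0$, `huber(z, mu)` approaches $\|z\|_1$, which is why it can stand in for L1 on either the data fidelity or the gradient term.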
Solvers
Direct methods
$A = \operatorname{Id}$ or an orthogonal design; $D$ is any finite difference operator.
$f$ is the MSE
$g$ is the $\ell^{1}$-norm
[ ] Graph-cut based (Boykov, Veksler, and Zabih 2001) or (Kolmogorov and Zabih 2004) (note: this is the fastest way)
Alternative discretization (maybe not relevant)
Forward-Backward on the dual
The starting point is the dual of the primal problem, which reads

$$\min_u \ \ell^*(-D^\top u) + g^*(u).$$
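As a hedged illustration (a sketch of the idea, not any package's API), projected gradient / FB on the dual for the 1D denoising case ($A = \operatorname{Id}$, $f$ the MSE, $g = \lambda\|\cdot\|_1$) could look like:

```python
import numpy as np

def tv_denoise_1d_dual_fb(y, lam, n_iter=500):
    """Forward-Backward (projected gradient) on the dual of
    min_x 0.5*||x - y||^2 + lam*||D x||_1, D the 1D forward difference.
    The dual is min_{||u||_inf <= lam} 0.5*||D^T u - y||^2 (up to a constant)."""
    u = np.zeros(len(y) - 1)
    tau = 0.25  # safe step: ||D||^2 <= 4 for 1D forward differences
    D = lambda x: np.diff(x)                                           # D x
    Dt = lambda u: np.concatenate(([-u[0]], u[:-1] - u[1:], [u[-1]]))  # D^T u
    for _ in range(n_iter):
        grad = D(Dt(u) - y)                     # gradient of the smooth dual term
        u = np.clip(u - tau * grad, -lam, lam)  # projection = prox of the l_inf ball
    return y - Dt(u)  # recover the primal solution x* = y - D^T u*
```

The projection step is cheap precisely because $g^* = \iota_{\|\cdot\|_\infty \le \lambda}$, which is what makes FB on the dual attractive for denoising.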
Variants:
Proximal-dual hybrid gradient
The starting point is one of two saddle-point problems equivalent to \eqref{eq:tv-gen}:

$$\min_x \max_u \ \ell(x) + \langle Dx, u \rangle - g^*(u)$$

if $\ell$ is prox-easy, or

$$\min_x \max_{u,v} \ \langle Ax - y, v \rangle - f^*(v) + \langle Dx, u \rangle - g^*(u)$$

if $g$ is prox-easy but $\ell$ is not.
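A minimal sketch of the first saddle point solved by Chambolle-Pock, again for the 1D denoising case where $\ell(x) = \frac12\|x - y\|^2$ is prox-easy and $g = \lambda\|\cdot\|_1$ (my own illustration, not the library's implementation):

```python
import numpy as np

def tv_denoise_1d_pdhg(y, lam, n_iter=500):
    """Chambolle-Pock (PDHG) for min_x max_u ell(x) + <D x, u> - g^*(u),
    with ell(x) = 0.5*||x - y||^2 and g = lam*||.||_1."""
    D = lambda x: np.diff(x)
    Dt = lambda u: np.concatenate(([-u[0]], u[:-1] - u[1:], [u[-1]]))
    tau = sigma = 0.4  # convergence needs tau * sigma * ||D||^2 < 1 (||D||^2 <= 4)
    x = y.copy(); xbar = y.copy(); u = np.zeros(len(y) - 1)
    for _ in range(n_iter):
        u = np.clip(u + sigma * D(xbar), -lam, lam)      # prox of sigma * g^*
        x_new = (x - tau * Dt(u) + tau * y) / (1 + tau)  # prox of tau * ell
        xbar = 2 * x_new - x                             # over-relaxation step
        x = x_new
    return x
```

For the second saddle point one would keep an extra dual variable $v$ for the data term and only ever apply $\operatorname{prox}_{f^*}$ and $\operatorname{prox}_{g^*}$, which is the point of that formulation when $\ell$ itself is not prox-easy.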
Variants: