ea83c4 opened this issue 8 years ago (status: Open)
Hi there, thanks for the request. I agree we should have some options for generic low-rank matrices. We'll look into this and let you know when we have something implemented.
Great. Are you able to give an estimate on the time it may take to close the issue, e.g. days, weeks, or months?
We're working on it now, nearly done. Should hopefully be done by the end of today. If not then tomorrow 🙂
The manifold we are currently implementing is based on Manopt's `fixedrankembeddedfactory.m`.
Hi.
We have implemented the above manifold, currently only supporting the Autograd autodiff backend. Theano and TensorFlow support should follow soon (~1-2 weeks).
You can install the updated version with:

```shell
pip install git+https://github.com/pymanopt/pymanopt.git --upgrade
```
Please let us know if there are any problems.
Cheers.
Great! The Autograd backend is what I am using at the moment. I'll let you know if there are any issues with the current implementation. Best.
We've also written a simple example for using this manifold: https://github.com/pymanopt/pymanopt/blob/master/examples/low_rank_matrix_approximation.py
The solvers return an array of type `_Point` for this manifold (something we will iron out when we have time). For now this can easily be cast to a `numpy.ndarray` with `numpy.array(x)`.
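For what it's worth, the cast described above is a one-liner; here is a minimal sketch, using a nested list as a stand-in for the `_Point` result a solver would return:

```python
import numpy as np

# Stand-in for the _Point object a solver returns for this manifold;
# in practice x would come from something like solver.solve(problem).
x = [[1.0, 2.0], [3.0, 4.0]]

# Cast the solver output to a plain numpy.ndarray.
x_arr = np.array(x)
print(type(x_arr).__name__)  # ndarray
```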
Reopening until we have sorted out Theano and Tensorflow backends.
https://github.com/pymanopt/pymanopt/commit/838cf87eec2b997fe3876c2c46ec8e95b48ee9d5
In the interests of simplicity and efficiency, we've decided that points on this manifold should be parameterised via their low-rank SVD. That is, they are parameterised similarly to the output of `numpy.linalg.svd`, as a tuple `(u, s, vt)`, where `u` is an m x k array, `s` is a length-k vector, and `vt` is a k x n array. The full low-rank matrix can be reconstructed with `a = np.dot(u, np.dot(np.diag(s), vt))`, and cost functions defined in terms of this reconstructed matrix.
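To make the parameterisation concrete, here is a small sketch in plain NumPy; the shapes (m=5, n=4, k=2), the random target matrix, and the squared-error cost are illustrative assumptions, not part of the library API:

```python
import numpy as np

m, n, k = 5, 4, 2  # assumed illustrative dimensions
rng = np.random.default_rng(0)

# Build a rank-k point (u, s, vt) in the stated format:
# u is m x k with orthonormal columns, s has length k, vt is k x n.
u, _ = np.linalg.qr(rng.standard_normal((m, k)))
s = np.array([3.0, 1.0])
q, _ = np.linalg.qr(rng.standard_normal((n, k)))
vt = q.T

# Reconstruct the full m x n low-rank matrix as described above.
a = np.dot(u, np.dot(np.diag(s), vt))

# A cost function would then be defined in terms of the reconstructed
# matrix, e.g. a squared Frobenius error against some target.
target = rng.standard_normal((m, n))
cost = np.sum((a - target) ** 2)
```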
See https://github.com/pymanopt/pymanopt/blob/master/examples/low_rank_matrix_approximation.py for a full example. I will also be writing a blog post about this; I will add a link when it's done.
Note that you can now define cost functions using Theano or TensorFlow; however, the second-order `TrustRegions` solver is not yet working on this manifold. Hope that's ok, let us know if you have any issues.
Note that only the `SteepestDescent` and `ConjugateGradient` solvers are working at the moment, while `TrustRegions`, `NelderMead`, and `ParticleSwarm` fail. In some cases this seems to be due to the SVD/tuple data structure used for the low-rank matrix. It would be nice to try out more solvers when benchmarking...
There are two unimplemented methods which are necessary for those solvers: `ehess2rhess` is needed for `TrustRegions`, and `dist` is needed for `NelderMead` and `ParticleSwarm`.
To implement `ehess2rhess`, someone needs to do some slightly tricky derivation to work out the correct form for our parameterization (which is a bit different from Manopt's). If I have time at some point I can do this. In case someone else is thinking of having a go, this may come in handy: https://j-towns.github.io/papers/svd-derivative.pdf. The reason we've left this issue open is to remind ourselves to do this at some point.
I believe the correct form of this manifold's `dist` method, which calculates the geodesic distance between two points on the manifold, is not known; working it out is an open problem. I don't know whether it would be possible to come up with a good approximation. One very naive thing that comes to mind is just the Frobenius distance between the two matrices, and this might do for the purposes of those two solvers.
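The naive approximation mentioned above could be sketched as follows; `frobenius_dist` is a hypothetical helper name, and this is explicitly not the geodesic distance:

```python
import numpy as np

def frobenius_dist(x, y):
    """Naive stand-in for dist: Frobenius distance between the two
    reconstructed matrices. NOT the geodesic distance on the manifold,
    just the rough approximation discussed above."""
    u1, s1, vt1 = x
    u2, s2, vt2 = y
    # Reconstruct the full matrices from their (u, s, vt) tuples.
    a = np.dot(u1, np.dot(np.diag(s1), vt1))
    b = np.dot(u2, np.dot(np.diag(s2), vt2))
    # Frobenius norm of the difference.
    return np.linalg.norm(a - b)
```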
When working on matrix recovery problems, a more generic low-rank matrix is often sought after than the currently implemented symmetric positive semi-definite one. Could someone implement one of the corresponding manifolds provided in Manopt? Is something already in the works?