Open ev-br opened 9 years ago
`LinearOperator` should implement `matvec` and `rmatvec`; additionally, scipy adds a `dot` method for it.
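A minimal sketch of that interface (the `Scale` operator below is just an illustrative example, not from the thread): a subclass only needs `shape`, `dtype`, and the `_matvec`/`_rmatvec` hooks, and `.dot()` then comes for free from the base class.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator

class Scale(LinearOperator):
    """Diagonal scaling operator, defined matrix-free."""

    def __init__(self, d):
        self.d = np.asarray(d, dtype=float)
        super().__init__(dtype=self.d.dtype, shape=(self.d.size, self.d.size))

    def _matvec(self, x):
        # A @ x for A = diag(d)
        return self.d * x

    def _rmatvec(self, x):
        # A.T @ x; A is diagonal, hence symmetric
        return self.d * x

A = Scale([1.0, 2.0, 3.0])
x = np.ones(3)
print(A.matvec(x))   # [1. 2. 3.]
print(A.rmatvec(x))  # [1. 2. 3.]
print(A.dot(x))      # [1. 2. 3.] -- dot delegates to matvec
```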
I don't consider this case important enough to spend time on, and I don't know of any suitable problem. I think if a user knows that their operator conforms to scipy's `LinearOperator`, they can simply use `aslinearoperator` and be safe.
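The "wrap it and be safe" route can be sketched as follows: `aslinearoperator` accepts ndarrays, sparse matrices, and objects exposing `matvec`/`rmatvec`, and returns a `LinearOperator` with all three methods. (The small matrix here is just a placeholder Jacobian.)

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import aslinearoperator

# A sparse 3x2 "Jacobian" standing in for a user-supplied operator
J = csr_matrix(np.array([[1.0, 0.0],
                         [0.0, 2.0],
                         [3.0, 0.0]]))
op = aslinearoperator(J)

print(op.shape)                          # (3, 2)
print(op.matvec(np.array([1.0, 1.0])))   # J @ v        -> [1. 2. 3.]
print(op.rmatvec(np.ones(3)))            # J.T @ v      -> [4. 2.]
```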
First, it's an advertisement for broader applicability. Second, scipy `LinearOperator`s are not necessarily the best ones out there. PyOperators is rumored to be a more polished package. (I'm not a user, so this is hearsay only.)
`LinearOperator` is a simple wrapper; it doesn't do anything by itself, and `least_squares` doesn't require anything from `LinearOperator` except providing 3 methods. PyOperators is a complete package with operator manipulations, etc. (I know almost nothing about it.) Its extra functionality is irrelevant to optimization.
So as I said, a user can use any implementation of an operator as long as it has `matvec` and `rmatvec`; for clarity and rigor, wrap it with `aslinearoperator` and you are done. We can't account for all of the numerous packages existing in Python; `least_squares` is guaranteed to work with `LinearOperator`, and that's not at all restrictive. And generally `LinearOperator` is a very thin and almost unnecessary case. Can you think of any problem where the Jacobian will be some magical linear operator (not just a sparse matrix wrapped with `aslinearoperator`)?
I think we should showcase some interoperability with tools in the scipy stack but not in scipy itself. A nice example would be one more IPython notebook with a large-scale problem where the Jacobian is implemented as a PyOperator (http://pchanial.github.io/pyoperators/) instead of a scipy `LinearOperator`. (If this doesn't work, it's IMO worth investigating what exactly the requirements for a `LinearOperator` are here.)