Open corwinjoy opened 1 year ago
FYI (for others) there has been some discussion of this already in Slack, in the operators channel.
A quick summary: there was a previous PR proposing to introduce SVD. One concern raised in that PR is that there are different algorithms for computing an SVD, and they produce different results, so it was ambiguous what an implementation is supposed to do. It would help if the spec were more precise (e.g., by using an attribute to specify which algorithm is required). It was also felt that the choice of implementation/algorithm should be driven by concrete use cases and models.
So, in short, the two points that arise are:
1. The spec should state precisely which algorithm/result is required (e.g., via an attribute), since different SVD algorithms produce different results.
2. The choice of operators and algorithms should be driven by concrete use cases and models.
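To illustrate the first point, here is a small sketch of my own (not from the earlier PR) of why "compute an SVD" is under-specified: singular vectors are only determined up to sign (and up to rotation for repeated singular values), so two correct implementations can return different `U` and `Vt` for the same input even though both factorizations are valid.

```python
# Illustration only: sign ambiguity in the SVD.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

U, S, Vt = np.linalg.svd(A, full_matrices=False)

# Flip the sign of one singular-vector pair: still a valid SVD of A.
U2, Vt2 = U.copy(), Vt.copy()
U2[:, 0] *= -1
Vt2[0, :] *= -1

print(np.allclose(A, U @ np.diag(S) @ Vt))    # True
print(np.allclose(A, U2 @ np.diag(S) @ Vt2))  # True: same reconstruction
print(np.allclose(U, U2))                     # False: outputs differ element-wise
```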
Hello. I am working on a project where we are building PyTorch Gaussian process models and converting them to ONNX format (https://gpytorch.ai/). At runtime, some of these models need to perform linear algebra operations such as a Cholesky decomposition or a triangular matrix solve, so we would like to add operators to support these linear algebra methods. I see that there has been periodic interest in this idea here before. Would the group be open to adding a new domain to support linear algebra operations?
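For concreteness, here is a minimal sketch (my own simplification, not the actual GPyTorch code) of the kind of computation a GP predictive mean needs at runtime: a Cholesky factorization of the kernel matrix followed by a solve against that factor. As far as I know, the standard ONNX opset has no operator to map these steps to, so exporting such a module with `torch.onnx.export` is not currently possible without custom ops.

```python
# Sketch of a GP-style predictive mean: k_*^T K^{-1} y via Cholesky.
import torch
import torch.nn as nn

class GPPredictiveMean(nn.Module):
    def forward(self, K, y, k_star):
        # K: (n, n) kernel matrix, y: (n, 1) targets, k_star: (n, 1) cross-covariances
        L = torch.linalg.cholesky(K)              # lower-triangular factor of K
        alpha = torch.cholesky_solve(y, L)        # solves K alpha = y using L
        return k_star.transpose(-2, -1) @ alpha   # predictive mean

n = 4
A = torch.randn(n, n)
K = A @ A.T + n * torch.eye(n)                    # symmetric positive definite
y = torch.randn(n, 1)
k_star = torch.randn(n, 1)
print(GPPredictiveMean()(K, y, k_star))
# torch.onnx.export(GPPredictiveMean(), (K, y, k_star), "gp.onnx")
# -> fails today because there is no ONNX operator for Cholesky / cholesky_solve.
```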
I'm thinking of creating an operator domain like ai.onnx.linalg and starting to add key operations modeled after the numpy.linalg library (https://numpy.org/doc/stable/reference/routines.linalg.html). I realize that these operations are not core to neural networks, but many other kinds of models do make use of linear algebra routines. Is there interest in this idea? Has this already been done somewhere?
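To make the proposal more tangible, here is a rough sketch of what a graph using such a domain might look like, built with the standard `onnx.helper` utilities. The op name `Cholesky`, the domain `ai.onnx.linalg`, and the domain version are all hypothetical at this point; nothing here exists in ONNX today, and the node would not pass the checker until schemas for the new domain are registered.

```python
# Hypothetical example: a graph referencing a proposed ai.onnx.linalg domain.
import onnx
from onnx import TensorProto, helper

node = helper.make_node(
    "Cholesky",                   # hypothetical op, mirroring numpy.linalg.cholesky
    inputs=["K"],
    outputs=["L"],
    domain="ai.onnx.linalg",      # proposed operator domain
)

graph = helper.make_graph(
    [node],
    "linalg_example",
    inputs=[helper.make_tensor_value_info("K", TensorProto.FLOAT, ["n", "n"])],
    outputs=[helper.make_tensor_value_info("L", TensorProto.FLOAT, ["n", "n"])],
)

model = helper.make_model(
    graph,
    opset_imports=[
        helper.make_opsetid("", 19),               # default ONNX domain
        helper.make_opsetid("ai.onnx.linalg", 1),  # hypothetical new domain
    ],
)
print(model.opset_import)
```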