RalphAS / GenericSchur.jl

Julia package for Schur decomposition of matrices with generic element types

generalised Schur decomposition for non-symmetric matrices on GPU #3

Closed (thorek1 closed this issue 1 year ago)

thorek1 commented 4 years ago

Hi,

I tried running GenericSchur on a GPU (via Google Colab) and unfortunately got stuck very early on. It falls back to the LAPACK Schur routine, which is not implemented for GPUs.

Having a GPU implementation of the generalised Schur decomposition for non-symmetric matrices would be a great addition. So far, neither cuBLAS, ArrayFire, nor any other BLAS/LAPACK alternative provides it. Julia seems promising for getting there.

I am not proficient enough to debug the code myself, but I am happy to help with testing. I ran the following notebook and added these code snippets to test GPU support in Julia:

```julia
using Pkg
Pkg.add(["BenchmarkTools", "CUDA", "GenericSchur", "GenericLinearAlgebra"])
using BenchmarkTools, CUDA, GenericLinearAlgebra
import GenericSchur

n = 50                               # renamed from `size` to avoid shadowing Base.size
arand = rand(n, n)
brand = rand(n, n)
GenericSchur.schur(arand, brand)     # works on the CPU

agpu = CuArray(arand)
bgpu = CuArray(brand)
GenericSchur.schur(agpu, bgpu)       # fails on the GPU
```
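For reference, the generic (non-LAPACK) path that this package provides does work on the CPU for extended-precision element types. A minimal sketch, assuming GenericSchur is installed and using complex element types for the generalized problem (the generic generalized decomposition is documented for complex eltypes):

```julia
using LinearAlgebra
using GenericSchur   # extends LinearAlgebra.schur to generic element types

# Complex BigFloat matrices exercise the generic code path rather than LAPACK.
n = 4
A = Complex{BigFloat}.(rand(ComplexF64, n, n))
B = Complex{BigFloat}.(rand(ComplexF64, n, n))

# Generalized Schur factorization: A = Q*S*Z' and B = Q*T*Z',
# with S and T upper triangular.
F = schur(A, B)

# The residual should be on the order of BigFloat machine epsilon.
resA = norm(F.Q * F.S * F.Z' - A)
resB = norm(F.Q * F.T * F.Z' - B)
```

The same call with `CuArray` arguments hits the scalar-indexing problem described below, since the algorithm updates individual matrix entries in loops.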

I hope this helps in getting closer to the issue.

RalphAS commented 1 year ago

This isn't practical here because of all the scalar indexing in the packaged algorithms. Making practical two-sided decompositions for GPUs is a research problem AFAICT, so I'll leave it to others.
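To illustrate the point about scalar indexing: the packaged algorithms update matrices entry by entry, e.g. when applying Givens rotations during the reduction. A hypothetical sketch of that access pattern (the function name and loop are illustrative, not the package's actual code):

```julia
# Apply a Givens rotation mixing rows i and k of A, with cosine c and sine s.
# Each A[i, j] read/write below is scalar indexing: fine on a CPU Array, but
# on a CuArray every such access is a separate host-device round trip, which
# CUDA.jl disallows by default (and would be extremely slow if allowed).
function apply_givens!(A::AbstractMatrix, i, k, c, s)
    for j in axes(A, 2)
        t       = c * A[i, j] + s * A[k, j]
        A[k, j] = -s * A[i, j] + c * A[k, j]
        A[i, j] = t
    end
    return A
end

A = [3.0 1.0;
     4.0 2.0]
c, s = 3/5, 4/5           # rotation chosen to zero out A[2, 1]
apply_givens!(A, 1, 2, c, s)
```

A GPU-friendly version would need to express each update as a fused array operation or a custom kernel, and the sequential, data-dependent structure of two-sided decompositions makes that reorganization the hard research problem referred to above.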