cornellius-gp / gpytorch

A highly efficient implementation of Gaussian Processes in PyTorch
MIT License

[Feature Request] Better check for CG non-convergence #1551

Open KeAWang opened 3 years ago

KeAWang commented 3 years ago

🚀 Feature Request

Better check for CG non-convergence.

Motivation

CG can fail to converge due to numerical issues, especially in FP32, which is our default. Instead of just running for a fixed budget of CG iterations, which can slow down runs immensely on large datasets, we should have a smarter check for non-convergence and stop early. One option would be to keep track of a proxy for the A-norm error, e.g. Theorem 38.2 of http://www.cs.cmu.edu/afs/cs/academic/class/15859n-f16/Handouts/TrefethenBau/NumericalLinearAlgebrachapter38.pdf
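(For context, paraphrasing the linked theorem from memory, so the exact statement should be checked against the chapter: with error $e_n = x_* - x_n$ and $\|e\|_A = \sqrt{e^\top A e}$, exact-arithmetic CG gives monotone decrease $\|e_n\|_A \le \|e_{n-1}\|_A$. So an estimate of $\|e_n\|_A$ that grows between iterations is a sign that finite-precision effects have broken the solve.)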

Pitch

Add a check for monotonic convergence of the error in the A-norm when solving Ax = b. If the error increases between iterations, break. You can't compute the error exactly, since that would require knowing x, but perhaps we could use something that approximates it?
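To make the pitch concrete, here is a minimal sketch of what such a check could look like in a plain CG loop. This is not gpytorch's linear_cg; the function name, `window` parameter, and thresholds are illustrative. It uses the fact that, in exact arithmetic, step k decreases the squared A-norm error by exactly $\alpha_k \|r_k\|^2$, which the loop already computes, so a non-positive value (or a residual that stops improving for a while) is a usable divergence signal:

```python
import torch

def cg_with_divergence_check(A, b, max_iter=1000, tol=1e-6, window=10):
    """Plain CG on an SPD matrix A with a heuristic early-stopping check.

    Tracks alpha_k * ||r_k||^2, which in exact arithmetic equals the decrease
    in the squared A-norm error at step k, so it should stay positive while the
    residual (roughly) shrinks; persistent violations suggest the solve has hit
    numerical trouble and further iterations are wasted.
    """
    x = torch.zeros_like(b)
    r = b - A @ x
    p = r.clone()
    rs_old = r.dot(r)
    best_res = rs_old.sqrt()
    stall = 0
    for k in range(max_iter):
        Ap = A @ p
        alpha = rs_old / p.dot(Ap)
        # Proxy for the drop in squared A-norm error this iteration.
        energy_drop = alpha * rs_old
        if energy_drop <= 0:
            # alpha <= 0 means A no longer looks positive definite in
            # finite precision: CG theory breaks down, so stop early.
            break
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r.dot(r)
        res = rs_new.sqrt()
        if res < tol:
            break
        # Count iterations with no improvement over the best residual so far.
        stall = stall + 1 if res >= best_res else 0
        best_res = torch.minimum(best_res, res)
        if stall >= window:
            break  # residual has not improved in `window` iterations
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x
```

Since any CG implementation already computes $\alpha_k$ and the residual norms, a check along these lines should not require extra matrix-vector products.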

gpleiss commented 3 years ago

I'd be open to a PR, but it shouldn't be a check every iteration (maybe every 50 iterations, and make the interval a user-changeable setting).
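One way the interval could be exposed, sketched below as a standalone context manager in the spirit of gpytorch.settings (the class is illustrative and not part of gpytorch; the default of 50 just mirrors the suggestion above):

```python
class cg_divergence_check_interval:
    """Illustrative context manager holding how often the CG divergence test runs."""

    _value = 50  # default: run the test every 50 iterations

    def __init__(self, value: int):
        self._entry_value = value

    def __enter__(self):
        cls = type(self)
        self._previous = cls._value
        cls._value = self._entry_value
        return self

    def __exit__(self, *exc):
        type(self)._value = self._previous
        return False

    @classmethod
    def value(cls) -> int:
        return cls._value


# The CG loop would gate the test on the current value, e.g.
#   if (k + 1) % cg_divergence_check_interval.value() == 0: ...
# and users could override it with
#   with cg_divergence_check_interval(100): ...
```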