ioannisPApapadopoulos / RadialPiecewisePolynomials.jl

MIT License

Conversion from ZernikeAnnulus to Bubble #4

Open ioannisPApapadopoulos opened 1 year ago

ioannisPApapadopoulos commented 1 year ago

@dlfivefifty The conversion from ZernikeAnnulus to the hats+bubble basis on one element becomes increasingly ill-conditioned as the Fourier mode m gets larger. This is true for both the ZernikeAnnulus(0,0) and ZernikeAnnulus(1,1) routes.

This is alleviated if we instead do a least-squares solve to find the coefficients in hats+bubble (including an extra 2 columns in the ZernikeAnnulus(0,0) case, or an extra 4 columns in the ZernikeAnnulus(1,1) case).

Do you see an easy way around this? Should we just do least-squares solves for the coefficients for now? I've attached a snippet showing the difference in the ZernikeAnnulus(0,0) route.

using AlgebraicCurveOrthogonalPolynomials, SemiclassicalOrthogonalPolynomials, LinearAlgebra

# Function to expand on the annulus with inner radius ρ = 0.5 (outer radius 1)
ρ = 0.5
c1 = -400; c2 = 0; c3=0.6
function ua(xy)
    x,y = first(xy), last(xy)
    exp(c1*(x^2 + (y-c3)^2)) * (1-(x^2+y^2)) * ((x^2+y^2)-ρ^2)
end

function ann002element(t::T, m::Int) where T

    Q₀₀ = SemiclassicalJacobi{T}(t, 0, 0, m)
    Q₀₁ = SemiclassicalJacobi{T}(t, 0, 1, m)
    Q₁₀ = SemiclassicalJacobi{T}(t, 1, 0, m)
    Q₁₁ = SemiclassicalJacobi{T}(t, 1, 1, m)

    L₁₁ = (Weighted(Q₀₀) \ Weighted(Q₁₁)) / t^2
    L₀₁ = (Weighted(Q₀₀) \ Weighted(Q₀₁)) / t
    L₁₀ = (Weighted(Q₀₀) \ Weighted(Q₁₀)) / t

    (L₁₁, L₀₁, L₁₀)
end

Z = ZernikeAnnulus(ρ,0,0)
fz = Z[:, Block.(1:40)] \ ua.(axes(Z,1))

# Look at the conversion of column 38
b = 38;
c = ModalTrav(fz).matrix[:,b]
N = length(c)

# Form conversion matrix from ZernikeAnnulus(ρ,0,0) to hats+bubble
T = Float64
t = inv(one(T) - ρ^2)
(L₁₁, L₀₁, L₁₀) = ann002element(t, b÷2);
R = [L₁₀[:,1] L₀₁[:,1] L₁₁]

# Do the conversion via a direct solve
dat = R[1:N,1:N] \ c
norm(c, Inf), norm(dat, Inf) # Big coefficients relative to original expansions, eugh..
# (0.0012030351196273754, 271.66452286973276)

# Go via a least squares solve
dat = Matrix(R[1:N,1:N+2]) \ c
norm(c, Inf), norm(dat, Inf) # Much better coefficients
# (0.0012030351196273754, 0.024122418305535603)
norm(R[1:N,1:N+2]*dat-c) # machine error
# 1.2575586067638361e-18
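The blow-up can be reproduced with a toy model that needs none of the Julia stack: a lower-bidiagonal "conversion" matrix whose diagonal entries d_k shrink while the subdiagonal stays O(1). Forward substitution on the square truncation divides by a small d_k at every step, so rounding errors grow like ∏ 1/d_k, whereas substituting from the bottom row (i.e. using the shifted block of rows 2:N+1) pivots on the O(1) subdiagonal and is stable. This is a pure-Python sketch with made-up entries (d_k = 1/(k+2)), not the actual L₁₁:

```python
import math

# Toy (n+1) x n lower-bidiagonal L: diagonal d_k (small), subdiagonal s = 1.
# Hypothetical entries for illustration only -- not the actual L11.
n = 40
d = [1.0 / (k + 2) for k in range(n)]
s = 1.0

x_true = [math.sin(k + 1.0) for k in range(n)]   # bounded coefficients

# Manufacture a consistent right-hand side c = L * x_true
c = [0.0] * (n + 1)
for k in range(n):
    c[k] += d[k] * x_true[k]
    c[k + 1] += s * x_true[k]

# Square truncation L[1:n,1:n]: forward substitution pivots on the tiny d_k,
# so rounding errors are amplified by roughly prod(1/d_k)
x_sq = [0.0] * n
x_sq[0] = c[0] / d[0]
for k in range(1, n):
    x_sq[k] = (c[k] - s * x_sq[k - 1]) / d[k]

# Shifted block L[2:n+1,1:n]: substitution from the bottom pivots on s = 1
x_sh = [0.0] * n
x_sh[n - 1] = c[n] / s
for k in range(n - 1, 0, -1):
    x_sh[k - 1] = (c[k] - d[k] * x_sh[k]) / s

err_sq = max(abs(a - b) for a, b in zip(x_sq, x_true))
err_sh = max(abs(a - b) for a, b in zip(x_sh, x_true))
print(err_sq, err_sh)   # err_sq is enormous, err_sh is near machine precision
```

The same dichotomy is what the least-squares solve above is exploiting: the rectangular system never forces a division by the decaying diagonal.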
dlfivefifty commented 1 year ago

If we drop just the first column we get something reasonable:

julia> n = 1000; cond(L₁₁[2:n+1,1:n])
2361.354956062211

This is equivalent to adding a single hat function, i.e., inverting [[1; zeros(n)] L₁₁[1:n+1,1:n]].
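The bordered formulation can be tried on the same kind of toy model: append a column e₁ to an (n+1)×n lower-bidiagonal L (hypothetical entries, not the actual L₁₁) so the matrix is square, then solve from the bottom row up, finishing with the "hat" coefficient in the first slot. A pure-Python sketch:

```python
# Toy bordered matrix A = [e1  L], with L (n+1) x n lower bidiagonal:
# diagonal d_k (small, made-up), subdiagonal s = 1.
n = 40
d = [1.0 / (k + 2) for k in range(n)]
s = 1.0
c = [1.0 / (k + 1) for k in range(n + 1)]   # a generic right-hand side

# Rows of A:  row 0: z0 + d0*z1 = c0;  row k (1 <= k < n): s*z_k + d_k*z_{k+1} = c_k;
# row n: s*z_n = c_n.  Back-substitute bottom-up, ending with the hat coefficient z0.
z = [0.0] * (n + 1)
z[n] = c[n] / s
for k in range(n - 1, 0, -1):
    z[k] = (c[k] - d[k] * z[k + 1]) / s
z[0] = c[0] - d[0] * z[1]

print(max(abs(v) for v in z))   # stays O(1): the bordered system is benign
```

Every pivot in this elimination order is the well-scaled subdiagonal, which is the toy analogue of the modest condition number above.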

Note L₁₁ is a 3-term recurrence so one approach is to think of it similar to:

https://github.com/JuliaApproximation/SingularIntegrals.jl/blob/96fd60ea4241fb55c5b93d32a15788e2be12d451/src/recurrence.jl#L9

The difference: in RecurrenceArray we use adaptive LU (Olver's algorithm) for the bottom right and back-substitution for the top left, where here we want to use LU for the top left and back-substitution for the bottom right. This would be equivalent to adding a 2nd hat function but whose coefficients are in the mth entry, i.e., invert the following:

julia> n = 1000; A = [[[[1; zeros(m)] L₁₁[1:m+1,1:m-1] [zeros(m); 1]]; zeros(n-m+1,m+1)] L₁₁[1:n+2,m:n]]
1002×1002 Matrix{Float64}:
 1.0   0.0313853    0.0          0.0          …   0.0          0.0
 0.0   0.0261561    0.0390535    0.0              0.0          0.0
 0.0  -0.00360819   0.0303656    0.0425396        0.0          0.0
 0.0   0.0         -0.00674966   0.0306521        0.0          0.0
 0.0   0.0          0.0         -0.00984693       0.0          0.0
 ⋮                                            ⋱                ⋮
 0.0   0.0          0.0          0.0             2.55978e-7   0.0332339
 0.0   0.0          0.0          0.0            -0.0332007    2.6622e-7
 0.0   0.0          0.0          0.0             0.0         -0.0332007

julia> norm(inv(A), Inf)
938.5268517143461

But this is weird when thinking about hat functions....

dlfivefifty commented 1 year ago

A similar thing happens when ρ = 0:

julia> L₁₁ = Weighted(Jacobi(m,0)) \ Weighted(Jacobi(m+1,1))
ℵ₀×ℵ₀ lazy banded matrix (a product of two LazyBandedMatrices.Bidiagonal{Float64} factors) with indices OneToInf()×OneToInf():
  0.17316      ⋅          ⋅           ⋅           ⋅          ⋅          ⋅          ⋅          ⋅        …  
  0.15735     0.304348    ⋅           ⋅           ⋅          ⋅          ⋅          ⋅          ⋅           
 -0.0158103   0.264348   0.406154     ⋅           ⋅          ⋅          ⋅          ⋅          ⋅           
   ⋅         -0.04       0.337778    0.486772     ⋅          ⋅          ⋅          ⋅          ⋅           
   ⋅           ⋅        -0.0683761   0.38825     0.551724    ⋅          ⋅          ⋅          ⋅           
   ⋅           ⋅          ⋅         -0.0985222   0.422692   0.604839    ⋅          ⋅          ⋅        …  
   ⋅           ⋅          ⋅           ⋅         -0.129032   0.445748   0.648841    ⋅          ⋅           
   ⋅           ⋅          ⋅           ⋅           ⋅        -0.159091   0.460606   0.685714    ⋅           
   ⋅           ⋅          ⋅           ⋅           ⋅          ⋅        -0.188235   0.469498   0.716927     
   ⋅           ⋅          ⋅           ⋅           ⋅          ⋅          ⋅        -0.216216   0.474012     
   ⋅           ⋅          ⋅           ⋅           ⋅          ⋅          ⋅          ⋅        -0.242915  …  
   ⋅           ⋅          ⋅           ⋅           ⋅          ⋅          ⋅          ⋅          ⋅           
   ⋅           ⋅          ⋅           ⋅           ⋅          ⋅          ⋅          ⋅          ⋅           
   ⋅           ⋅          ⋅           ⋅           ⋅          ⋅          ⋅          ⋅          ⋅           
   ⋅           ⋅          ⋅           ⋅           ⋅          ⋅          ⋅          ⋅          ⋅           
   ⋅           ⋅          ⋅           ⋅           ⋅          ⋅          ⋅          ⋅          ⋅        …  
   ⋅           ⋅          ⋅           ⋅           ⋅          ⋅          ⋅          ⋅          ⋅           
  ⋮                                                         ⋮                                          ⋱  
dlfivefifty commented 1 year ago

Note when the RHS is a bubble function the two "hat function" coefficients give zero:

c = L₁₁[1:n+2,1:n] * randn(n); # lies in the span of L₁₁, i.e., coefficients of a bubble function

julia> (A\c)[1]
1.016547956922409e-15

julia> (A\c)[m+1]
-6.158268339717665e-17

So at the very least we have a stable way of projecting onto the bubble functions (invert A and ignore the two added columns)... but I don't think there is a guarantee that what is left over is degree 1.
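The "added coefficients vanish on the span" observation also holds in a toy model: take A = [e₁ L] for an (n+1)×n lower-bidiagonal L (made-up entries, not the actual L₁₁) and manufacture a right-hand side c = L·x, so c lies in the column span of L (a "bubble" right-hand side). Solving the bordered system returns the hat coefficient at machine-precision zero and recovers x stably:

```python
import math

# Toy model: L is (n+1) x n lower bidiagonal with small diagonal d_k
# (hypothetical values) and subdiagonal s = 1; the bordered matrix is A = [e1  L].
n = 40
d = [1.0 / (k + 2) for k in range(n)]
s = 1.0
x_true = [math.sin(k + 1.0) for k in range(n)]

# c = L * x_true lies in the column span of L
c = [0.0] * (n + 1)
for k in range(n):
    c[k] += d[k] * x_true[k]
    c[k + 1] += s * x_true[k]

# Solve A z = c bottom-up; z[0] is the added "hat" coefficient
z = [0.0] * (n + 1)
z[n] = c[n] / s
for k in range(n - 1, 0, -1):
    z[k] = (c[k] - d[k] * z[k + 1]) / s
z[0] = c[0] - d[0] * z[1]

hat = abs(z[0])
err = max(abs(z[k + 1] - x_true[k]) for k in range(n))
print(hat, err)   # both near machine precision
```

This mirrors the (A\c)[1] and (A\c)[m+1] checks above: the bordered solve is an exact projection onto the bubble span whenever the data already lies in it.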

I have a feeling that when we think about it more we will realise that the hat function corresponding to the inner radius should be degree m, not degree 1. Something I've been discussing with Kaibo is relevant: we can think of the "face bubbles" as a basis for F ≅ ℝ[x,y] / B, where B := <(1-x^2-y^2) * (x^2 + y^2 - ρ^2)> is the vanishing ideal of "bubble functions", so that ℝ[x,y] = F ⊕ B. Note any inner product for F lies naturally on the zero set of B, i.e., the boundary of the annulus. A key observation: given a basis for any choice of F, we can add arbitrary elements of B without changing the algebraic structure and without changing the span of the direct sum. So one way to think about this is that we want to figure out which bubble functions we need to add to get nice structure.

Check out https://github.com/dlfivefifty/LegoBrickFEMGrant/blob/main/sparsemass.pdf for a general way of writing this (quick, before I kick you off that repo)