dlfivefifty opened this issue 9 years ago
We could also replicate Chebfun's smart chop.
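Chebfun's chop looks for a plateau in the coefficient envelope rather than cutting at the first small coefficient. A much-simplified sketch of the idea (this is not Chebfun's actual algorithm, just an illustration of envelope-based chopping):

```julia
# Simplified sketch of an envelope-based chop, loosely inspired by
# Chebfun's smart chop. NOT the real algorithm.
function smartchop(cfs::Vector{Float64}, tol::Float64)
    # envelope: running maximum of |c_k| taken from the tail inward
    env = reverse(accumulate(max, reverse(abs.(cfs))))
    env[1] == 0 && return cfs[1:1]
    env ./= env[1]                     # normalize so env[1] == 1
    # chop where the envelope first falls below the tolerance
    n = findfirst(e -> e <= tol, env)
    n === nothing ? cfs : cfs[1:max(n - 1, 1)]
end
```

Because the envelope is monotone, a stray small coefficient in the middle of the expansion does not trigger a premature chop.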
With a basis normalization, we could implement scale-invariant adaptive QR, hopefully dealing with this:
```julia
julia> using ApproxFun

julia> u = [dirichlet(Chebyshev());Derivative(Chebyshev(),2)-I]\[exp(1.0);exp(-1.0)]
Fun([1.2660658777520082,-1.1303182079849696,0.2714953395340765,-0.04433684984866379,0.005474240442093731,-0.0005429263119139437,4.4977322954295136e-5,-3.19843646240199e-6,1.9921248066727958e-7,-1.103677172551734e-8,5.505896079673748e-10,-2.497956616984982e-11,1.0391522306785702e-12,-3.991263356393109e-14,1.4237580108211739e-15,-4.7409164601184453e-17,1.4801777040733999e-18],Chebyshev(【-1.0,1.0】))

julia> u = [dirichlet(Chebyshev());Derivative(Chebyshev(),2)-I]\[1e-32exp(1.0);1e-32exp(-1.0)]
Fun([1.0287204232101624e-32,-1.1395890362606558e-32],Chebyshev(【-1.0,1.0】))
```
If someone needed scale-invariance right now, they could just supply `linsolve` an estimate of the appropriate tolerance, but in general `u` and `f` live in different spaces with different norms.
Right now `chop(::Fun)` assumes all coefficients have equal weight, but many of the bases are not normalized. A routine like

```julia
basisapproxnorm(::FunctionSpace, k)
```

should be added to give an estimated norm of the k-th basis function.
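A minimal sketch of how such a routine could feed into a scale-aware chop (the function names and the fallback value are hypothetical; the only hard fact used is that the Chebyshev polynomials have sup-norm 1 on [-1,1], so that space is already normalized):

```julia
# Hypothetical interface: estimated norm of the k-th basis function.
# Fallback assumes a normalized basis; each space would override this.
basisapproxnorm(sp, k) = 1.0

# Chop weighting each coefficient by its basis-function norm, so the
# cutoff is relative to the scale of the function, not absolute.
function scaledchop(cfs::Vector{Float64}, sp, tol::Float64)
    w = [abs(cfs[k]) * basisapproxnorm(sp, k) for k in eachindex(cfs)]
    m = maximum(w)
    m == 0 && return cfs[1:1]
    n = findlast(x -> x > tol * m, w)
    cfs[1:n]
end
```

With a relative cutoff like this, multiplying the right-hand side by 1e-32 rescales `m` by the same factor and the chop point is unchanged, which is exactly the scale-invariance missing from the example above.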