Parallel-in-Time / pySDC

pySDC is a Python implementation of the spectral deferred correction (SDC) approach and its flavors, especially the multilevel extension MLSDC and PFASST.
http://www.parallel-in-time.org/pySDC
BSD 2-Clause "Simplified" License

Runtime Warning in test_Q_transfer_minimal[Equidistant] #149

Closed: pancetta closed this issue 2 years ago

pancetta commented 2 years ago

For some reason, test_Q_transfer_minimal[Equidistant] always seems to lead to a RuntimeWarning in _polyint, see https://github.com/Parallel-in-Time/pySDC/runs/7786219453?check_suite_focus=true#step:5:134. This seems to happen every time, but I have no idea why... @tlunet?

tlunet commented 2 years ago

Was it after one of my PRs on master, or has it always been the case?

pancetta commented 2 years ago

I don't know, to be honest... I came across this when I removed warnings in v5, but could not wrap my head around this one. If you have an idea, shoot. If not, we'll just ignore it.

tlunet commented 2 years ago

Challenge accepted!

tlunet commented 2 years ago

Ok, got it. It seems to be scipy-related, in their implementation of the BarycentricInterpolator:

https://github.com/scipy/scipy/blob/59dac8a9fa9ea856f4a50521d295a3497d648faa/scipy/interpolate/_polyint.py#L657-L673

In the evaluation formula for the barycentric interpolation (see https://parallel-in-time.org/pySDC/pySDC/core.html#module-core.Lagrange), there is a division by zero when an evaluation point coincides with an interpolation node. You can reproduce it with the following code:

from scipy.interpolate import BarycentricInterpolator

# Interpolate through (0, 1) and (1, 2), then evaluate the interpolant
# exactly at its own nodes -- this triggers the RuntimeWarning in _polyint
interp = BarycentricInterpolator([0, 1], [1, 2])
yi = interp(interp.xi)
print(yi)  # [1. 2.]
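
If the warning happens to be filtered in a given environment, one way to surface it explicitly is to record it (a minimal check, assuming a scipy version from before the upstream fix mentioned at the end of this thread):

import warnings
from scipy.interpolate import BarycentricInterpolator

# Record any warnings emitted while evaluating at the nodes
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    interp = BarycentricInterpolator([0, 1], [1, 2])
    yi = interp(interp.xi)

print(yi)  # [1. 2.] -- the values are still correct
for w in caught:
    print(w.category.__name__, w.message)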

This still produces the correct result, but since np.sum(c, axis=-1) produces some zeros, you get a division by zero at some point. Numpy allows this but raises a warning. For its part, the scipy implementation does not care, and simply modifies those points "manually" after the comment # Now fix where x==some xi.
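
For illustration, a bare-bones version of the barycentric evaluation formula shows where the zero denominators enter (a sketch of the formula only, not scipy's actual code; the weights below are the standard barycentric weights for the nodes [0, 1]):

import numpy as np

def barycentric_eval(x, xi, yi, wi):
    # p(x) = sum_j(wi_j / (x - xi_j) * yi_j) / sum_j(wi_j / (x - xi_j));
    # if x coincides with a node, (x - xi) contains a zero and numpy
    # raises a RuntimeWarning for the division
    c = wi / (x - xi)
    return np.sum(c * yi) / np.sum(c)

xi = np.array([0.0, 1.0])
yi = np.array([1.0, 2.0])
wi = np.array([-1.0, 1.0])  # barycentric weights for the nodes [0, 1]
print(barycentric_eval(0.5, xi, yi, wi))  # 1.5, fine away from the nodes
# barycentric_eval(0.0, xi, yi, wi) warns and returns nan without scipy's manual fix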

And since the interpolation_matrix_1d and restriction_matrix_1d functions in transfer_helper.py still use the BarycentricInterpolator class, the test_Q_transfer.py script that uses those functions raises the warning.

Now there are several possibilities to remove this warning:

  1. (DIY) Use the LagrangeApproximation class from the pySDC core in the transfer_helper.py functions. Its getInterpolationMatrix method does something similar to the evaluation of the interpolating polynomial, so only a few adaptations are needed to evaluate the interpolating polynomial with given values at the nodes (see the sketch after this list). The warning won't appear there, since the division by zero is explicitly silenced where it is expected to happen, see:

https://github.com/Parallel-in-Time/pySDC/blob/b05bff9192658f1ce4784b4a44d74aea7f2e2138/pySDC/core/Lagrange.py#L218-L228

However, this is less efficient than the scipy implementation, since it requires computing the interpolation matrix for each new set of evaluation points, followed by a matrix-vector multiplication. For a few points (which is the case in many pySDC applications), the scipy implementation is probably faster.

  2. (Lazy) In test_Q_transfer.py, ignore the division-by-zero warning when computing the Pcoll and Rcoll matrices, using the statement

    with np.errstate(divide='ignore'):
        # ... compute Pcoll and Rcoll here
  3. (Less Lazy) In transfer_helper.py, wrap the following line (it appears four times) in a with np.errstate(divide='ignore') statement:

    with np.errstate(divide='ignore'):
        M[i, nn] = np.asarray(list(map(lambda x: x(p), bary_pol)))
  4. (Contributing to the Community) Open a PR on the scipy repo that wraps the problematic line in the BarycentricInterpolator class in a with np.errstate(divide='ignore') statement, hope that they accept it, and wait for the next scipy release.
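
As mentioned in option 1, a rough sketch of the DIY route could look as follows (this assumes the LagrangeApproximation interface from the pySDC source linked above; the nodes and values are made-up examples, and the actual adaptation in transfer_helper.py would differ in detail):

import numpy as np
from pySDC.core.Lagrange import LagrangeApproximation

coarse_nodes = np.array([0.0, 0.5, 1.0])            # example coarse nodes
fine_nodes = np.array([0.0, 0.25, 0.5, 0.75, 1.0])  # example fine nodes

# Build the interpolation matrix once; evaluation points coinciding with
# nodes are handled internally, so no RuntimeWarning is raised
approx = LagrangeApproximation(coarse_nodes)
P = approx.getInterpolationMatrix(fine_nodes)

u_coarse = np.array([1.0, 2.0, 1.0])  # values at the coarse nodes
u_fine = P @ u_coarse                 # interpolant evaluated at the fine nodes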

pancetta commented 2 years ago

Wow, great analysis, thank you! I guess 3. makes the most sense for now. I can do this in the next round of clean-up. 4. is also something to consider, though.

tlunet commented 2 years ago

Just to mention: solution 4 has been done (it wasn't too complicated after all). And I think the warnings should disappear with the upcoming 1.10 version of scipy ...