orbingol / NURBS-Python

Object-oriented pure Python B-Spline and NURBS library
https://onurraufbingol.com/NURBS-Python/
MIT License

Singularity in fitting.interpolate_surface() #132

Open sphh opened 3 years ago

sphh commented 3 years ago

I am trying to fit a surface through a field of points where one row of points collapses to a singularity, similar to this: https://developer.rhino3d.com/guides/general/essential-mathematics/parametric-curves-surfaces/#singularity-in-nurbs-surfaces

At the moment I get a ZeroDivisionError:

...
  File "/usr/local/lib/python3.8/dist-packages/geomdl/fitting.py", line 38, in interpolate_curve
    uk = compute_params_curve(points, use_centripetal)
  File "/usr/local/lib/python3.8/dist-packages/geomdl/fitting.py", line 452, in compute_params_curve
    uk[i] = sum(cds[0:i + 1]) / d
ZeroDivisionError: float division by zero
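
For reference, the failing code path can be reproduced in isolation by fitting a curve through coincident points; the collapsed row of the surface grid ends up in exactly this situation (a minimal, artificial sketch, not the actual data):

    from geomdl import fitting

    # All data points coincide, so every chord length is zero and the total
    # chord length d in compute_params_curve becomes zero as well
    points = [[0.0, 0.0], [0.0, 0.0], [0.0, 0.0], [0.0, 0.0]]
    curve = fitting.interpolate_curve(points, 3)  # raises ZeroDivisionError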

Is there a workaround to get such a surface? Is it possible at all?

orbingol commented 3 years ago

compute_params_curve needs to compute the chord length, and when there is a singularity the chord length becomes zero. I think there are two things we could do:

  1. It might not work perfectly, but there could be a chance to fix the issue by catching ZeroDivisionError and filling the uk list separately.
  2. Put a very small distance between the data points that are causing the singularity, so the division is by a very small number rather than zero (see the sketch below).
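
A minimal sketch of option 2, purely for illustration (the helper name and the choice of nudging along the first coordinate are arbitrary; any tiny perturbation that separates the duplicates will do):

    def nudge_coincident_points(points, eps=1e-6):
        """Return a copy of points where consecutive duplicates are pushed
        apart by eps, so no chord length is exactly zero."""
        nudged = [list(points[0])]
        for pt in points[1:]:
            pt = list(pt)
            if pt == nudged[-1]:
                # Shift the duplicate slightly along the first coordinate
                pt[0] += eps
            nudged.append(pt)
        return nudged
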
sphh commented 3 years ago

Currently I use option 2, but I would like to have a proper singularity.

If 1. is possible, that would be great. What would uk look like as d → 0? Is this limit defined?

orbingol commented 3 years ago

I'll need to test Option 1; I am not sure if it will even work. The concept of chord length does not really accommodate a singularity. However, it looks possible when we look at it from the programming perspective.

portnov commented 3 years ago

Usually in NURBS algorithms, if you encounter 0/0, you just say it is 0, and the algorithm works fine. Does this trick help here?

orbingol commented 3 years ago

> Usually in NURBS algorithms, if you encounter 0/0, you just say it is 0, and the algorithm works fine. Does this trick help here?

I was thinking the same thing, but I'd prefer to test it before coming to a conclusion. Sometimes it ends up being a completely different bug.

sphh commented 3 years ago

I replaced line 452

        uk[i] = sum(cds[0:i + 1]) / d

with

        try:
            uk[i] = sum(cds[0:i + 1]) / d
        except ZeroDivisionError:
            if sum(cds[0:i + 1]) == 0:
                uk[i] = 0
            else:
                raise

Maybe this is faster (but it needs Python ≥ 3.8):

        if (s := sum(cds[0:i + 1])) == 0 and d == 0:
            uk[i] = 0
        else:
            uk[i] = s / d

Both my test case and my real cases work perfectly with either change.

orbingol commented 3 years ago

ZeroDivisionError occurs when d == 0. If sum(cds[0:i + 1]) == 0 and d != 0, it will evaluate to 0 anyway. I think it would be enough to update line 452 like this:

try:
    uk[i] = sum(cds[0:i + 1]) / d
except ZeroDivisionError:
    uk[i] = 0

Note: I haven't tested it.

Eliminating the if-else removes some of the overhead, so it should be just a bit faster.

sphh commented 3 years ago

I have not looked into the code more closely, but I wanted to implement it strictly according to @portnov! If sum(cds[0:i + 1]) is always 0 whenever d == 0, my approach is certainly overkill.

I had a different experience when I timed try ... except ... and if ... else ... ages ago; I remembered that try ... took more time. I just had another look and noticed that it depends on how often the exception is triggered (https://stackoverflow.com/questions/1835756/using-try-vs-if-in-python, https://www.geeksforgeeks.org/try-except-vs-if-in-python/). Based on these findings (and the fact that singularities do not appear often), your approach looks like the faster one. Still, is it assured that the sum() is always 0 when d is 0?
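
For reference, a quick and purely illustrative timeit comparison of the two patterns in the common case where no exception is raised (the values of s and d are arbitrary):

    import timeit

    setup = "s, d = 0.5, 1.0"

    # EAFP: the try block is nearly free when no exception is raised
    stmt_try = """
    try:
        uk = s / d
    except ZeroDivisionError:
        uk = 0
    """

    # LBYL: the comparison is paid on every iteration
    stmt_if = """
    if d == 0:
        uk = 0
    else:
        uk = s / d
    """

    print("try/except:", timeit.timeit(stmt_try, setup=setup, number=1_000_000))
    print("if/else:   ", timeit.timeit(stmt_if, setup=setup, number=1_000_000))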

sphh commented 3 years ago

Oh well, I managed to construct a test case which fails:

import numpy as np

from geomdl import fitting
from geomdl.visualization import VisMPL

def ellipse(x, a, b):
    """Elliptic function."""
    return b * np.sqrt(1 - (x/a)**2)

def profile(x, a, b, t_te):
    """Profile with open trailing edge."""
    return ellipse(x - (a-t_te), a+t_te, b)

X = np.linspace(0, 1, 5, endpoint=True)
Y = np.linspace(0, 1, 5, endpoint=True)

sections = []
for xi in X:
    chord = ellipse(xi, 1, 0.5)
    if chord == 0:
        chord = 0.000001
    thickness = ellipse(xi, 1, 0.1)
    # Upper face
    section = [
        [xi, (yi-0.5)*chord, profile(yi, 0.5, thickness, 0.01)]
        for yi in Y]
    # Lower face
    section.extend([
        [xi, (yi-0.5)*chord, -profile(yi, 0.5, thickness, 0.01)]
        for yi in reversed(Y[:-1])])
    sections.append(section)

surface = fitting.interpolate_surface(
    [list(point)
     for section in sections
     for point in section],
    len(sections), len(sections[0]),
    3, 3)

vis = VisMPL.VisSurface()
surface.vis = vis
surface.render()

With this code I get the following wing, which is how I expect it to look: [Screenshot from 2021-06-16 16-54-15]

If I comment out these lines (which removed the singularity at the tip), so that there is a singularity at the tip x=0:

#     if chord == 0:
#         chord = 0.000001

I get the following wing: [Screenshot from 2021-06-16 16-54-45]

portnov commented 3 years ago

@sphh as far as I understand, you want your profile to be elliptical. For ellipses and other conic sections there are known algorithms that define the control points / weights / knot vectors of NURBS curves directly, without interpolation. You can then apply a lofting algorithm to generate the surface. Such a surface can be much "better" in several respects than one generated by interpolation. Just an example: [Screenshot_20210616_211500]

sphh commented 3 years ago

Well, it's true that I used elliptical sections, section widths and thicknesses here. But that was only to construct a simple test case; in real life, different sections and distributions will be used. Still, thanks for pointing it out, it's highly appreciated. (@orbingol: Do you want to implement them in the geomdl.shapes package?)

BTW, which nice program are you using here?

portnov commented 3 years ago

Blender (blender.org) + Sverchok (https://github.com/nortikin/sverchok) :)

portnov commented 3 years ago

As far as I can see, the geomdl.shapes.curve2d module already has methods for generating circles. To generate an ellipse, it is enough to generate a circle and then scale its control points in one direction.
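
A rough sketch of that idea, assuming the geomdl.shapes add-on package is installed and provides curve2d.full_circle as described in its documentation (the semi-axes a and b are arbitrary):

    from geomdl.shapes import curve2d

    # Generate a full NURBS circle of radius 1 centred at the origin
    circle = curve2d.full_circle(radius=1.0)

    # Scale the weighted control points to turn the circle into an ellipse with
    # semi-axes a (x direction) and b (y direction). Scaling the weighted
    # coordinates (w*x, w*y) while leaving the weight w untouched scales the
    # curve itself, so the rational weights stay valid.
    a, b = 2.0, 0.5
    circle.ctrlptsw = [[a * wx, b * wy, w] for wx, wy, w in circle.ctrlptsw]

    # circle now evaluates points on the ellipse x**2/a**2 + y**2/b**2 = 1
    circle.delta = 0.01
    ellipse_points = circle.evalpts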

sphh commented 3 years ago

@portnov: As mentioned earlier, this is just a test case; the actual case is much more complicated. So let us not get side-tracked by figuring out how the test case could be created in a different/better/faster/easier way, ok?

sphh commented 3 years ago

Back to the original problem …

I managed to create a singularity, but it needed another change in the fitting.py code.

The first change is mentioned in https://github.com/orbingol/NURBS-Python/issues/132#issuecomment-862412256; it catches the case d = sum(cds[1:-1]) == 0. But s = sum(cds[0:i + 1]) can also become 0, and in that case setting uk[i] = 0 does not work. I now have the following code (fitting.py, line 452):

    for i in range(num_points):
        s = sum(cds[0:i + 1])
        if s == 0:
            uk[i] = i / (num_points - 1)
        else:
            try:
                uk[i] = s / d
            except ZeroDivisionError:
                uk[i] = 0

This produces a singularity and gets rid of the problem shown in the second picture in https://github.com/orbingol/NURBS-Python/issues/132#issuecomment-862454480.

If I were quizzed about the theoretical background of that change, I would have to admit that I cannot explain it! But it seems to work for me and my use case (there might be edge cases which fail …).

orbingol commented 3 years ago

I'd like to thank @portnov for the help. Actually, I like Sverchok and Blender, and I have been recommending Sverchok to many of my friends since I discovered it. Of course, users may not always be able to share their actual data here on a public GitHub repository, and it is also very hard to handle all the corner cases. Here, @sphh provided one and I appreciate it.

I'd suggest a Pythonic update here. I haven't tried it, but theoretically it should work fine. Let's say we create a function based on @sphh's suggestions (we are not quizzing you at all :)):

import math

from geomdl import linalg


def my_compute_params(points, centripetal=False):
    if not isinstance(points, (list, tuple)):
        raise TypeError("Data points must be a list or a tuple")

    # Length of the points array
    num_points = len(points)

    # Calculate chord lengths
    cds = [0.0 for _ in range(num_points + 1)]
    cds[-1] = 1.0
    for i in range(1, num_points):
        distance = linalg.point_distance(points[i], points[i - 1])
        cds[i] = math.sqrt(distance) if centripetal else distance

    # Find the total chord length
    d = sum(cds[1:-1])

    # Divide individual chord lengths by the total chord length
    uk = [0.0 for _ in range(num_points)]
    for i in range(num_points):
        s = sum(cds[0:i + 1])
        if s == 0:
            # All chords up to this point are zero; fall back to a uniform parameter
            uk[i] = i / (num_points - 1)
        else:
            try:
                uk[i] = s / d
            except ZeroDivisionError:
                uk[i] = 0

    return uk

I think it should be possible to override compute_params_curve like this:

from geomdl import fitting
from mymodule import my_compute_params

fitting.compute_params_curve = my_compute_params

This looks like a cleaner way to me. What do you think?

sphh commented 3 years ago

Well, monkey-patching is always an option until the upstream source is changed :wink: