gwlucastrig / Tinfour

Delaunay and Constrained Delaunay Triangulations in Java, providing high-performance utilities for modeling surfaces with support for Lidar LAS files, Digital Elevation Models (DEM), finite element analysis, path planning, natural neighbor interpolation, and other applications of Triangulated Irregular Networks (TIN)
Apache License 2.0

Perf: minor change SemiVirtualIncrementalTin::remove #37

Closed: jandam closed this issue 6 years ago

jandam commented 6 years ago

I suggest skipping Ear creation when nEar = 3:

*) compute nRemoved, the number of edges removed in step 1 (cavitation)
*) move the check for nEar = 3 to before step 2
*) if nRemoved = 3, skip step 2 (ear creation)

Suggested improvement:

        // step 1: Cavitation
        //         remove vertex and create a polygonal cavity
        //         eliminating all connecting edges
        n1 = n0.getForward();

        int nRemoved = 0;
        do {
            n2 = n1.getForward();
            n3 = n2.getForwardFromDual();
            //n2 is to be deleted.  set the forward edge
            //of n2 to point to n3.
            n1.setForward(n3);
            n1 = n3;
            //if (n2.getB() == null) {
            //   vertexIsOnPerimeter = true;
            //}
            nRemoved++;
            edgePool.deallocateEdge(n2);
        } while (!n2.equals(n0.getDual()));

        n0 = n1;
        n1 = n0.getForward();
        if (nRemoved == 3) {
            // the removal of the vertex resulted in a single triangle
            // which is already Delaunay.  The cavitation process should
            // have reset the links.  So the removal operation is done.
            setSearchEdgeAfterRemoval(n1);
            return true;
        }

        // Step 2 -- Ear Creation
        //           Create a set of Devillers Ears around
        //           the polygonal cavity.
        int nEar = 0;
        n1 = n0.getForward();
        SemiVirtualEdge pStart = n0;
        SemiVirtualDevillersEar firstEar = new SemiVirtualDevillersEar(nEar, null, n1, n0);
        SemiVirtualDevillersEar priorEar = firstEar;
        SemiVirtualDevillersEar nextEar;
        firstEar.computeScore(geoOp, vRemove);

        nEar = 1;
        do {
            n0 = n1;
            n1 = n1.getForward();
            SemiVirtualDevillersEar ear = new SemiVirtualDevillersEar(nEar, priorEar, n1, n0); //NOPMD
            ear.computeScore(geoOp, vRemove);
            priorEar = ear;
            nEar++;
        } while (!n1.equals(pStart));
        priorEar.next = firstEar;
        firstEar.prior = priorEar;

jandam commented 6 years ago

The same change is applicable to IncrementalTin too.

gwlucastrig commented 6 years ago

Interesting idea. I will investigate.

I added some diagnostic code to the IncrementalTin class to count how often the nEar==3 case occurred at the end of step 1. I then ran the "ExampleCrossValidation" test application from the example code provided with the Tinfour distribution. Cross validation systematically removes and then replaces each vertex in the triangulated mesh, so it was a quick way to test the proposed concept. I processed a subsection of the "Bear Mountain" lidar data sample described in the Tinfour documentation. For 9453 removals, the test counted 29 nEar==3 cases.
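
For reference, here is a minimal sketch of that kind of remove-and-restore loop. It assumes the current org.tinfour.* package layout (older releases used tinfour.*) and a synthetic grid in place of lidar data; it is not the actual ExampleCrossValidation source.

    import java.util.ArrayList;
    import java.util.List;
    import org.tinfour.common.Vertex;
    import org.tinfour.standard.IncrementalTin;

    public class RemoveRestoreSketch {
        public static void main(String[] args) {
            // build a small synthetic grid; the actual test used lidar samples
            List<Vertex> vertices = new ArrayList<>();
            for (int i = 0; i < 100; i++) {
                for (int j = 0; j < 100; j++) {
                    vertices.add(new Vertex(i, j, Math.sin(0.1 * i) * Math.cos(0.1 * j)));
                }
            }
            IncrementalTin tin = new IncrementalTin();
            tin.add(vertices, null); // bulk insertion, no progress monitor

            // systematically remove and re-insert each vertex; every remove()
            // call exercises the cavitation and ear-creation logic shown above
            int nRemovals = 0;
            for (Vertex v : vertices) {
                if (tin.remove(v)) {
                    nRemovals++;
                    tin.add(v);
                }
            }
            System.out.println("performed " + nRemovals + " removals");
        }
    }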

Let me clarify the reasoning behind Martin's proposal. The nEar==3 case occurs when the vertex to be removed connects to exactly 3 edges. When the vertex is removed, the remaining cavity is a triangle that happens to be Delaunay compliant (I think I have a proof for that). So if we counted the number of edges removed in Step 1 (as suggested) and it turned out to be 3, we could exit early. Counting the edges adds very little overhead to the process. So even though the nEar==3 case occurs infrequently, the savings from the proposed change should outweigh the extra processing cost.
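
A tiny concrete illustration of that case, again as a sketch against the assumed org.tinfour.* API: three perimeter vertices plus one interior vertex of degree 3. Removing the interior vertex deletes its three connecting edges and leaves the single perimeter triangle.

    import org.tinfour.common.Vertex;
    import org.tinfour.standard.IncrementalTin;

    public class DegreeThreeRemoval {
        public static void main(String[] args) {
            IncrementalTin tin = new IncrementalTin();
            tin.add(new Vertex(0, 0, 0));
            tin.add(new Vertex(10, 0, 0));
            tin.add(new Vertex(5, 10, 0));
            // an interior vertex connects to exactly three edges, splitting
            // the enclosing triangle into three smaller triangles
            Vertex interior = new Vertex(5, 3, 0);
            tin.add(interior);
            // removing it deletes those three edges (nRemoved == 3), and the
            // cavity left behind is the original perimeter triangle
            boolean removed = tin.remove(interior);
            System.out.println("removed: " + removed);
        }
    }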

I will do some more tests to make sure that my numbers are correct. I also need to verify my assumption that the resulting triangular cavity would always be Delaunay.

jandam commented 6 years ago

I'm sorry, I have no statistics on my data. Can you wait till Friday? I need to eliminate every piece of code and allocation that is not necessary. My data set is about 35k tiles, each with ~100k vertices, and about 60% of the vertices are removed. This is extreme. I need to write better algorithms :-) This seems to me like a simple change.
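
Taking those figures at face value, a back-of-the-envelope estimate (not a number from the thread): 35,000 tiles × ~100,000 vertices is roughly 3.5 billion vertices, and removing 60% of them implies on the order of 2.1 billion remove operations, so even a tiny per-removal saving adds up.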

gwlucastrig commented 6 years ago

No worries. This is a simple change, so I'd say go ahead and make it yourself in your own copy of the Tinfour code. I'll migrate it into the main Tinfour code base sometime over the next week or so. I will also collect performance stats, so you don't have to do so if it is not convenient at this time.

I suspect you will see a slightly stronger speed improvement in the semi-virtual implementation than in the standard one.

Děkuji (thank you)

Gary

gwlucastrig commented 6 years ago

I ran performance tests with the new RepeatedDeleteTest application. The improvement was so small that I could not measure it accurately given the noisy test environment on my Windows 7 computer (run times were too variable to demonstrate a statistically meaningful improvement). But the suggestion makes the code cleaner and more robust, so I decided to use it.
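
For anyone who wants to repeat that kind of measurement, here is a minimal timing harness in the same spirit. It is a sketch against the assumed org.tinfour.* API, not the actual RepeatedDeleteTest source; it runs several passes because single wall-clock timings on a desktop OS are noisy.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Random;
    import org.tinfour.common.Vertex;
    import org.tinfour.semivirtual.SemiVirtualIncrementalTin;

    public class RemovalTimingSketch {
        public static void main(String[] args) {
            Random random = new Random(0);
            List<Vertex> vertices = new ArrayList<>();
            for (int i = 0; i < 100_000; i++) {
                vertices.add(new Vertex(random.nextDouble(), random.nextDouble(), 0));
            }
            SemiVirtualIncrementalTin tin = new SemiVirtualIncrementalTin();
            tin.add(vertices, null); // bulk insertion, no progress monitor

            // several passes of remove-and-restore, reported separately so the
            // run-to-run variability is visible
            for (int pass = 0; pass < 5; pass++) {
                long t0 = System.nanoTime();
                for (Vertex v : vertices) {
                    if (tin.remove(v)) {
                        tin.add(v);
                    }
                }
                long t1 = System.nanoTime();
                System.out.printf("pass %d: %.1f ms%n", pass, (t1 - t0) / 1.0e6);
            }
        }
    }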

Thanks again for your suggestions and contributions to the Tinfour project.