domlysz / BlenderGIS

Blender addons to make the bridge between Blender and geographic data
GNU General Public License v3.0

Delaunay performance #64

Closed s-leger closed 7 years ago

s-leger commented 7 years ago

Hi all, in most cases the triangulator provided with BlenderGIS works fine, but when it comes to laser-scanned datasets with billions of points, performance may be an issue. For such cases, scipy's Delaunay triangulator may be a good alternative. It is able to triangulate in both 2D and 3D.

scipy is available as Python wheels for Linux, macOS and Windows.
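For reference, one possible way to get scipy into Blender's bundled Python is via pip from within Blender itself; this is only a sketch, and the interpreter path differs between Blender versions (recent builds expose it as sys.executable, older 2.7x builds as bpy.app.binary_path_python):

import sys
import subprocess

# Sketch only: assumes a Blender build where sys.executable points to the bundled
# Python interpreter (on 2.7x builds use bpy.app.binary_path_python instead).
python = sys.executable
subprocess.check_call([python, "-m", "ensurepip", "--upgrade"])
subprocess.check_call([python, "-m", "pip", "install", "scipy"])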

import bpy
import bmesh
from scipy.spatial import Delaunay
import numpy as np
import time

# For use with a mesh-based source:
# select your source object and run this in a Python console.

# Delaunay 2.5D
context = bpy.context  # C in the Python console
o = context.active_object
scene = context.scene
bpy.ops.object.mode_set(mode = 'EDIT') 
bm = bmesh.from_edit_mesh(o.data) 
points_3D = [list(v.co) for v in bm.verts]
points_2D = np.array([[v[0],v[1]] for v in points_3D])
t = time.time()
tri = Delaunay(points_2D)
print("Delaunay 2d :%s seconds" % (time.time()-t))
faces = tri.simplices.tolist()
bpy.ops.object.mode_set(mode = 'OBJECT')
mesh = bpy.data.meshes.new("MeshObject")
mesh.from_pydata(points_3D, [], faces)
mesh.update()
my = bpy.data.objects.new("MyMesh", mesh)
scene.objects.link(my)
print("Total :%s seconds  %s faces  %s verts" % (time.time()-t, len(faces), len(points_3D)))
#Total :42.67 seconds  4751900 faces  2386698 verts  Delaunay 2d 29.95 seconds
#Total :145.31 seconds  13285584 faces  6652790 verts Delaunay 2d :101.54 seconds

# Delaunay 3D
context = bpy.context  # C in the Python console
o = context.active_object
scene = context.scene
bpy.ops.object.mode_set(mode = 'EDIT') 
bm = bmesh.from_edit_mesh(o.data) 
points_3D = np.array([list(v.co) for v in bm.verts])
t = time.time()
tri = Delaunay(points_3D)
print("Delaunay 3d :%s seconds" % (time.time()-t))
faces = tri.simplices.tolist()
bpy.ops.object.mode_set(mode = 'OBJECT')
mesh = bpy.data.meshes.new("MeshObject")
mesh.from_pydata(points_3D, [], faces)
mesh.update()
my = bpy.data.objects.new("MyMesh", mesh)
scene.objects.link(my)
print("Total :%s seconds" % (time.time()-t))
# 14 million faces / 2.386 million verts, total 189 s, Delaunay 153 s

Edit: use tri.simplices.tolist() to prevent an error in from_pydata with numpy arrays.

mirlip commented 7 years ago

Hi s-leger, really helpful :) Do you have time comparisons between the scipy and BlenderGIS versions?

EDIT: I just tried your script on Linux with scipy installed through pip; it crashes right away with a segfault.

s-leger commented 7 years ago

Sad! Running under Windows 10 here. Meshing the huge dataset takes more than 1 hour under BlenderGIS. As an alternative, have a look at cloudcompare.org (Windows and macOS builds available, Ubuntu soon, hopefully). Its Delaunay 2.5D performance is better since it makes use of an octree optimisation, so the set gets triangulated within seconds.

mirlip commented 7 years ago

@s-leger On Windows 7, it gives the following error:

Delaunay 3d :35.89924240112305 seconds
Traceback (most recent call last):
  File "G:\test\test.blend\Text", line 22, in <module>
  File "G:\blender\2.78\scripts\modules\bpy_types.py", line 440, in from_pydata
    if faces and (not edges):
ValueError: The truth value of an array with more than one element is ambiguous.
 Use a.any() or a.all()
Error: Python script fail, look in the console for now...

s-leger commented 7 years ago

Hi, I get the same issue here under Windows 10, but at least here the error doesn't prevent successful model creation. It seems to be a Blender bug (as bpy_types.py belongs to the distribution).

Edit: it's a numpy-array-evaluated-as-boolean issue; I changed the code accordingly.
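Concretely, the failing check in from_pydata boils down to evaluating a multi-element numpy array in a boolean context, which is exactly the ValueError in the traceback above; converting the simplices to plain lists sidesteps it. A minimal sketch:

import numpy as np

edges = []
faces = np.array([[0, 1, 2], [1, 2, 3]])

# from_pydata effectively runs "if faces and (not edges)"; with a multi-element
# numpy array that boolean test raises the ValueError shown above.
try:
    if faces and (not edges):
        pass
except ValueError as e:
    print(e)  # "The truth value of an array with more than one element is ambiguous..."

faces = faces.tolist()  # plain nested lists evaluate cleanly in a boolean context
if faces and (not edges):
    print("ok with plain lists")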

domlysz commented 7 years ago

You're right, the pure Python triangulator is too slow to process billions of points and is not suitable for reconstructing complex 3D scans. I'm afraid it can't be improved much, but being pure Python it avoids a huge dependency like scipy. The scipy code uses the Qhull library, written in C, which needs compilation.

Maybe scipy as an optional dependency could be a good alternative for users able to install it.
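One way to keep scipy optional is a guarded import with a fallback to the bundled triangulator. A minimal sketch, where pure_python_delaunay is only a hypothetical stand-in for the BlenderGIS code:

try:
    from scipy.spatial import Delaunay
    HAS_SCIPY = True
except ImportError:
    HAS_SCIPY = False

def triangulate_2d(points_2d):
    """Return triangle indices for a list of [x, y] points."""
    if HAS_SCIPY:
        return Delaunay(points_2d).simplices.tolist()
    # Hypothetical fallback to the bundled pure Python triangulator.
    return pure_python_delaunay(points_2d)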

s-leger commented 7 years ago

At least we should provide solutions or alternative ways forward for users facing the same issues (edge cases). CloudCompare also uses Qhull under the hood and is able to do Poisson reconstruction too. A pretty good alternative under Windows and macOS.

mirlip commented 7 years ago

@s-leger As far as I know, CloudCompare doesn't have a Python API and won't get one from the author, as he said he doesn't know how to do it: http://www.danielgm.net/cc/forum/viewtopic.php?t=991. Could you expose some of CC's functions to a Python API?

s-leger commented 7 years ago

I was looking at CloudCompare as an alternative way to build surfaces from complex point clouds, but without any integration (and a new dependency too). As its Delaunay triangulation relies on the same code as scipy, if we go for an external dependency, I vote for scipy.

Maybe a few lines in the BlenderGIS docs are enough.

domlysz commented 7 years ago

Yes, it's not a big deal to just add scipy support when it's available.

OllyFunkster commented 7 years ago

@s-leger, I'm trying your 2.5D script in an effort to triangulate a grid of vertices that's 4631 x 3684, and it's been running with one core pegged for over an hour now (Phenom II X6 3.3GHz). Is there any chance that scipy is not able to use qhull for some reason and is doing it some slower way without telling me?

EDIT: Linux x64, Blender built from source, with some symbolic links put into the site-packages folder so it can find osgeo and scipy.

EDIT 2: Never mind, it seems it's choking on points_2D = np.array([[v[0],v[1]] for v in points_3D]) before it even gets to the triangulation. How long is that step supposed to take?

s-leger commented 7 years ago

@OllyFunkster This is supposed to be fast (at least not that slow). Tested here under Windows 10 x64, Blender 2.78a official release.

Points_3D: 8.08 seconds (verts: 6'651'605)
Points_2D: 4.17 seconds (verts: 6'651'605)
Delaunay 2d: 94.34 seconds
tri.simplices.tolist(): 16.08 seconds

Delaunay does work without np.array() as input, but the returned tri.simplices is a numpy.ndarray, so if there is any issue with numpy it may still fail.

Note: it could also be a memory issue. You could try a plain list instead of the numpy array:

points_2D = [[v[0],v[1]] for v in points_3D]

Edit: For the record, with 6'651'605 vertices it is ~18 times faster with scipy.
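On the extraction step itself, reading the coordinates in bulk with foreach_get avoids the per-vertex Python loop entirely. A minimal sketch, assuming the source object is a mesh read in object mode rather than through bmesh:

import bpy
import numpy as np

me = bpy.context.active_object.data
count = len(me.vertices)
coords = np.empty(count * 3, dtype=np.float32)  # float32 matches Blender's internal storage
me.vertices.foreach_get("co", coords)           # bulk copy, no per-vertex Python loop
points_3D = coords.reshape(count, 3)
points_2D = points_3D[:, :2]                    # drop Z for the 2.5D triangulation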

s-leger commented 7 years ago

@domlysz The official addon add_advanced_objects now implements your base Delaunay code. I added scipy support right on top of it, so we may close this issue.

domlysz commented 7 years ago

glad to see that

As a reference: https://developer.blender.org/diffusion/BA/browse/master/add_advanced_objects_panels/