We've been trying to import seismic data into Geoscience Analyst using Oasis Montaj databases and the CDI_to_Surfaces Python notebook. Although this isn't its intended purpose, it has been moderately successful.
However, there appears to be an upper limit on the array size that the triangulation algorithm can handle, somewhere between 500 and 1000 array entities. Arrays with only 500 points work and generate a seismic data surface in GA. The image below shows the 500-entity array as a section in GA. Note that the break is where I separated the databases to avoid gridding artefacts caused by some internal sorting (#45). Arrays with 1000 points, however, throw the error below.
Any suggestions on a better way to tackle this issue?
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
C:\ProgramData\Anaconda3\envs\geoapps\lib\site-packages\geoapps\processing\cdi_surface.py in convert_trigger(_)
70
71 def convert_trigger(_):
---> 72 self.convert_trigger()
73
74 self.trigger.on_click(convert_trigger)
C:\ProgramData\Anaconda3\envs\geoapps\lib\site-packages\geoapps\processing\cdi_surface.py in convert_trigger(self)
220 tri2D = Delaunay(np.c_[np.ravel(Y), np.ravel(L)])
221 else:
--> 222 tri2D = Delaunay(np.c_[np.ravel(X), np.ravel(L)])
223
224 # Remove triangles beyond surface edges
qhull.pyx in scipy.spatial.qhull.Delaunay.__init__()
qhull.pyx in scipy.spatial.qhull._Qhull.__init__()
ValueError: No points given
@Coastal0 Sorry for the late reply.
That's interesting; the error message doesn't seem to be related to memory issues.
Any chance you could share the file privately? dominiquef@mirageoscience.com
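For what it's worth, "No points given" is the error qhull raises when `Delaunay` receives an empty point array, so the failure may come from an upstream step yielding zero rows rather than from the array size itself. A minimal sketch of how the same ValueError appears (hypothetical empty inputs, not the actual data):

```python
import numpy as np
from scipy.spatial import Delaunay

# Empty coordinate arrays, standing in for an upstream selection
# that matched nothing (hypothetical scenario, not the real database).
X = np.empty(0)
L = np.empty(0)

# Same construction as line 222 of cdi_surface.py in the traceback.
pts = np.c_[np.ravel(X), np.ravel(L)]
print(pts.shape)  # (0, 2) -- a valid 2-D array, but with zero points

try:
    Delaunay(pts)
except ValueError as err:
    print(err)  # qhull reports that no points were given
```

If that's what's happening here, it might be worth printing `pts.shape` just before the `Delaunay` call to confirm whether the 1000-point case is somehow producing an empty array.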