connormanning / entwine

Entwine - point cloud organization for massive datasets
https://entwine.io

entwine put no data into the structure #260

Closed JeltoBuurman closed 1 year ago

JeltoBuurman commented 3 years ago

Hello, I have a fairly large dataset (about 100 MB). When I run Entwine from conda or from OSGeo4W, I get the following output, but no data in the structure:

entwine build -i f:\daten\Aichach\Laserpunkte -o f:\Daten\Aichach\ept
Scanning input
Resolving [file]: f:\daten\Aichach\Laserpunkte/* ...
        Resolved to 3 paths.
1/1: f:/daten/Aichach/Laserpunkte/iac.las
Exception in pool task: During f:/daten/Aichach/Laserpunkte/iac.las: Invalid scale f:/daten/Aichach/Laserpunkte/iac.las: [7.4e-323,0.0,0.0]
Determined SRS from an input file

Entwine Version: 2.1.0
EPT Version: 1.0.0
Input:
        File: f:/daten/Aichach/Laserpunkte/iac.las
        Total points: 52,564,351
        Density estimate (per square unit): 13.1411
        Threads: [1, 7]
Output:
        Path: f:\Daten\Aichach\ept/
        Data type: laszip
        Hierarchy type: json
        Sleep count: 2,097,152
        Scale: 0.01
        Offset: (626500, 5330000, 593)
Metadata:
        SRS: EPSG:25832
        Bounds: [(625999, 5327999, 568), (627001, 5332001, 618)]
        Cube: [(624498, 5327998, -1409), (628502, 5332002, 2595)]
        Storing dimensions: [
                X:int32, Y:int32, Z:int32, OriginId:uint32
        ]
Build parameters:
        Span: 128
        Resolution 2D: 128 * 128 = 16,384
        Resolution 3D: 128 * 128 * 128 = 2,097,152
        Maximum node size: 65,536
        Minimum node size: 16,384
        Cache size: 64

Adding 0 - f:/daten/Aichach/Laserpunkte/iac.las
        Pushes complete - joining...

Then, in Windows Sandbox, I followed the quickstart commands from the entwine.io website and got:


(entwine) C:\Users\WDAGUtilityAccount>entwine build -i https://data.entwine.io/red-rocks.laz -o ~/entwine/red-rocks
Scanning input
1/1: https://data.entwine.io/red-rocks.laz
Exception in pool task: During https://data.entwine.io/red-rocks.laz: Invalid scale https://data.entwine.io/red-rocks.laz: [7.4e-323,9.5190790656e-312,0.0]
Determined SRS from an input file
Maximal extents: [(-2147483648, -2147483648, -2147483648), (2147483647, 2147483647, 2147483647)]
Scaled bounds:   [(-nan(ind), -2147483648, -2147483648), (-nan(ind), -2147483648, -2147483648)]
Encountered an error: Bounds are too large for the selected scale
Exiting.

Does anyone have any helpful advice?
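A side note for readers diagnosing a similar "Invalid scale" failure: 7.4e-323 is a subnormal double, which usually means the scale bytes were read from the wrong offset or left uninitialized, rather than anything being wrong with the input file. A minimal sketch for inspecting a file's scale factors yourself (assuming the LAS 1.2 public header layout; `las_scale_factors` is a hypothetical helper, not part of Entwine):

```python
import struct

def las_scale_factors(header: bytes):
    """Return the (x, y, z) scale factors from a LAS public header block.

    In the LAS 1.2 specification, the X scale factor is a little-endian
    double at byte offset 131, immediately followed by Y and Z.
    """
    return struct.unpack_from("<3d", header, 131)

# A subnormal like 7.4e-323 is what a tiny integer bit pattern (here, 15)
# looks like when reinterpreted as a double -- a strong hint that garbage
# bytes ended up in the scale fields:
print(struct.unpack("<d", (15).to_bytes(8, "little"))[0])  # 7.4e-323
```

If the scale factors printed from your file's header are sane (e.g. 0.01), the bug is in the reader, not the data.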

wolfsnipes commented 3 years ago

I started having this issue as well, installed from conda; it worked about two months ago. The errors are gone after reverting to the 2.0.0 conda package.

JeltoBuurman commented 3 years ago

How can I get this package?

wolfsnipes commented 3 years ago

@JeltoBuurman you have to install that specific version of Entwine. I think "conda install entwine=2.0.0" will do it. You may need to separate the environment creation and build steps that are combined into the one-liner in the quickstart; I don't use conda for anything else, so I don't remember 100%.
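For reference, a sketch of those steps separated out (assuming the package is pulled from conda-forge; adjust the environment name to taste):

```shell
# Create a fresh environment with Entwine pinned to 2.0.0, then
# activate it and rerun the quickstart build from this thread.
conda create -n entwine -c conda-forge entwine=2.0.0
conda activate entwine
entwine build -i https://data.entwine.io/red-rocks.laz -o ~/entwine/red-rocks
```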

JeltoBuurman commented 3 years ago

Thank you very much; now it works for me too.