connormanning / entwine

Entwine - point cloud organization for massive datasets
https://entwine.io

specify CRS to convert a LAS to EPT #269

Closed: julienlau closed this issue 2 years ago

julienlau commented 2 years ago

Thanks for sharing your work.

I managed to generate an EPT from a LAS file without specifying the CRS (no '-r' option).

However, using the command: docker run -it -v "$PWD":/entwine connormanning/entwine:2.2.0 build -i /entwine/test.las -o /entwine/test-ept -r EPSG:4326+3857

I get the following error:

1/1: /entwine/test.las
terminate called after throwing an instance of 'nlohmann::detail::out_of_range'
  what():  [json.exception.out_of_range.403] key 'bounds' not found

I get the same error if I specify another CRS, e.g. -r EPSG:4326.

If I try with entwine 2.1, I get a different error:

1/1: /entwine/test.las
Exception in pool task: During /entwine/test.las: filters.reprojection: source data has no spatial reference and none is specified with the 'in_srs' option.
Encountered an error: No points found!
connormanning commented 2 years ago

Note that -r not only specifies the SRS, it also requests a reprojection to the target SRS, and to reproject data from one SRS to another the source SRS must be known. This is in contrast to --srs, which simply sets an SRS in the output without reprojecting the points. In your case it looks like the data does not have an SRS embedded in the file, so you cannot reproject it without specifying the source SRS explicitly, e.g. -r EPSG:XXXX EPSG:3857.
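
One way to check whether the file actually carries an embedded SRS is with pdal info (Entwine is built on PDAL; the pdal/pdal Docker image is assumed here):

docker run -it -v "$PWD":/data pdal/pdal pdal info --metadata /data/test.las

If the srs entry in the resulting metadata is empty, the file has no embedded spatial reference.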

So if I understand your issue correctly, you must do one of the following depending on the behavior you are looking for: either supply the source SRS together with the target SRS to -r so the points can actually be reprojected, or use --srs instead of -r to simply set the output SRS without reprojecting the points.
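
Concretely, adapting the command from the original report (with EPSG:XXXX as a placeholder for the source data's actual CRS), the two approaches would look roughly like:

docker run -it -v "$PWD":/entwine connormanning/entwine:2.2.0 build -i /entwine/test.las -o /entwine/test-ept -r EPSG:XXXX EPSG:3857

docker run -it -v "$PWD":/entwine connormanning/entwine:2.2.0 build -i /entwine/test.las -o /entwine/test-ept --srs EPSG:XXXX

The first command reprojects the points into EPSG:3857; the second only labels the output without transforming the coordinates. Note that -r expects the source and target SRS as two space-separated arguments (see the -r EPSG:26915 EPSG:3857 example in the help text below); EPSG:4326+3857 appears to be read as a single SRS string rather than as a source/target pair.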

julienlau commented 2 years ago

Thanks for the help. I did find this in the command-line help, but it was not so clear to me.


docker run -it -v "$PWD":/entwine connormanning/entwine:2.2.0 build -h

Usage: entwine build (<options>)

    --input, -i
        File paths or directory entries.  For a recursive directory search, the
        notation is "directory**".  May also be the path to an `entwine scan`
        output file.

        Example: -i path.laz, -i pointclouds/, -i autzen/ept-scan.json

    --output, -o
        Output directory.

        Example: --output ~/entwine/autzen

    --config, -c
        A configuration file.  Subsequent options will override configuration
        file parameters, so it may be used for templating common options among
        multiple runs.

        Example: --config template.json -i in.laz -o out

    --tmp, -a
        Directory for entwine-generated temporary files

        Example: --tmp /tmp/entwine

    --srs
        Set the `srs` metadata entry of the output.  If reprojecting, this value
        will be set automatically from the output projection.  Typically this
        value is automatically inferred from the files themselves.

    --reprojection, -r
        Set the SRS reprojection.  The input SRS may be omitted to use values
        from the file headers.  By default, SRS values found in file headers will
        override the input SRS.  To always use the input SRS regardless of file
        headers, see the --hammer option

        Example: --reprojection EPSG:3857, -r EPSG:26915 EPSG:3857

    --hammer, -h
        If set, the user-supplied input SRS (see --reprojection) will always
        override any SRS found in file headers.  An input SRS is required if this
        option is set.

        Example: --reprojection EPSG:26915 EPSG:3857 --hammer

    --threads, -t
        The number of threads.

        Example: --threads 12

    --force, -f
        Force build overwrite - do not continue a previous build that may exist
        at this output location.

    --dataType
        Data type for serialized point cloud data.  Valid values are "laszip",
        "zstandard", or "binary".  Default: "laszip".

        Example: --dataType binary

    --span
        Number of voxels in each spatial dimension for data nodes.  For example,
        a span of 256 will result in a cube of 256*256*256 resolution.  Default:
        256.

        Example: --span 128

    --noOriginId
        If present, an OriginId dimension tracking points to their original
        source files will *not* be inserted.

    --bounds, -b
        XYZ bounds specification beyond which points will be discarded.  Format
        is [xmin, ymin, zmin, xmax, ymax, zmax].

        Example: --bounds 0 0 0 100 100 100, -b "[0,0,0,100,100,100]"

    --deep
        Read all points during file analysis rather than just the headers.

    --absolute
        If set, absolutely positioned XYZ coordinates will be used instead of
        scaled values

    --scale
        The scale factor for spatial coordinates.

        Example: --scale 0.1, --scale "[0.1, 0.1, 0.025]"

    --limit
        Maximum number of files to insert - the build may be continued with
        another `build` invocation.

        Example: --limit 20

    --subset, -s
        A partial task specification for this build.

        Example: --subset 1 4

    --maxNodeSize
        Maximum number of points in a node before an overflow is attempted.

    --minNodeSize
        Minimum number of overflowed points to be retained in a node before
        overflowing into a new node.

    --cacheSize
        Number of nodes to cache in memory before serializing to the output.

    --hierarchyStep
        Hierarchy step size - recommended to be set for testing only as entwine
        will determine it heuristically.

    --sleepCount
        Count (per-thread) after which idle nodes are serialized.

    --progress
        Interval in seconds at which to log build stats.  0 for no logging
        (default: 10).

    --profile, -p
        Specify AWS user profile, if not default

        Example: --profile john

    --sse
        Enable AWS server-side encryption

    --requester-pays
        Set the requester-pays flag to S3

    --allow-instance-profile
        Allow EC2 instance profile use for S3 backends

    --verbose, -v
        Enable developer-level verbosity
julienlau commented 2 years ago

Anyway, the error message in version 2.2 could be made clearer!