PermafrostDiscoveryGateway / viz-points

Python package for post-processing point-cloud data for 3D visualization
Apache License 2.0

Point cloud shows up significantly above or below terrain surface #14

Closed: iannesbitt closed this issue 1 year ago

iannesbitt commented 1 year ago

In certain areas, point clouds are not well aligned in Z with whatever terrain model is being used. The point cloud's ground surface ends up meters above or below the draped ground surface in Cesium, even though it is positioned correctly in X and Y. Points floating above the ground surface would not be as much of a problem if the ground surface were not such a vital part of the navigation mechanics of the CesiumJS viewer. Points below the ground surface are hidden from the user's view entirely.

This happens with all coordinate systems tried so far, whether reprojected in PDAL, reprojected in LAStools, or left untransformed and projected as-is in Cesium. The only solution so far is a manual Z transform in LAStools, which is infeasible at scale.

Guiding questions so far:

Below are examples.

Seen here is the Lake Placid test dataset (derived from an officially released USGS dataset) sitting a few tens of meters above the NED-derived ground surface: (image: Lake Placid test dataset)

Below, an example shown with terrain off and with terrain on, in which the road surface from a sample dataset in Nome, AK ends up beneath both the NED and ArcticDEM terrain models when terrain is enabled: (images: terrain off / terrain on)

Finally, a dataset from Huntsville, AL hovers 10 meters or more above the NED model: (image)

iannesbitt commented 1 year ago

A temporary solution is to adjust per region using a fixed adjustment factor. Here is the Nome, AK data with a +9 m Z translation: (image)
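
For reference, a minimal sketch of applying a fixed per-file Z translation with laspy (laspy is not necessarily part of the current pipeline, the file names are hypothetical, and the +9 m value is just the Nome example):

import laspy   # assumption: laspy 2.x is available
import numpy as np

Z_OFFSET = 9.0  # meters; per-region value determined empirically for Nome, AK

las = laspy.read("nome_tile.las")           # hypothetical input file
las.z = np.asarray(las.z) + Z_OFFSET        # shift every point up by the fixed offset
las.write("nome_tile_adjusted.las")         # hypothetical output file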

iannesbitt commented 1 year ago

A solution to draw points that exist below the terrain:

viewer.scene.globe.depthTestAgainstTerrain = false;

iannesbitt commented 1 year ago

Adjusting the Z values to account for the geoid height seems to remedy the issue, at least for the Lake Placid test dataset: (image)

iannesbitt commented 1 year ago

Geoid height API from NOAA: https://geodesy.noaa.gov/web_services/geoid.shtml

Usage example for CONUS: https://geodesy.noaa.gov/api/geoid/ght?lat=44.257&lon=-73.965
Usage example for Alaska: https://geodesy.noaa.gov/api/geoid/ght?lat=64.506&lon=-165.399&model=13
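
A minimal sketch of querying this service from Python with requests (the exact field names in the JSON response are an assumption; print the payload to confirm which key holds the geoid height N):

import requests

NGS_GEOID_URL = "https://geodesy.noaa.gov/api/geoid/ght"

# Lake Placid, NY example point
resp = requests.get(NGS_GEOID_URL, params={"lat": 44.257, "lon": -73.965}, timeout=30)
resp.raise_for_status()
print(resp.json())  # assumption: N is reported under a key such as "geoidHeight"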

iannesbitt commented 1 year ago

This does not work as well in Nome, where the geoid height is ~5.2 m but the adjustment that brings the datasets in line with the Cesium terrain is more like 7-8 m.

iannesbitt commented 1 year ago

Here is a visual representation of what I'm doing to correct the geoid height (in the LAS files) to ellipsoid height (used by Cesium's terrain model). Cesium uses h (height above the ellipsoid) to define its terrain, whereas the LAS datasets use H (height above the geoid). The difference between the ellipsoid and the geoid is N, so h = H + N. When I add H + N I get something that much more closely resembles the Cesium topography. (image)

I think it may be worthwhile to add a geoid height adjustment feature that uses the NOAA API to get N from one of their models and applies it as a Z translation to approximate ellipsoid height. I still need to work out whether there's a way to tell if a dataset's height values are referenced to the geoid or the ellipsoid, but the adjustment itself shouldn't be hard to code.
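
A rough sketch of what that adjustment feature could look like, combining the geoid lookup above with a fixed per-file Z translation (the function name, file handling, and response key are all assumptions, not existing viz-points code):

import laspy
import numpy as np
import requests

def geoid_to_ellipsoid(las_path, out_path, lat, lon, model=None):
    """Approximate ellipsoid height by applying h = H + N as a fixed per-file offset."""
    params = {"lat": lat, "lon": lon}
    if model is not None:
        params["model"] = model  # e.g. model=13 as in the Alaska example above
    resp = requests.get("https://geodesy.noaa.gov/api/geoid/ght", params=params, timeout=30)
    resp.raise_for_status()
    n = float(resp.json()["geoidHeight"])  # assumption: key holding N in the response
    las = laspy.read(las_path)
    las.z = np.asarray(las.z) + n          # apply N once per file rather than per point
    las.write(out_path)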

iannesbitt commented 1 year ago

This service may be of use if NAVD88 is the vertical datum we have to correct for: https://www.ngs.noaa.gov/web_services/ncat/lat-long-height-service.shtml

iannesbitt commented 1 year ago

Emmanuel has confirmed that the flight line elevation values were converted to geoid height (EGM2008) during an early stage of post-processing. I'm not sure LAStools has any built-in capability to convert from EGM height to ellipsoid height, so an adjustment built into the pipeline may have to come from third-party software (perhaps pygeodesy). Luckily, for most datasets I think the adjustment can be applied as a fixed value (i.e. N can be computed and applied per file rather than per point).

At the moment, the way QT Modeler attaches geoid information to LAS files is by using compound coordinate system well-known text (COMPD_CS WKT) strings. An example is given below. Currently viz-points is unable to parse COMPD_CS WKT strings correctly: it just grabs the last EPSG code in the string rather than the code from the AUTHORITY field of the PROJCS. I will open an issue to deal with this shortly.

COMPD_CS["WGS 84 / UTM zone 3N + EGM2008 height",PROJCS["WGS 84 / UTM zone 3N",GEOGCS["WGS 84",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,298.257223563,AUTHORITY["EPSG","7030"]],AUTHORITY["EPSG","6326"]],PRIMEM["Greenwich",0,AUTHORITY["EPSG","8901"]],UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","4326"]],PROJECTION["Transverse_Mercator"],PARAMETER["latitude_of_origin",0],PARAMETER["central_meridian",-165],PARAMETER["scale_factor",0.9996],PARAMETER["false_easting",500000],PARAMETER["false_northing",0],UNIT["metre",1,AUTHORITY["EPSG","9001"]],AXIS["Easting",EAST],AXIS["Northing",NORTH],AUTHORITY["EPSG","32603"]],VERT_CS["EGM2008 height",VERT_DATUM["EGM2008 geoid",2005,AUTHORITY["EPSG","1027"]],UNIT["metre",1,AUTHORITY["EPSG","9001"]],AXIS["Gravity-related height",UP],AUTHORITY["EPSG","3855"]]]

iannesbitt commented 1 year ago

The use of EGM2008 versus GEOID12B explains most of the extra ~2 m of difference I was seeing between the Cesium terrain and the points in the Nome datasets: I was using +5.2 m as the adjustment, but I needed to be using +6.99 m (a difference of about 1.8 m). So I think I'm comfortable saying this issue is "known" now.

iannesbitt commented 1 year ago

After spending a frustrating amount of time trying to find tools to parse COMPD_CS CRS objects (neither geodaisy nor PyCRS would recognize valid COMPD_CS WKT strings), I've found that (of course) pyproj handles them natively. This solves #24. I think the reason I couldn't find anything when searching for info initially is that they are fairly rarely used and a relatively new invention of the USGS.
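
For reference, a minimal sketch (not viz-points code) of pulling the horizontal and vertical components out of a COMPD_CS string with pyproj, using the Nome WKT from the previous comment:

from pyproj import CRS

# paste the full COMPD_CS string from the comment above here
compound_wkt = '...'
compound = CRS.from_wkt(compound_wkt)

# a compound CRS splits into its horizontal and vertical components
horizontal, vertical = compound.sub_crs_list
print(horizontal.to_epsg())  # expect 32603 (WGS 84 / UTM zone 3N)
print(vertical.to_epsg())    # expect 3855 (EGM2008 height)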

iannesbitt commented 1 year ago

See relevant pyproj docs

iannesbitt commented 1 year ago

Similar to this comment, which describes the NGS API for querying GEOIDXX models, here is a query to VDatum for EGM2008: https://vdatum.noaa.gov/vdatumweb/api/convert?s_x=-165.399282&s_y=64.506181&s_h_frame=WGS84_G1674&s_v_geoid=EGM2008&s_v_frame=EGM2008&t_v_frame=WGS84_G1674&region=ak&t_v_geoid=EGM2008&t_h_frame=WGS84_G1674

I used the following string format to construct this API call:

VDATUM_URL = 'https://vdatum.noaa.gov/vdatumweb/api/convert?s_x=%s&s_y=%s&s_h_frame=%s&s_v_frame=%s&s_v_geoid=%s&t_h_frame=%s&t_v_frame=%s&t_v_geoid=%s&region=%s'

I still need to work on converting EPSG codes derived from the WKT to geoid and reference frame names (e.g. EGM2008, WGS84_G1674).
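
A minimal sketch of building and issuing that request with the format string above (parameter values are taken from the Nome example URL; the structure of the JSON response is an assumption, so print it to find the converted height field):

import requests

VDATUM_URL = 'https://vdatum.noaa.gov/vdatumweb/api/convert?s_x=%s&s_y=%s&s_h_frame=%s&s_v_frame=%s&s_v_geoid=%s&t_h_frame=%s&t_v_frame=%s&t_v_geoid=%s&region=%s'

# Nome, AK example point: EGM2008 heights relative to the WGS84 (G1674) ellipsoid
url = VDATUM_URL % (-165.399282, 64.506181, 'WGS84_G1674', 'EGM2008', 'EGM2008',
                    'WGS84_G1674', 'WGS84_G1674', 'EGM2008', 'ak')
resp = requests.get(url, timeout=30)
resp.raise_for_status()
print(resp.json())  # inspect the payload for the offset between the two vertical frames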

iannesbitt commented 1 year ago

The EPSG code / reference frame name extractor and the API calls are set up. Now I need a way to extract mean X and Y coordinates from LAS files and convert them to lat/lon in order to submit them to the APIs.
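
A minimal sketch of one way to do that with laspy and pyproj (the file name is hypothetical, and the EPSG:32603 source CRS is just the Nome example; in practice it would come from the parsed WKT):

import laspy
import numpy as np
from pyproj import Transformer

las = laspy.read("nome_tile.las")      # hypothetical input file
mean_x = float(np.mean(las.x))         # mean easting of the point cloud
mean_y = float(np.mean(las.y))         # mean northing of the point cloud

# reproject the mean coordinate to lat/lon for the NGS and VDatum APIs
transformer = Transformer.from_crs(32603, 4326, always_xy=True)
lon, lat = transformer.transform(mean_x, mean_y)
print(lat, lon)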

iannesbitt commented 1 year ago

Finished a working version of pyegt, which allows for easy geoid height lookups via one of two NOAA APIs. This will be incorporated into viz-points so that points can be converted to ellipsoid-relative height when necessary.

iannesbitt commented 1 year ago

This issue is close to being resolved; the code is now fully written, but issue #23 needs to be resolved before testing can take place.

iannesbitt commented 1 year ago

Tested and working