Closed jfbourgon closed 5 years ago
Hi @jfbourgon! A good question! We ran into the same problem working with point clouds; in practice we don't actually use vertical units, so you can simply omit the unsupported parameters:
+proj=utm +zone=18 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs
Let me know if that worked for you!
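If you need to sanitize such strings programmatically rather than by hand, a minimal sketch is below (plain Scala; the set of offending keywords is an assumption based on the parameters mentioned in this thread, not an exhaustive list of what proj4j rejects):

```scala
// Hypothetical helper: drop proj4 parameters that proj4j does not support
// (e.g. +vunits, +geoidgrids) before handing the string to a CRS factory.
val unsupported = Set("vunits", "geoidgrids")

def stripUnsupported(proj4: String): String =
  proj4
    .split("\\s+")
    .filterNot { param =>
      // A parameter looks like "+key" or "+key=value"; extract the key.
      val key = param.stripPrefix("+").takeWhile(_ != '=')
      unsupported.contains(key)
    }
    .mkString(" ")

stripUnsupported(
  "+proj=utm +zone=18 +ellps=GRS80 +units=m +vunits=m +no_defs")
// "+proj=utm +zone=18 +ellps=GRS80 +units=m +no_defs"
```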
I'm actually getting this issue when I try to access the extent property of a geotrellis.pointcloud.spark.io.hadoop.HadoopPointCloudHeader instance.
In this context, I don't have much control over the actual value of the proj4 string itself. However, I noticed that I can influence it by altering my current GDAL/proj4 configuration files.
For instance, I got rid of the +geoidgrids parameter by commenting out the following entry in the vertcs.override.csv file referenced by my GDAL_DATA environment variable:
#6647,CGVD2013 height,1127,Canadian Geodetic Vertical Datum of 2013,9001,1,0,6499,9665,CGG2013n83a.gtx
@jfbourgon very glad to hear that you are using point clouds with GeoTrellis! PDAL allows you to specify the input CRS for the input data: https://github.com/geotrellis/geotrellis-pointcloud-demo/blob/master/src/app-backend/ingest/src/main/scala/com/azavea/pointcloud/ingest/Ingest.scala#L31-L52
You can define a pipeline for HadoopPointCloudRDD:
val pipeline = Read("", Option("inputCrs"))
Thanks! I was able to properly read the extent from the header using the suggested workaround.
Here are the steps I used:
val pipeline = Read("", Option("+proj=utm +zone=18 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs"))
val rdd = HadoopPointCloudRDD("/gpfs/dev/VILLE_MONTREAL/VILLE_MONTREAL/18_E_5_52/POINT_CLOUD/292-5048_2015_2-5-6.las", HadoopPointCloudRDD.Options.DEFAULT.copy(pipeline = pipeline))(sc)
val header = rdd.first._1
header.extent
res0: geotrellis.vector.Extent = Extent(604226.62, 5047340.41, 605244.4, 5048358.67)
header.extent3D
res1: geotrellis.pointcloud.spark.Extent3D = Extent3D(604226.62,5047340.41,10.39565,605244.4,5048358.67,52.11369)
However, it would be nice to get complete support for vertical keywords in proj4j some day.
This issue should be moved to Proj4j once it gets up and running with LocationTech
It's a proj4j issue now: https://github.com/locationtech/proj4j/issues/20
Hi, I got the same problem trying to use:
val las = spark.read.format("geotrellis.pointcloud.spark.datasource").option("path","hdfs:///user/guiet/test_geotrellis/USGS_LPC_LA_Barataria_2013_15RYN6548_LAS_2015.las").load
Is there a workaround for my case?
fun convertLatLonAltToUtm(latitude: Double, longitude: Double, altitude: Double): ProjCoordinate? {
    val input = ProjCoordinate(longitude, latitude, altitude)
    val result = ProjCoordinate()
    val crsFactory = CRSFactory()
    val WGS84 = crsFactory.createFromParameters("WGS84", "+proj=longlat +datum=WGS84 +no_defs")
    val UTM = crsFactory.createFromParameters("UTM", "+proj=utm +zone=32 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +vunits=m +no_defs")
    val ctFactory = CoordinateTransformFactory()
    val wgsToUtm = ctFactory.createTransform(WGS84, UTM)
    wgsToUtm.transform(input, result)
    return result
}
I'm getting "vunits parameter is not supported". Do you know why?
Hey @AnkitDev21, yes, the underlying library does not support it: https://github.com/locationtech/proj4j/issues/20
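Following the earlier advice in this thread, one possible workaround is to omit the unsupported +vunits parameter before creating the CRS. A sketch in Scala, using the same proj4j calls as the Kotlin snippet above (untested, and the sample coordinates are arbitrary):

```scala
import org.osgeo.proj4j.{CRSFactory, CoordinateTransformFactory, ProjCoordinate}

val crsFactory = new CRSFactory()
val wgs84 = crsFactory.createFromParameters(
  "WGS84", "+proj=longlat +datum=WGS84 +no_defs")
// Same UTM definition as above, but without +vunits=m, which proj4j rejects.
val utm = crsFactory.createFromParameters(
  "UTM", "+proj=utm +zone=32 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs")

val wgsToUtm = new CoordinateTransformFactory().createTransform(wgs84, utm)
val result = new ProjCoordinate()
wgsToUtm.transform(new ProjCoordinate(9.0, 48.0, 250.0), result)
```

Since vertical units are not actually applied by the library, dropping +vunits leaves the altitude value passed through unchanged.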
Trying to parse a proj4 string such as the following:
will raise
org.osgeo.proj4j.UnsupportedParameterException
or
Indeed, such parameters are not listed in the SupportedParameter TreeSet in Proj4Keyword.java.