ASPRSorg / LAS

LAS Specification
https://www.asprs.org/committee-general/laser-las-file-format-exchange-activities.html

Standardization of common extrabytes #37

Open esilvia opened 6 years ago

esilvia commented 6 years ago

We've discussed listing some standardized extrabytes either in the specification itself or a supplementary document. This would encourage their adoption by the community and formalize common extrabytes as a guideline to future implementations.

We need to figure out the following:

  1. Which extrabytes merit standardization?
  2. Which fields should be formalized? e.g., optional fields like min, max, and nodata might not make sense.
  3. Should data_type be formalized?
  4. Where will this list live? Will it formally be included in the specification itself (thereby requiring ASPRS approval every time one gets added), or perhaps as a wiki page on GitHub with a link from the specification? I propose the latter.
  5. What will be the approval process for new additions? (I propose people submit new Issues and then LWG votes yes/no).
  6. Should units be formalized? For example, will we have to have separate entries for "refracted depth" in meters and feet?

Below is a link to what I think is a decent start to the standardized extrabytes. Once we get some agreement on a few of these I can start building a wiki page or contribute to Martin's pull request. Which one we do depends on the answer to the 4th question.

Standard ExtraBytes v1.docx

rapidlasso commented 6 years ago

3.) I think for data_type we can make a "best" and "worst" practice recommendation that shows which scales and offsets use the fewest bytes possible while retaining reasonable resolution.

BEST: For a return echo width stored with 0.1 ns resolution, use a 0.1-scaled unsigned char to cover the range 0.0 ns to 25.5 ns, or a 0.1-scaled unsigned short to cover the range 0.0 ns to 6553.5 ns. For a return echo width stored with 0.01 ns resolution, use a 0.01-scaled unsigned short to cover the range 0.00 ns to 655.35 ns.

WORST: Avoid using floats or doubles, and email Martin if you really want to start a heated discussion about storing linear measurements in a floating-point format. (-;

lgraham-geocue commented 6 years ago

Other useful stuff under the "common" category. Not fleshed out but just introduced as ideas:

Group ID - this is a 32-bit (64-bit?) unsigned int that is used as a group index (or Object ID, OID). For example, all points that "belong" to a specific building roof identified as Object ID 246 will have this tag set to 246. Relationship maps can be built in a VLR or EVLR.

Sigma X, Y, Z --> Standard deviation of the point expressed in the units of the projection. For Geographic data, the units are meters. The values are doubles or they could just be long and follow the point scaling.

Normal - 3 tuple that defines the normal to the surface at the point of incidence. Direction is opposite the ray direction (toward the laser scanner).

rapidlasso commented 6 years ago

Two issues with the Normal that @lgraham-geocue suggests.

(1) Directions are always troublesome because they are difficult to re-project correctly when going from one CRS to another. In the PulseWaves format we've solved this by expressing direction vectors as two points; re-projecting both is always correct (even if we go to a non-Euclidean space). How about a "trajectory" index instead that references "trajectory points" stored in the same LAS file (but marked synthetic) that lie on the trajectory? These "trajectory points" are given the same index as the actual returns so the two can be paired up.

(2) Triplets have been deprecated.

esilvia commented 6 years ago

@rapidlasso What is there to be gained by allowing the data_type to vary for a given ExtraByte definition? I've noticed that you allow it to vary for the "height from ground" extrabyte in your tools, but that's caused my implementations a little trouble when some files have it defined one way while others have it defined another way.

I guess this begs the question of why we're standardizing. I believe it's to encourage implementation by more software vendors, which means simplification is key. In my opinion that means guaranteeing a 1-to-1 relationship of the key attributes with a certain EB code. At a minimum I think data_type, name, and nodata should be fixed, while description, scale, offset, and validity of min/max are recommended.

What do you think about releasing a series instead? e.g., "height from ground [cm]" with data_type int16 gets one Standard value (e.g., 200) while "height from ground [mm]" with data_type int32 gets the next value (e.g., 201)?

esilvia commented 6 years ago

@rapidlasso Good point about the difficulties with reprojection. I've had this struggle with the Origin vector of FWF data, and I've often wondered whether those vectors are getting modified correctly.

Unfortunately, if the points get shifted (e.g., from calibration) I doubt whether any software would also update the trajectory points. That's the advantage of the vector: it's unaffected by a shift. As you point out, though, the disadvantage is that vectors are only valid for a given projection.

lgraham-geocue commented 6 years ago

Just always store the normals in ECEF.

rapidlasso commented 6 years ago

@esilvia, not feeling strongly about the data type issue. Your suggestion is also good as it would prohibit folks from storing "height above ground" or "echo width" as floating point numbers. Now that is something that I really do feel strongly about. How do I allow different data types in LASlib? I have a "get attribute as float value" function to use "extra bytes" for processing so the actual storage format of the extra attribute does not matter in my implementation.

rapidlasso commented 6 years ago

@esilvia and @lgraham-geocue my suggestion is to start this standardization with very few (two or three) additional attributes that are likely to be used or that are already used. "Height above ground" is an obvious candidate for derived (i.e. not new) information. "Echo width" is an obvious candidate for additional (i.e. new) information. I would recommend to start with just those two and see how it works out before adding a larger number ...

lgraham-geocue commented 6 years ago

Yes, I agree. The more complex, the lower the adoption rate. I would like to see Group added in this initial change. It is just an unsigned long (4 byte) or unsigned long long (8 byte). In the initial version, there would be no restrictions on its use other than initializing to zero (meaning no group membership). We could write a short "best practices" on using Group but it would only be a guideline, not a requirement.

esilvia commented 6 years ago

@lgraham-geocue I like the idea of a GroupID/ObjectID attribute. Should the NULL value be 0 or INTMAX? Not sure which is more intuitive.

What if there are two different kinds of groups that a point could belong to? Should we include recommendations for supporting multiple attributes of the same kind? e.g., GroupID[1], GroupID[2], etc?

esilvia commented 6 years ago

Any preference on how to differentiate between the 32-bit and 64-bit ObjectID definitions? LongObjectID for 64bit?

lgraham-geocue commented 6 years ago

It may complicate it a bit. A simple 32- or 64-bit Group number would probably be a good start (if we have only one, I would prefer 64-bit).

esilvia commented 5 years ago

Here's an update to the proposed standard extrabytes. Standard ExtraBytes v2.docx

esilvia commented 5 years ago

And here's another update including some of the feedback I got this summer at JALBTCX, adding the horizontal and vertical uncertainty fields. Standard ExtraBytes v3.docx

rapidlasso commented 5 years ago

The "Range" which is "defined as the three-dimensional distance from the sensor to the point, the range is useful for multiple computations such as intensity attenuation and measurement bias." is suggested to be of data type float. I vehemently oppose that. The data type should be an unsigned integer (or even just an unsigned short) with a scale that is similar to that of the LAS points (or less precise) and an offset of zero.

rapidlasso commented 5 years ago

Are the tuples and triples finally deprecated? I'd like to completely remove them from LASlib. They never were properly supported and I've never seen them used anywhere.

rapidlasso commented 5 years ago

I suggest we start with one or two or three standardization that are reasonably simple. My votes go to:

lgraham-geocue commented 5 years ago

I have never encountered them being used. Maybe Howard (Butler) is using them for something? I think he was the one who advocated for these structures.

lgraham-geocue commented 5 years ago

Completely agree. In addition, all distance units in the file should (we would say "must" in the spec) be in the vertical units of the Spatial Reference System of the file. I say vertical units because, in the USA, there are still some "official" SRS with horizontal in feet (INT, Survey?) and vertical in meters.

hobu commented 5 years ago

In addition, all distance units in the file should (we would say "must" in the spec) be in the vertical units of the Spatial Reference System of the file. I say vertical units because, in the USA, there are still some "official" SRS with horizontal in feet (INT, Survey?) and vertical in meters.

LAS abdicates responsibility for the coordinate system by handing it off to WKT. I disagree that the specification should get involved here, because the spec and the SRS are inevitably going to get into conflict.

LAS should investigate requiring OGC WKT2 in a future revision. WKT2 handles more situations and is more complete. See https://gdalbarn.com/ for some discussion related to the GDAL project on the topic (thanks for the contribution @lgraham-geocue!)

(tuples and triples) "Maybe Howard (Butler) is using them for something?"

Triplets are common in graphics scenarios, and I proposed them thinking they would be well aligned with LAS. They aren't, and they introduce as many problems as they might solve. Few software packages produce or consume them. They should be dropped. No one will miss their removal.

esilvia commented 5 years ago

@rapidlasso Tuples and triples will be officially dropped with the next revision (#1).

I agree that range could be confusing because of potential desynchronization with the SRS units, but I believe that fixing it at meters and leaving it with the points prevents its loss when the trajectory files inevitably get lost. We hard-code units for the angles (at degrees), so I don't see why we can't do this with Range. Software can easily change units displayed while leaving the units stored untouched.

You've persuaded me that starting with a small handful is a good idea, and I like Martin's list. I'm tempted to add the topobathy-related ones, but perhaps that's better left in the LDP?

lgraham-geocue commented 5 years ago

To be LAS 1.4, PDRF 6 compliant, a LAS file must have the vertical units encoded in the WKT. USGS has been rejecting data that do not have the units tag properly set. We are inviting a space probe collision error by allowing mixed linear, unlabeled units, especially for states who do not use meters for anything.

rapidlasso commented 5 years ago

@esilvia "Tuples and triples will be officially dropped with the next revision". Happy to hear that. I just kicked them out of LASlib last week ... (-:

rapidlasso commented 5 years ago

@lgraham-geocue I disagree. A range is - similar to the scan angle - something measured by a scientific measurement instrument and should follow international standards. I could see how your argument could apply to "height above ground", but even here I'm leaning toward always making the measurement unit part of the standardized "extra bytes" because (1) the CRS often gets stripped, (2) reprojecting coordinates from a feet-based CRS to a meter-based CRS (or vice versa) without rescaling the "extra bytes" leads to wrong ranges / heights above ground, and (3) the best choices of scale and offset change when we go from meters to feet. A scale factor of 0.01 may be good when measuring the range or the height above ground for an airborne scan in meters, but it is overly precise for feet. We will open a whole can of worms of "extra bytes" that do not have the correct unit, or where we do not know the correct unit, if we let the vertical unit of the CRS decide this.

esilvia commented 5 years ago

@rapidlasso You make a strong point regarding the scale/offset also being unit-dependent. I think that's also a strong argument in favor of fixing the units.

@lgraham-geocue observed that the horizontal and vertical units can be different in LAS files, which is something I've also observed to my chagrin. Since Range is a 3D measurement, it could get very, very weird if the vertical units are meters and horizontal units are feet. I think this is another argument in favor of fixing the units at meters.

I can be persuaded that the height-from-ground will match the vertical units of the LAS file. Simple, and I think it's what people would expect when they receive data.

So here's the plan: I'm going to publish the following "Standard" extrabytes as a GitHub wiki page (https://github.com/ASPRSorg/LAS/wiki/Standard-ExtraByte-Definitions):

All of these Standard ExtraBytes will be assigned an integer value (ID) that can be assigned to the first two bytes of the ExtraByte definition structure (currently Reserved). It's a little longer than Martin's list but I think it captures the ones I've seen in the wild. I didn't get any feedback on incorporating the ExtraByte definitions from the topobathy LDP, so I decided to include the ones that I've seen most often.

Rather than include these definitions in the specification itself, I'll update the ExtraByte VLR description in the specification with a link to the wiki page and claim the two Reserved bytes for the ID field, which must be 0 unless it adheres to one of the definitions on the wiki page.

All of these changes will be included with the R14 revision, which I plan to submit to ASPRS in the next week or two. Last chance to comment. @rapidlasso @lgraham-geocue @csevcik01 @hobu @anayegandhi @jdnimetz

parrishOSU commented 5 years ago

Commenting on Evon’s 6 questions:

  1. Which extrabytes merit standardization? I generally concur with those Evon listed in the Word doc but view some as more important than others. For example, I see coordinate uncertainties as fundamental attributes that should accompany all points in all point clouds (and are required for bathymetric lidar data to be used in hydrographic surveying/nautical charting workflows), whereas NDVI would, I think, be useful mainly for point clouds covering vegetated areas and less applicable to a data set covering, say, a parking lot.
  2. Which fields should be formalized? e.g., optional fields like min, max, and nodata might not make sense. Probably all of them, but one that I see as particularly problematic is reflectance. Various groups have used differing definitions of the terms reflectance, normalized intensity, relative reflectance, and pseudo-reflectance. Maybe the best thing to do is include separate entries for each of these, publish generic definitions of each, and leave it to the group populating the field to pick which one makes most sense for the value they are computing. Then, maybe they could add a bit of metadata (or external documentation) somewhere that explains exactly how they compute pseudo-reflectance or normalized intensity, or whatever. Or, we could adopt the processing levels (Levels 0-3) defined in Kashani et al., 2015.
  3. Should data_type be formalized? I see that others have already commented on this and am not sure I have much to add.
  4. Where will this list live? Will it formally be included in the specification itself (thereby requiring ASPRS approval every time one gets added), or perhaps as a wiki page on GitHub with a link from the specification? I propose the latter. I agree: this is too far into the technical details to merit an ASPRS Board review every time a change is made; it should just reside on GitHub.
  5. What will be the approval process for new additions? (I propose people submit new Issues and then LWG votes yes/no). Seems reasonable.
  6. Should units be formalized? For example, will we have to have separate entries for "refracted depth" in meters and feet? Well, if we have separate entries for meters and feet, then we actually need 3 entries for all values with distance units: 1) meters, 2) U.S. Survey Feet, and 3) International Feet.

rapidlasso commented 5 years ago

@esilvia and @lgraham-geocue after more thinking I have changed my mind to be more firm on the unit issue: measurements stored in "extra bytes" should have a fixed unit of measurement. Hence, both "range" and "height above ground" should have a unit that is independent of any other information in the LAS file. The premise of "extra bytes" is that you can "safely ignore them" if you do not care about exploiting the additional information. What makes their addition so powerful is that older already-existing software - or newer software yet to be written - can choose to support them ... or not. By making the "height above ground" unit dependent on the CRS we break this premise. We would force software to continuously update itself so that it always interprets and - if needed - changes the "extra bytes" when it makes changes elsewhere. And we would ensure that older software produces LAS files that are incorrect. This is simply not acceptable.

hobu commented 5 years ago

By making the "height above ground" unit dependent on the CRS we break this premise.

Either the LAS specification must assume responsibility for all coordinate system metadata, including precision, units, orientation, and transforms, or it must unburden itself by requiring those items to be defined in the coordinate system description defined by a specification such as OGC WKT or WKTv2. The WKT specifications are quite long, and there are many edge cases to consider. IMO it is best to let those who have spent a lot of time thinking about all of those issues provide direction on how software should implement it.

I recognize that "coordinate"-related storage of HAG in the extra bytes is kind of special here. It gets messy though. What does HAG mean if the file is stored in a geocentric coordinate system? It is quite common for the horizontal and vertical units of coordinate systems to not match or not even be the same system (think geographic data with orthometric heights). I don't have good solutions, but I'm cautioning against us assuming responsibility because it is a tarpit.

esilvia commented 5 years ago

From @lgraham-geocue here: https://github.com/orgs/ASPRSorg/teams/lwg/discussions/3/comments/1

They need to refer back to the WKT (which can be dangerous since no one seems to be able to get this exactly right) or have them explicitly encoded in the Extra Bytes. I think it would be a very poor design to just say they are always meters or some other fixed unit.

Not sure how (or maybe this is in the latest design) but it should be value, units, ….

What about scaling, offset? Are all values absolute or do they rely on LAS scaling, offset (this is rhetorical – I can look at it myself when I have time!)

This is a really big deal and we have to get it right. Spatial Reference System encoding is probably the primary rejecting issue from USGS right now on 3DEP. We definitely need input from USGS (@jdnimetz) prior to submission.

vminor commented 5 years ago

If you are talking about WKT, make sure it's the latest updated version (also see ISO 19162). There is an OGC description here:

http://docs.opengeospatial.org/is/12-063r5/12-063r5.html

Another alternative may be to embed GML coordinate system descriptions:

http://www.opengeospatial.org/standards/gml

Original WKT string definitions have problems representing Vertical Coordinate Reference systems correctly, and should be avoided if possible. This issue was (mostly) fixed with the ISO 19162 update (WKT2). GML has a three dimensional CRS that will completely represent this metadata as well. In addition, both WKT2 and GML description models have devices for handling temporal references. WKT (original) does not.


hobu commented 5 years ago

Original WKT string definitions have problems representing Vertical Coordinate Reference systems correctly, and should be avoided if possible. This issue was (mostly) fixed with the ISO 19162 update (WKT2). GML has a three dimensional CRS that will completely represent this metadata as well. In addition, both WKT2 and GML description models have devices for handling temporal references. WKT (original) does not.

yes, but it would be miserable to require WKTv2 in a simple revision document. This needs to be in a minor release. Also, I don't think WKTv2 would be very convenient for many folks, especially if they're using software based on GDAL and PROJ. See https://gdalbarn.com for some background.

IMO we should stay away from GML.

ogcscotts commented 5 years ago

OGC does have a policy that all OGC standards developed starting in 2017 use WKT2. Given the requirement for correct use of vertical datums, WKT2 is a better choice... even if not as widely implemented. WKT2 was a major factor in some large contributions to the barn raising.

hobu commented 5 years ago

@ogcscotts Yes, I agree, but I don't think we can increment an implementation requirement in what is essentially an errata update (1.4 R14). We need a 1.5 to impose a WKTv2 requirement.

rapidlasso commented 5 years ago

This discussion is getting off-topic. Can we agree that - for the aforementioned reasons - the units of "extra bytes" must be "hard coded"? We cannot expect applications that merely change the CRS of the points to also modify the units of all optional "extra bytes" payloads correspondingly for the file to remain valid. This would put every existing software package into violation of the specification as soon as such a revision was released. Older software would forever produce invalid files. And the rules for doing so in newer software would be insanely complex. I suggest we use metric units: Meters, Seconds, Nanoseconds, Grams, Kilograms ... Should you need Feet, Centidays, Myriaseconds, Ounces or Pounds ... the conversion formulas are really simple. We can add those to the spec if needed.

kjwaters commented 5 years ago

Would it not make sense to use a VLR to describe the units for the extra bytes?

rapidlasso commented 5 years ago

The "extra bytes" are already described by the "extra bytes" VLR that has a special user ID and record ID. But you really might be onto something with this thought. There indeed happens to be an opportunity here. We could use the unused space that becomes available with the deprecation of tuples and triples to add something new that is valid for any of the new "standardized extra bytes" and that could be used to describes the units and more.

lgraham-geocue commented 5 years ago

The units must be specified in the Extra Bytes or be tied to the SRS of the file. If I use software to transform vertical units from/to feet (two types) or meters, I will have an expectation that heights in the Extra Bytes were also transformed (users do not know about or care about Extra Bytes). Having anything assumed, such as units implied by the attribute's type, will probably lead to a lot of errors.

lgraham-geocue commented 5 years ago

Again, I strongly recommend that we get USGS involved in any attempt to change SRS encoding in LAS. This has been a nightmare for 1.4 and still is not sorted.

rapidlasso commented 5 years ago

@lgraham-geocue, this is exactly how "extra bytes" was intended to work. Users do not need to know about their existence. And with users I also mean software packages. The "extra bytes" were meant to enhance the LAS format for those who care without complicating the LAS format for others who just want to use the standard LAS points. Your suggestion to tie it to the SRS would require all software to get continuously retrofitted to make sure they know all "extra bytes" and change them if needed. I don't think that is what you want.

I think specifying the units in the "extra bytes" VLR using the space made available by the no-longer-supported tuples and triples (see the exchange with @kjwaters above) will achieve what you want. This allows your software - which cares about "height above ground", for example - to change the units from "meter" to "US survey feet" or vice versa when it projects the LAS point coordinates. Some other software - which does not care or know about the "height above ground" attribute - can still do a correct projection of the point coordinates, yet without touching the "height above ground" units. But that is okay because the units are specified.

However, if we go this route we may need more time to design the reuse of the space available in the "extra bytes" VLR for proper "unit" handling. Let's maybe do this revision without adding standardized ones and just deprecate tuples and triples?

lgraham-geocue commented 5 years ago

Sounds very good to me. I think having the units with the value (i.e. in the Extra Bytes) is significantly better than an indirect reference (that is, back to the WKT).

hobu commented 5 years ago

I think having the units with the value (i.e. in the Extra Bytes) is significantly better than an indirect reference (that is, back to the WKT).

As long as we are aware that consequences of this choice include:

rapidlasso commented 5 years ago

My take on those consequences:

hobu commented 5 years ago

As I said up-thread, my unit nit in regard to WKT is about measurements in relation to position, like HAG. The sentiment here seems to be that every attribute should have explicit units from a units dictionary rather than implicit units, as many items currently have. Maybe that is something to address in a major revision.

rapidlasso commented 5 years ago

Are we really just talking about where to store whether the "height above ground" is expressed in "feet", in "survey feet", or in "meters"? Or is there more to it? I attach a helpful graphic to support my intention to adopt fixed metric units, but I could be cornered and convinced to code the units into the "extra bytes" VLR. (attached image: metric_world)

lgraham-geocue commented 5 years ago

I apologize on behalf of the USA for being in the stone age when it comes to our measurement systems (seriously!!)…..

hobu commented 5 years ago

haha, but it isn't always just imperial unit challenges. Anyway, we are very side-tracked now. I think we should address explicit measurement units of all dimensions in a future revision of the specification with a more holistic approach. I'm not going to stand in the way of an agreement between @rapidlasso and @lgraham-geocue.

esilvia commented 5 years ago

@rapidlasso love the graphic!

The idea of encoding units into the extrabyte itself is intriguing, but I agree that's probably outside the scope of this issue.

As has been covered in the last few posts, the whole idea behind ExtraBytes is that you can ignore them if you don't care about them. If you do care, you probably already know what units they're in. Perhaps the best option is for the wiki to remain silent on units when they're potentially contentious, or provide a recommendation, rather than a requirement. The Standard ExtraBytes wiki is supposed to be a guideline anyway.

rapidlasso commented 5 years ago

The "extra space" could be used to describe two alternate units as follows. After deprecating "tuples" and "triples" we could reuse the array entires [1] and [2] of no_data, min, max, scale, and offset. Below my "old" struct of 192 bytes that is the payload of the official "extra bytes" VLR.

struct LASattribute
{
  U8 reserved[2];           // 2 bytes
  U8 data_type;             // 1 byte
  U8 options;               // 1 byte
  CHAR name[32];            // 32 bytes
  U8 unused[4];             // 4 bytes
  F64 no_data[3];           // 24 = 3*8 bytes // last 16 bytes deprecated
  F64 min[3];               // 24 = 3*8 bytes // last 16 bytes deprecated
  F64 max[3];               // 24 = 3*8 bytes // last 16 bytes deprecated
  F64 scale[3];             // 24 = 3*8 bytes // last 16 bytes deprecated
  F64 offset[3];            // 24 = 3*8 bytes // last 16 bytes deprecated
  CHAR description[32];     // 32 bytes
};

and - just to present a tangible example - we could re-use it as shown below to offer two alternate units assuming the conversion can be done by changing the scale and the offset.

struct LASattribute
{
  U8 reserved[2];           // 2 bytes
  U8 data_type;             // 1 byte
  U8 options;               // 1 byte
  CHAR name[32];            // 32 bytes
  U8 unused[4];             // 4 bytes
  F64 no_data;              // 8 = 1*8 bytes
  U8 unit;                  // 1 byte
  CHAR unit_name[15];       // 15 bytes
  F64 min;                  // 8 = 1*8 bytes
  U8 alt_unit1;             // 1 byte
  CHAR alt_unit1_name[15];  // 15 bytes
  F64 max;                  // 8 = 1*8 bytes
  U8 alt_unit2;             // 1 byte
  CHAR alt_unit2_name[15];  // 15 bytes
  F64 scale;                // 8 = 1*8 bytes
  F64 alt_unit1_scale;      // 8 = 1*8 bytes
  F64 alt_unit2_scale;      // 8 = 1*8 bytes
  F64 offset;               // 8 = 1*8 bytes
  F64 alt_unit1_offset;     // 8 = 1*8 bytes
  F64 alt_unit2_offset;     // 8 = 1*8 bytes
  CHAR description[32];     // 32 bytes
};

esilvia commented 5 years ago

Ah, I see what you mean about the unused payload. I'm going to spin off a new issue from that post. You're making a good case for a new ExtraByte VLR altogether.

esilvia commented 5 years ago

See the initial implementation on the wiki here: https://github.com/ASPRSorg/LAS/wiki/Standard-ExtraByte-Definitions

Does this work for everyone? If so, I'll edit the specification with a link to it.