Esri / arcgis-pbf


Request for additional FeatureCollection documentation #2

Open thw0rted opened 3 years ago

thw0rted commented 3 years ago

Thanks for posting this project! I can see that it was brought over from an internal tracker (there's still at least one link to devtopia.esri.com in the README), so I thought some of the definitions might be in flux or subject to cleanup. My question is about the FeatureCollection type.

Calling the /query endpoint (without countOnly/idsOnly) returns a QueryResult with a FeatureResult. The FeatureResult can have an array of Features, each of which pairs an array of Values (attributes) with a Geometry. But FeatureResult can also have one FeatureCollection, which has an array of attributes and one of coordinates. Despite the name of the repository, the docs don't actually describe FeatureCollection. Is it actually used? If so, when? How would one break up the array of attributes or coordinates into individual Features?

thw0rted commented 3 years ago

I found another related issue. It looks like each Feature has a Geometry, and each Geometry can have a GeometryType. But in the result I got from the sample data I'm looking at (from https://services.arcgis.com/V6ZHFr6zdgNZuVG0/ArcGIS/rest/services/buildings_frankfurt/FeatureServer/0), it looks like the FeatureResult has a GeometryType defined (correctly), while the FeatureResult > Feature > Geometry > GeometryType is not defined, and thus gives the default value of 0 (meaning esriGeometryPoint).

I don't really have experience administering a server, but I thought that generally a Feature Service ("buildings_frankfurt" above) can only host one type of geometry. Is mixed geometry possible? If not, why is there a Feature.Geometry.GeometryType field at all? If so, how can I tell when a Feature.Geometry.GeometryType isn't specified, and thus fall back on FeatureResult.GeometryType?

mmgeorge commented 3 years ago

Sorry for the late response @thw0rted! Thought I was subscribed to this repo 😅

Is mixed geometry possible? If not, why is there a Feature.Geometry.GeometryType field at all

Mixed geometry is not possible in this case. Let me follow up on that. In the JSAPI parser we don't actually parse this field at all. (We don't release that parser directly because it's fairly difficult to use and very specific to our internal workflows: basically we manually iterate over the pbf payload to build an offset table and use that to read values out lazily. That's overkill for most use cases, but needed on the API side to avoid per-feature allocations. We also avoid parsing anything we don't explicitly use.)

For decoding the fields, this might help. This is what the payloads look like side by side when parsed literally into JSON:

[screenshot: the f=pbf and f=json payloads parsed literally into JSON, side by side]

We can see that the message originating from PBF is more structured, and to get to the actual features, you need to access queryResult.featureResult.features (whereas in f=json the featureResult is an anonymous object). You can basically think of "FeatureCollection" as the namespace for all the f=pbf formats that may be supported by feature services in the future. Currently only queryResult.featureResult is supported.

Opening up a single feature we get: [screenshot: a single decoded Feature]

At this point you are probably asking how we know that {"uIntValue": 93} maps to ObjectId and {"sintValue": 525} maps to Population. Basically the contract here is that the order of the field values returned by the server matches the order of the values returned for each feature. We need to make this clearer in the doc, but if we go to the fields:

[screenshot: the fields array -- objectId first, population second]

We can see that the first field is objectId, and the second is population.
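
To make that contract concrete, here is a minimal TypeScript sketch of pairing fields with per-feature values. The structural types and helper names are mine (stand-ins for the generated protobufjs classes), and only a few of the Value one-of members are shown:

interface Value { stringValue?: string; uintValue?: number; sintValue?: number; doubleValue?: number; boolValue?: boolean; }
interface Field { name: string; }
interface Feature { attributes: Value[]; }

// Each Value sets exactly one member of its one-of; return whichever is present.
function unwrap(v: Value): unknown {
  return v.stringValue ?? v.uintValue ?? v.sintValue ?? v.doubleValue ?? v.boolValue;
}

// fields[i] names the i-th entry of every feature's attributes array.
function attributesOf(fields: Field[], feature: Feature): Record<string, unknown> {
  return Object.fromEntries(feature.attributes.map((v, i) => [fields[i].name, unwrap(v)]));
}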

thw0rted commented 3 years ago

Thanks for replying @mmgeorge . It sounds like some of the fields in the PBF definition aren't actually used. Would it make sense to trim them out of the definition? Or maybe just update the docs to note that the fields are deprecated / legacy / reserved / whatever?

Did you see my question about applying the transform over at https://github.com/Esri/arcgis-rest-js/issues/702 ? I had to put this on the back burner since then, because I just can't seem to figure out how to convert the integer coordinates to decimal degrees (4326) correctly. I noticed that in your f=json example above, there's a transform defined as well. I don't see that in the sample I grabbed.

For comparison, this is https://services.arcgis.com/V6ZHFr6zdgNZuVG0/ArcGIS/rest/services/buildings_frankfurt/FeatureServer/0/query?where=&objectIds=2 in both formats:

f=json:

{
  "objectIdFieldName" : "OBJECTID", 
  "uniqueIdField" : 
  {
    "name" : "OBJECTID", 
    "isSystemMaintained" : true
  }, 
  "globalIdFieldName" : "", 
  "geometryProperties" : 
  {
    "shapeAreaFieldName" : "Shape__Area", 
    "shapeLengthFieldName" : "Shape__Length", 
    "units" : "esriDecimalDegrees"
  }, 
  "geometryType" : "esriGeometryPolygon", 
  "spatialReference" : {
    "wkid" : 4326, 
    "latestWkid" : 4326
  }, 
  "fields" : [
    {
      "name" : "name", 
      "type" : "esriFieldTypeString", 
      "alias" : "name", 
      "sqlType" : "sqlTypeOther", 
      "length" : 2048, 
      "domain" : null, 
      "defaultValue" : null
    }
  ], 
  "features" : [
    {
      "attributes" : {
        "name" : "Commerzbank DLZ 1"
      }, 
      "geometry" : 
      {
        "rings" : 
        [
          [
            [8.65442100000007, 50.1062332000001], 
            [8.65432050000004, 50.106339], 
            [8.65446150000002, 50.1063941], 
            [8.65428550000007, 50.1065795000001], 
            [8.65414460000005, 50.1065244000001], 
            [8.65404280000007, 50.1066317], 
            [8.65526350000005, 50.1071017], 
            [8.65531910000004, 50.1071231000001], 
            [8.65571470000003, 50.1072820000001], 
            [8.65616050000006, 50.1068196000001], 
            [8.65598180000006, 50.1067498], 
            [8.65591740000002, 50.1068175], 
            [8.65524090000002, 50.1065534000001], 
            [8.65504730000004, 50.1064778000001], 
            [8.65501150000006, 50.1064635000001], 
            [8.65442100000007, 50.1062332000001]
          ], 
          [
            [8.65536580000003, 50.1069989000001], 
            [8.65552820000005, 50.1068259], 
            [8.65557490000003, 50.1068439000001], 
            [8.65576940000005, 50.1069191], 
            [8.65560710000005, 50.1070920000001], 
            [8.65541540000004, 50.107018], 
            [8.65536580000003, 50.1069989000001]
          ], 
          [
            [8.65493320000007, 50.1068226], 
            [8.65509250000002, 50.1066524], 
            [8.65533340000007, 50.1067451000001], 
            [8.65517400000005, 50.1069153], 
            [8.65493320000007, 50.1068226]
          ], 
          [
            [8.65446870000005, 50.1066457000001], 
            [8.65464290000006, 50.1064629000001], 
            [8.65491170000007, 50.1065683], 
            [8.65473750000007, 50.1067511000001], 
            [8.65446870000005, 50.1066457000001]
          ]
        ]
      }
    }
  ]
}

f=pbf, run through esriPBuffer.FeatureCollectionPBuffer.decode(new Uint8Array(readFileSync("file.pbf"))) and dumped using the Node inspector:

FeatureCollectionPBuffer {
  queryResult: QueryResult {
    featureResult: FeatureResult {
      fields: [ Field { name: 'name', fieldType: 4, alias: 'name' } ],
      values: [],
      features: [
        Feature {
          attributes: [ Value { stringValue: 'Commerzbank DLZ 1' } ],
          geometry: Geometry {
            lengths: [ 16, 7, 5, 5 ],
            coords: [
              632527880,     -1677,   -100500, -105800,  141000,
                 -55100,   -176000,   -185400, -140900,   55100,
                -101800,   -107300,   1220700, -470000,   55600,
                 -21400,    395600,   -158900,  445800,  462400,
                -178700,     69800,    -64400,  -67700, -676500,
                 264100,   -193600,     75600,  -35800,   14300,
                -590500,    230300, 633472680,   -1677,  162400,
                 173000,     46700,    -18000,  194500,  -75200,
                -162300,   -172900,   -191700,   74000,  -49600,
                  19100, 633040080,     -1677,  159300,  170200,
                 240900,    -92700,   -159400, -170200, -240800,
                  92700, 632575580,     -1677,  174200,  182800,
                 268800,   -105400,   -174200, -182800, -268800,
                 105400
            ]
          }
        }
      ],
      objectIdFieldName: 'OBJECTID',
      uniqueIdField: UniqueIdField { name: 'OBJECTID', isSystemMaintained: true },
      geometryProperties: GeometryProperties {
        shapeAreaFieldName: 'Shape__Area',
        shapeLengthFieldName: 'Shape__Length',
        units: 'esriDecimalDegrees'
      },
      geometryType: 3,
      spatialReference: SpatialReference { wkid: 4326, lastestWkid: 4326 },
      transform: Transform {
        scale: Scale { xScale: 1e-9, yScale: 1e-9 },
        translate: Translate { xTranslate: -400, yTranslate: -400 }
      }
    }
  }
}

I suspect that the first pair in each ring is not being parsed correctly (632527880, -1677 etc), but even ignoring those I don't see how to convert e.g. -100500, -105800 into 8.65432050000004, 50.106339 using the Transform provided. I assume you have to map those (signed?) integer values over the layer envelope or something, but can't find docs about that.

mmgeorge commented 3 years ago

Thanks for replying @mmgeorge . It sounds like some of the fields in the PBF definition aren't actually used. Would it make sense to trim them out of the definition? Or maybe just update the docs to note that the fields are deprecated / legacy / reserved / whatever?

Well, some of them are used outside of the context of the JSAPI specifically, and some are used for parity with f=json. GeometryType (in the Feature message), though, should probably be omitted. Will look into removing that.

Did you see my question about applying the transform over at Esri/arcgis-rest-js#702 ? I had to put this on the back burner since then, because I just can't seem to figure out how to convert the integer coordinates to decimal degrees (4326) correctly. I noticed that in your f=json example above, there's a transform defined as well. I don't see that in the sample I grabbed.

Ah, ok, yes, the geometry decoding is a little complicated, especially if you are not already using quantization/generalization. Basically, for PBF we need to encode the coordinates array as integers (well, we could encode them as floats, but varint (variable-length integer) encoding is one of the primary reasons for using PBF). Because of this, f=pbf always returns quantized coordinates, even if quantization parameters are not specified. When f=pbf is requested without quantization parameters, the quantization parameters default to the upper left of the map. To unpack the coordinates, you need to un-delta-encode them and then apply the included transform.

The f=pbf mode was originally designed specifically for "tile" queries against feature services. Here's a quick walkthrough of how to unpack one of these queries:

The query I posted above is typical of the queries we make in the JSAPI, e.g.: https://servicesdev.arcgis.com/VdB0O4Dy5MyNfFTR/arcgis/rest/services/KansasCityParcels/FeatureServer/0/query?f=json&geometry=%7B%22spatialReference%22%3A%7B%22latestWkid%22%3A3857%2C%22wkid%22%3A102100%7D%2C%22xmin%22%3A-10551978.880712243%2C%22ymin%22%3A4742764.731040362%2C%22xmax%22%3A-10549532.895807117%2C%22ymax%22%3A4745210.715945488%7D&maxRecordCountFactor=4&resultOffset=0&resultRecordCount=8000&where=1%3D1&orderByFields=OBJECTID_1%20ASC&outFields=OBJECTID_1%2CTotLivArea&outSR=102100&quantizationParameters=%7B%22extent%22%3A%7B%22spatialReference%22%3A%7B%22latestWkid%22%3A3857%2C%22wkid%22%3A102100%7D%2C%22xmin%22%3A-10551978.880712243%2C%22ymin%22%3A4742764.731040362%2C%22xmax%22%3A-10549532.895807117%2C%22ymax%22%3A4745210.715945488%7D%2C%22mode%22%3A%22view%22%2C%22originPosition%22%3A%22upperLeft%22%2C%22tolerance%22%3A4.77731426782227%7D&resultType=tile&spatialRel=esriSpatialRelIntersects&geometryType=esriGeometryEnvelope&inSR=102100

Delta-encoding

Basically, on the client we divide up the world into 512x512-pixel tiles, and query such that features in those tiles are relative to the top-left corner of each tile.

For example (apologies for the bad paint job 😅): [diagram: a polygon whose first vertex is encoded relative to the tile's upper-left corner]

The numbers here are fake, but let's suppose this polygon returns coords: [128, 384, 20, 0, 0, 20, -20, 0, 0, -20] and lengths: [5].

In this case you can see the first vertex position is much larger in magnitude than the subsequent vertices. This is because it is relative to the tile's upper-left corner, i.e., it refers to the blue line in the image above. Every other coordinate is relative, so (20, 0) means "move 20 units to the right". Note that in this example, because coordinates are relative to the tile's upper-left corner, positive y is downward, much like it is in screen space. So (0, 20) means "move down 20 units". Why encode vertices this way? Mainly to reduce the size of the payload -- smaller numbers use less space as PBF varints.

To unpack the delta-encoded vertices we can do:

for (let i = 1; i < lengths[0]; i++) { // lengths[0] is the vertex count; start at 1
  coords[2 * i] += coords[2 * (i - 1)];
  coords[2 * i + 1] += coords[2 * (i - 1) + 1];
}

This will give us: [128, 384, 148, 384, 148, 404, 128, 404, 128, 384]

Converting to world coordinates

We now have unpacked, non-delta-encoded vertices, but they are still relative to the tile's upper-left origin. If you need to convert back to world coordinates, apply the transform included in the featureSet to each vertex:

[screenshot: the transform object in the featureSet]

To do this, you can just do:

const xWorld = x * scale.x + translate.x;
const yWorld = translate.y - y * scale.y;

Note that in the above query we request the payload in spatialReference 3857 (web-mercator), so xWorld and yWorld will be in mercator map units.
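
Putting both steps together for the fake tile example above -- a sketch in TypeScript, using the extent origin and tolerance from the query URL as the (illustrative) transform values:

const lengths = [5];
const coords = [128, 384, 20, 0, 0, 20, -20, 0, 0, -20];
const scale = { x: 4.77731426782227, y: 4.77731426782227 };          // quantization tolerance
const translate = { x: -10551978.880712243, y: 4745210.715945488 };  // tile upper-left (xmin, ymax)

// 1. Un-delta-encode: each vertex after the first is relative to its predecessor.
for (let i = 1; i < lengths[0]; i++) {
  coords[2 * i] += coords[2 * (i - 1)];
  coords[2 * i + 1] += coords[2 * (i - 1) + 1];
}

// 2. Apply the transform to get world (web-mercator) coordinates.
const world: [number, number][] = [];
for (let i = 0; i < lengths[0]; i++) {
  world.push([
    coords[2 * i] * scale.x + translate.x,
    translate.y - coords[2 * i + 1] * scale.y, // upperLeft origin: y is flipped
  ]);
}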

mmgeorge commented 3 years ago

Actually, I just realized that the PBF file posted in this repository is out of date: coords should be sint64, not sint32. We made this change to support non-tiled queries, which would otherwise overflow. That might be what you are running into @thw0rted ... we will get that fixed.

thw0rted commented 3 years ago

Thanks, the delta-encoding was the confusing part, but I totally understand why they do it now -- protobuf encodes smaller values in fewer bytes. It does raise the question of why the server I linked to uses a huge initial offset for each ring, with a (relatively) tiny transform.translate, rather than a smaller delta-pack offset with a translate value that gets to the right ballpark.
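
(As an aside, the size win is easy to check with protobufjs's Writer -- a quick sketch, assuming the protobufjs package is installed:)

import { Writer } from "protobufjs/minimal";

// Zig-zag varints: a tiny delta fits in one byte, while a large absolute
// coordinate like the first vertex above takes five.
console.log(Writer.create().sint32(20).finish().length);        // 1
console.log(Writer.create().sint64(632527880).finish().length); // 5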

Just to close the loop here, in my posted example, the first coordinate of the ring does not change and would be 632527880, -1677 relative to the top-left of a tile. The world coordinates would be (632527880 * 1e-9) - 400 = -399.36747212 and (-1677 * 1e-9) - 400 = -400.000001677. The same un-quantized point is 8.65442100000007, 50.1062332000001 decimal degrees, per the f=json response. The values I got (just barely either side of -400) are obviously not degrees or meters; they must be (fractional?) pixels in a tile. I don't see anything in my query or response to indicate "which" tile this is, though.

I found a blurb in the REST docs that says

When using outSR with pbf, the pbf format will use coordinate quantization for layer queries. When an output spatial reference is not provided for a query operation, the feature service derives coordinate quantization parameters from the layer’s spatial reference.

(emphasis mine)

The layer definition gives the layer SR (extent.spatialReference.wkid) as 4326. I would expect the world coords (after applying transform) to be in 4326 (i.e. decimal degrees). It seems like maybe it's not possible to get a sensible result without explicitly specifying the quantizationParameters on the query? Alternately, maybe it's impossible to get quantized 4326 output?

I tried adding quantizationParameters. That should decode to

{
    "extent": {
        "spatialReference": {
            "latestWkid": 4326,
            "wkid": 4326
        },
        "xmin": 50.1,
        "ymin": 8.6,
        "xmax": 50.2,
        "ymax": 8.7
    },
    "mode": "view",
    "originPosition": "upperLeft"
}

but the PBF is exactly the same as above, i.e. with transformed world coordinates very close to -400,-400. I still don't see a way to make sense of that.

mmgeorge commented 3 years ago

@thw0rted I think your first coordinate is getting truncated because the published pbf proto file types the coords as sint32 instead of sint64. This is what I'm seeing using a parser built with the correct schema:

[screenshot: the query result parsed with the corrected (sint64) schema]

[screenshot: the transform]

[screenshot: calculating X]

thw0rted commented 3 years ago

Beautiful, thanks!

Protobuf is still a bit of a mystery though -- why on earth is it better to encode (408.NNNe9 * 1e-9) - 400 instead of just 8.NNNe9 * 1e-9 with no translate? Better yet, (N.NNe8 * 1e-8) + 8 or (N.NNe7 * 1e-7) + 8.6, etc?

Anyway this should get me going. If I have a chance, I'll come back and post some code here with whatever I figure out.

mmgeorge commented 3 years ago

@thw0rted TBH I'm not entirely sure what secret sauce the server is using to compute this, next time I run into the server folks I'll have to ask 😆. The original release of the PBF stuff was only for our tile queries (I work on FeatureLayer rendering & helped out with the initial work on this), and in that case the transform is a well defined thing based on what's passed in the actual request.

Probably there's some implementation reason the transform is returned like this (for this specific case) -- e.g., maybe the transform is computed based on more than just the single feature that winds up in the final selection because it hits some internal server cache, or it actually is best for some reason, or it just has to be done this way, etc. That's just speculation on my part, though. I wouldn't expect this to have any noticeable impact on decode performance.

thw0rted commented 3 years ago

As promised, here's what I'm using now:

import type { Position } from "@esri/arcgis-rest-types";
import { esriPBuffer } from "./FeatureCollection";

// Extract one line-string with `count` 2D (hasZ=false) or 3D (hasZ=true)
// coordinates, from a delta-encoded number array, starting at the given
// `offset`. Then, apply the provided `transform` to each coordinate.
export function transformLine(coords: number[], offset: number, count: number, hasZ: boolean, transform: esriPBuffer.FeatureCollectionPBuffer.Transform): Position[] {
    const ret: Position[] = [];
    const size = hasZ ? 3 : 2;
    const decoded = coords.slice(offset, offset + size * count);
    deltaDecode(decoded, hasZ);
    for (let i = 0; i + size <= decoded.length; i += size) {
        const pos = transformTuple(decoded.slice(i, i + size), transform);
        // Shouldn't return undefined, hopefully
        if (pos) { ret.push(pos); }
    }
    return ret;
}

// Apply the provided transform to a single point
function transformTuple([x,y,z]: number[], transform: esriPBuffer.FeatureCollectionPBuffer.Transform): Position | undefined {
    if (undefined === x || undefined === y) { return; }  // Shouldn't happen

    if (transform.scale) {
        x *= transform.scale.xScale;
        y *= transform.scale.yScale;
        if (undefined !== z) { z *= transform.scale.zScale; }
    }
    if (transform.translate) {
        x += transform.translate.xTranslate;
        y += transform.translate.yTranslate;
        if (undefined !== z) { z += transform.translate.zTranslate; }
    }

    const ret = [x,y] as Position;
    if (undefined !== z) { ret.push(z); }
    return ret;
}

// Unpack a delta-encoded 2D (hasZ=false) or 3D (hasZ=true) linear-ring in-place.
function deltaDecode(coords: number[], hasZ: boolean): void {
    const size = hasZ ? 3 : 2;
    for (let i = size; i + size <= coords.length; i += size) {
        // Assert because the input array is not sparse
        // eslint-disable-next-line @typescript-eslint/no-non-null-assertion
        coords[i] += coords[i - size]!;
        // eslint-disable-next-line @typescript-eslint/no-non-null-assertion
        coords[i + 1] += coords[i - size + 1]!;
        // eslint-disable-next-line @typescript-eslint/no-non-null-assertion
        if (3 === size) { coords[i + 2] += coords[i - size + 2]!; }
    }
}

The result of calling transformLine can be used as one ring or path in IPolygon / IPolyline from arcgis-rest-types. (My plan is to convert the PBF response into something very similar to the f=json response, so that I can consume either without caring what transport format was used over the wire.)
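
(A hedged usage sketch: walking geometry.lengths to carve the flat coords array into rings, assuming 2D geometry and a coords array already mapped to plain numbers -- see the note below:)

let offset = 0;
const rings: Position[][] = [];
for (const len of feat.geometry.lengths) {
    rings.push(transformLine(coords, offset, len, false, transform));
    offset += 2 * len; // each 2D ring consumes len coordinate pairs
}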

ETA: Almost forgot, I wrote

function alwaysNum(x: number | Long): number { return Long.isLong(x) ? x.toNumber() : x; }

so that you can call

transformLine(feat.coords.map(alwaysNum), offset, len, hasZ, transform);

The transform.scale.xScale etc. are always going to be floats, so you might as well lose Long precision for your input coords immediately. As far as I can tell, there's no way to retain full floating-point precision after multiplying -- Long#mul only outputs integers.

ETA again: I also had to add

    // Default or "upper" origin means Y is flipped
    if (transform?.scale && transform.quantizeOriginPostion !== esriPBuffer.FeatureCollectionPBuffer.QuantizeOriginPostion.lowerLeft) {
        transform.scale.yScale *= -1;
    }

just after initially decoding the payload. It's effectively the same as the formula a few posts back, where in "upperLeft" origin mode (the default), the world Y coordinate is given as yTranslate - (yCoord * yScale).

rowanwins commented 3 years ago

@thw0rted are you planning on publishing an arcgis-pbf-parsing library by any chance? I'd be happy to collaborate on something if you were going to open source it.

rowanwins commented 3 years ago

Okie dokie, well, I've got geometry parsing set up. It needs a bit of tidying, but this is the guts of it.

import {esriPBuffer as EsriPbfBuffer} from './parser/FeatureCollection'
import Long from 'long'
import * as fs from 'fs'

const buffer = fs.readFileSync('a/pbf-with-no-quantization.pbf')

const pbfObj = EsriPbfBuffer.FeatureCollectionPBuffer.decode(buffer)
const transform = pbfObj.queryResult.featureResult.transform

const f = pbfObj.queryResult.featureResult.features[0]
const coords = deltaEncode(f.geometry.coords, transform)

const out = {
  type: "FeatureCollection",
  features: [
    {
      type: "Feature",
      properties: {},
      geometry: {
        type: "Polygon",
        coordinates: [coords]
      }
    }
  ],
  "crs": { "type": "name", "properties": { "name": "urn:ogc:def:crs:EPSG::3857" } }
}

fs.writeFileSync("out.geojson", JSON.stringify(out))

// Apply the provided transform to a single point
function transformTuple(coords, transform) {
  let x = coords[0]
  let y = coords[1]
  let z = coords[2]
  if (transform.scale) {
      x *= transform.scale.xScale;
      y *= -transform.scale.yScale;
      if (undefined !== z) { z *= transform.scale.zScale; }
  }
  if (transform.translate) {
      x += transform.translate.xTranslate;  
      y += transform.translate.yTranslate;
      if (undefined !== z) { z += transform.translate.zTranslate; }
  }
  const ret = [x, y];
  if (undefined !== z) { ret.push(z); }
  return ret;
}

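// Note: despite its name, deltaEncode *decodes* the delta-encoded coords
// (and `difference` below likewise adds each delta to the running total).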
function deltaEncode (arr, transform) {
  const out = []
  const initialX = arr[0]
  const initialY = arr[1]

  out.push(transformTuple([initialX, initialY], transform))

  let prevX = initialX
  let prevY = initialY
  for (let i = 2; i < arr.length; i = i+2) {
    const x = difference(prevX, arr[i])
    const y = difference(prevY, arr[i + 1])
    const transformed = transformTuple([x, y], transform)
    out.push(transformed)
    prevX = x
    prevY = y
  }
  return out
}

function difference(a, b) { 
  // Note this uses the long npm package
  return a.add(b)
}

There are a few bits hard-coded in there but hopefully it provides enough insight as to what needs to happen.

Cheers

thw0rted commented 3 years ago

Yeah, I could publish some of this, but I figured what I had pasted was enough to be useful without being "big" enough to warrant a whole project. Your usage looks good, though it sounds like you're doing a side-effect import of Long and leaving the coords as Long instances until they hit the built-in multiplication operator in transformTuple. I guess they get coerced to number there?

I don't really get why they're using Longs for this anyway. Native JS numbers give around 15 digits of precision. In 4326 SR, "only" 9 decimal places gets you down to millimeter (!) precision at the equator. Is somebody describing their microprocessor designs using ArcGIS features? Otherwise, feels like overkill.

mmgeorge commented 3 years ago

Hmm @thw0rted, not sure exactly what you mean. 32-bit integer coords are not enough precision to represent the world -- you'll get a wobble when zooming in -- unless you specify those coords relative to some local origin. However, JS can represent 53-bit integers safely, so it isn't necessary to generate Longs even though more than 32 bits of precision are required. Note that pbf allows for 32- or 64-bit ints, not values in between.
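
(A quick sanity check of the 53-bit point, as a sketch:)

// Integers up to 2^53 - 1 round-trip exactly in a plain JS number,
// so sint64 coords in that range don't need Long at all.
console.log(Number.MAX_SAFE_INTEGER === 2 ** 53 - 1); // true
console.log(Number.isSafeInteger(632527880));         // true -- the large first delta from the payload above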

When using protobufjs, you can use --force-number to avoid creating longs, which in general I would recommend.

thw0rted commented 3 years ago

Ah! I need to spend more time with protobuf; it didn't even occur to me that the choice was between 32 vs 64, rather than 53 vs 64. (When I said "15 digits", I meant that a double's 53-bit mantissa is accurate to around 15 decimal digits.)

That's a good tip about force-number, I'll definitely go back through and do that. Thanks!

thw0rted commented 3 years ago

FYI, I just got around to trying force-number, and it looks like all it does is remove |Long from the JSDoc (and thus the generated TS) types. As long as $protobuf.util.Long is defined, runtime behavior does not change: the decoder for Geometry still populates coords using reader.sint64(), and that method still actually returns an instance of Long.

I'm using npm to install protobufjs, which declares long as a transitive dependency, so the library is always installed in node_modules. I lost the plot a bit trying to read the source for protobufjs/util, but I think whatever they're doing to test for the presence of long -- possibly something along the lines of eval("require('long')") -- causes Webpack to include it (if you're using Webpack...).

For the time being, rather than wrestling with Webpack to make it exclude long, I'm probably going to leave it alone.
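
(For anyone who does want to keep long out of a webpack 5 bundle, one possible approach -- untested here, so treat it as a sketch -- is an alias stub; protobufjs probes for long with a guarded require and ignores an empty module:)

// webpack.config.js
module.exports = {
  // ...rest of the config...
  resolve: {
    alias: {
      long: false, // webpack 5: resolve the module to an empty stub
    },
  },
};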

rowanwins commented 3 years ago

The other thing I discovered on this front was that if you generate your own parser using the pbf library from Mapbox, it automatically handles the Long issue by simply not supporting Longs :)

FWIW I've published my work here https://github.com/rowanwins/arcgis-pbf-parser

thw0rted commented 3 years ago

Thanks for the heads-up, I wasn't aware of Mapbox's competing library and will definitely have to look into it. My only hangup is that the Mapbox version doesn't seem to generate any kind of type information (JSDoc or .d.ts), and I'm currently on protobuf.js's Typescript typings, so I'd have to find a way to shoehorn those in or roll my own.

When you say long is "not supported", you just mean that higher-precision values will be decoded and stored in a number, right? (As opposed to, say, failing to return the value at all.)

rowanwins commented 3 years ago

When you say long is "not supported", you just mean that higher-precision values will be decoded and stored in a number, right? (As opposed to, say, failing to return the value at all.)

Your interpretation is correct :)

mmgeorge commented 1 month ago

Potentially it would also be good to flesh out more documentation about transforming the geometry (https://github.com/Esri/arcgis-pbf/issues/11). This repo does assume knowledge of the REST API, so it would probably be good to link to it.