KhronosGroup / glTF-Validator

Tool to validate glTF assets.
Apache License 2.0

Animation data validation #20

Open lexaknyazev opened 7 years ago

lexaknyazev commented 7 years ago

Right now, only a couple animation binary data checks are performed (in addition to accessor.count, type, and componentType):

Next possible features:

Thoughts? @pjcozzi @bghgary @emackey @javagl @UX3D-nopper @donmccurdy
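For context (an illustration, not part of this thread): a minimal TypeScript sketch of the kind of binary-data check being discussed, assuming the sampler's input (time) accessor values have already been decoded to floats. The glTF 2.0 spec requires those values to be non-negative and strictly increasing; the function and type names below are hypothetical, not the validator's actual API.

```typescript
// Hypothetical helper, not glTF-Validator code: report the two binary-data
// problems the glTF 2.0 spec defines for an animation sampler's input
// (time) accessor, given its decoded float values.
interface InputCheckResult {
  negativeTime: boolean;   // some value is < 0
  nonIncreasing: boolean;  // some value is <= its predecessor
}

function checkSamplerInput(times: Float32Array): InputCheckResult {
  const result: InputCheckResult = { negativeTime: false, nonIncreasing: false };
  for (let i = 0; i < times.length; i++) {
    if (times[i] < 0) result.negativeTime = true;
    if (i > 0 && times[i] <= times[i - 1]) result.nonIncreasing = true;
  }
  return result;
}

// A valid input, then one with a repeated timestamp.
console.log(checkSamplerInput(new Float32Array([0, 0.5, 1.0])));
console.log(checkSamplerInput(new Float32Array([0, 0.5, 0.5, 1.0])));
```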

pjcozzi commented 7 years ago

All the new checks sound worthwhile to me.

donmccurdy commented 7 years ago

Looks good!

Detecting unneeded frames as an INFO or WARNING-level log sounds more useful than a constant-framerate check, I think.
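As a rough illustration (my sketch, not glTF-Validator code) of what such an INFO/WARNING-level report could test: for a scalar LINEAR-interpolated channel, a keyframe is a removal candidate when linearly interpolating its neighbors already reproduces its value within a tolerance. The names and the epsilon are assumptions.

```typescript
// Hypothetical sketch: indices of keyframes in a scalar LINEAR curve whose
// values are reproduced (within epsilon) by interpolating their neighbors,
// i.e. candidates for an "unneeded keyframe" report.
function findRedundantKeyframes(
  times: Float32Array,
  values: Float32Array,
  epsilon = 1e-6
): number[] {
  const redundant: number[] = [];
  for (let i = 1; i < times.length - 1; i++) {
    const t = (times[i] - times[i - 1]) / (times[i + 1] - times[i - 1]);
    const interpolated = values[i - 1] + t * (values[i + 1] - values[i - 1]);
    if (Math.abs(values[i] - interpolated) <= epsilon) redundant.push(i);
  }
  return redundant;
}

// The middle keyframe lies exactly on the line between its neighbors.
console.log(findRedundantKeyframes(
  new Float32Array([0, 1, 2]),
  new Float32Array([0, 5, 10])
)); // [1]
```

Note that this flags candidates independently; removing one flagged keyframe can change whether its neighbors are still redundant, which is one reason it fits better as a report than as a hard rule.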

emackey commented 7 years ago

One thing to consider is where we draw the line between the responsibilities of the glTF exporter vs. the asset author. For example, in the case of unneeded keyframes and degenerate triangles and so on, some asset creation packages are prone to allowing the author to produce large quantities of these in the normal course of creating an asset, often without much indication. Are all glTF exporters expected to weed these out and optimize the output, just to pass validation with no warnings?

We have some content authors pushing back on this notion in the COLLADA2GLTF repo. They are authoring glTFs with the occasional empty node that will be used for some purpose in their runtime, or a UV map that is not referenced within the glTF but that the runtime will use for its own purposes. If the optimization stages are too aggressive, we limit some of the usefulness and flexibility of glTF, flexibility that is apparently encouraged through the use of extras that could otherwise be optimized away.

It might be nice for the validator to have a mode where it reports possible un-optimized content, such as redundant animation keyframes, but (a) this speaks more to the asset author or the authoring process than to the quality of the exporter code, and (b) some authors want the glTF to preserve things that others could view as un-optimized.

scurest commented 6 years ago

Un-needed samples aren't necessarily a sign of an unoptimized asset. Removing such a sample may lose sharing between accessors and thus increase overall size. Suppose you have a bunch of floats like this

t0 t1 t2 ... tn
a0 a1 a2 ... an
b0 b1 b2 ... bn

and you have two curves: both use t0...tn as inputs, the first uses a0...an as outputs, and the second uses b0...bn as outputs. If exactly one point, say the mth, in the middle of the second curve is un-needed, removing it means the second curve can no longer share its input accessor with the first: the second curve now needs its own input accessor holding its n remaining timestamps, while the first still needs all n+1. You drop one output value but add n input values (plus the JSON for the extra accessor), netting an increase of >4(n-1) bytes.
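To make that accounting concrete, here is a small TypeScript sketch of my own (assuming scalar float outputs) that reproduces the figure above: dropping one unneeded sample from the second curve saves one output value but costs that curve a private input accessor of n timestamps.

```typescript
// Byte accounting for the example above (an illustration, not validator code).
// With n+1 shared timestamps t0...tn and two scalar float curves a and b:
function binarySizeBytes(sharedInput: boolean, n: number): number {
  const FLOAT = 4;
  if (sharedInput) {
    // one input accessor (n+1 values) + outputs of a (n+1) + outputs of b (n+1)
    return 3 * (n + 1) * FLOAT;
  }
  // input of a (n+1) + outputs of a (n+1) + own input of b (n) + outputs of b (n)
  return (2 * (n + 1) + 2 * n) * FLOAT;
}

const n = 100;
const before = binarySizeBytes(true, n);      // 1212 bytes
const after = binarySizeBytes(false, n);      // 1608 bytes
console.log(after - before === 4 * (n - 1));  // true: +4(n-1) bytes of binary data,
                                              // before counting the JSON for the extra accessor
```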