erikdahlstrom closed this issue 6 years ago
I like the duplicate detection, but it could be slow, so a flag would be interesting here too. More generally, finding duplicate resources in a glTF and simplifying it could be a function of the toolkit.
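As a rough sketch of what such a duplicate-resource scan might look like (a hypothetical helper, not part of glTF-Toolkit; resource payloads are assumed to already be available as raw bytes):

```python
import hashlib
from collections import defaultdict

def find_duplicate_resources(resources):
    """Group byte-identical resources by SHA-256 hash.

    `resources` is a list of bytes objects (e.g. image or buffer
    payloads extracted from a glTF). Returns a dict mapping each
    hash that occurs more than once to the indices sharing it.
    """
    by_hash = defaultdict(list)
    for index, payload in enumerate(resources):
        digest = hashlib.sha256(payload).hexdigest()
        by_hash[digest].append(index)
    return {h: idxs for h, idxs in by_hash.items() if len(idxs) > 1}

# Resources 0 and 2 are byte-identical, so they form one group.
dupes = find_duplicate_resources([b"texA", b"texB", b"texA"])
```

Hashing every payload is what makes this potentially slow on large assets, which is why gating it behind a flag makes sense.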
As for performing LOD merging first: we could invert the order, but that also means we would have to apply changes to all affected LODs when packing, which can be a little tricky. I like having LOD merging be the last step.
I think my ideal solution would be to pack LOD 0 first, cache a hash of each resource when adding it to the final glTF, and use those hashes to de-dupe when packing the other LODs. More specifically:
That way we won't have duplicates and we only pack each resource once, at the cost of hashing each file during packing (i.e. an optional parameter would be the map<resource hash, index of the final resource>).
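The scheme above could be sketched like this (a minimal mock of the packing step; `pack_resource` and its parameters are hypothetical names, not actual glTF-Toolkit API):

```python
import hashlib

def pack_resource(payload, packed, cache):
    """Append `payload` to the packed output unless an identical
    resource was already packed; return the final resource index.

    `cache` plays the role of the optional
    map<resource hash, index of the final resource>.
    """
    digest = hashlib.sha256(payload).hexdigest()
    if digest in cache:
        return cache[digest]  # re-use the copy packed earlier
    packed.append(payload)
    index = len(packed) - 1
    cache[digest] = index
    return index

packed, cache = [], {}
# Pack LOD 0 first, caching a hash per resource...
lod0_indices = [pack_resource(p, packed, cache) for p in [b"mat", b"tex"]]
# ...then packing LOD 1 de-dupes against the cache: its "mat"
# payload maps back to index 0 instead of being packed again.
lod1_indices = [pack_resource(p, packed, cache) for p in [b"mat", b"tex2"]]
```

After both passes, `packed` holds three payloads rather than four; each resource was hashed once per occurrence but packed only once.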
What do you think @erikdahlstrom ?
Sounds good to me. I think the merger needs a bit more work to handle the case where only some resources are shared, especially with respect to getting the right offsets for all the dependent fields in the glTF JSON.
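The partially-shared case means the merger cannot simply concatenate arrays: every index field in an incoming LOD's JSON must be remapped to the final position of its (possibly de-duped) resource. A toy sketch of that remapping, using glTF's `images`/`textures` arrays as the example (de-duping by structural equality here for brevity, rather than by the hash scheme above):

```python
def merge_images(final_doc, lod_doc):
    """Merge lod_doc's images into final_doc, de-duping identical
    entries, and fix up the dependent index fields (texture.source)
    in lod_doc. Returns the old-index -> new-index map.
    """
    remap = {}
    for old_index, image in enumerate(lod_doc["images"]):
        try:
            # Shared resource: point at the existing entry.
            new_index = final_doc["images"].index(image)
        except ValueError:
            final_doc["images"].append(image)
            new_index = len(final_doc["images"]) - 1
        remap[old_index] = new_index
    # Every field that referenced the old indices must be rewritten.
    for texture in lod_doc.get("textures", []):
        texture["source"] = remap[texture["source"]]
    return remap

final = {"images": [{"uri": "base.png"}, {"uri": "shared.png"}]}
lod = {"images": [{"uri": "shared.png"}, {"uri": "lod1.png"}],
       "textures": [{"source": 0}, {"source": 1}]}
remap = merge_images(final, lod)
# shared.png is re-used at index 1; lod1.png is appended at index 2.
```

In the real toolkit the same fix-up would apply to every index-bearing field (accessors, bufferViews, materials, and so on), which is where getting the offsets right gets fiddly.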
I have the addition of the command-line flag on a branch (https://github.com/erikdahlstrom/glTF-Toolkit/tree/add_material_flag); I'd be happy to submit it as a pull request if you want.
Thanks! I think you should send it as a PR once we have the hashing described above.
Closed by #14
Given a set of glTF input LODs that have the same materials for each LOD:
Some possible solutions: