immersive-web / webxr-input-profiles

WebXR Gamepad assets, source library, and schema
https://immersive-web.github.io/webxr-input-profiles

Compressing asset textures and a possible asset pipeline optimization? #155

Open · riccardogiorato opened this issue 4 years ago

riccardogiorato commented 4 years ago

What I have done

I did a rough compression pass on some of the biggest assets: unpacking each glb to glTF with texture extraction, compressing the textures, and then repacking to glb. I'm running these tests in my fork: https://github.com/Giorat/webxr-input-profiles

I used:

All of this was done with a simple .bat file since I'm on Windows, but it should be straightforward to replicate with a Linux/macOS shell script.
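For reference, here's a minimal Node sketch of that round trip. It assumes the Cesium gltf-pipeline package; the file names, the resource directory, and the external texture-compression step in the middle are placeholders:

```js
// Sketch: unpack a .glb to .gltf with separate textures so an external
// tool can compress them, then repack. Assumes the Cesium gltf-pipeline
// package; file names and paths are placeholders.
const gltfPipeline = require("gltf-pipeline");
const fsExtra = require("fs-extra");

async function unpack(glbPath) {
  const glb = fsExtra.readFileSync(glbPath);
  // With separateTextures, the result includes a map of texture files
  // (relative path -> buffer) alongside the glTF JSON.
  const results = await gltfPipeline.glbToGltf(glb, { separateTextures: true });
  fsExtra.writeJsonSync("model.gltf", results.gltf);
  for (const relativePath of Object.keys(results.separateResources)) {
    fsExtra.writeFileSync(relativePath, results.separateResources[relativePath]);
  }
}

async function repack(gltfPath) {
  const gltf = fsExtra.readJsonSync(gltfPath);
  // resourceDirectory points at the (now compressed) external textures,
  // which get embedded back into the binary .glb.
  const results = await gltfPipeline.gltfToGlb(gltf, { resourceDirectory: "." });
  fsExtra.writeFileSync("model.glb", results.glb);
}

// Usage: unpack("model.glb"), compress the extracted textures with your
// tool of choice, then repack("model.gltf").
```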

Idea for an optimization pipeline

If it's needed, I could also add, in whatever folder you prefer, a more complete Node script to automatically compress new models for future assets. This could also be useful if we want to automatically produce models with Draco compression or Basis textures.
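As a rough starting point, the batch script could look something like this, again assuming gltf-pipeline, with the input/output folders as placeholders:

```js
// Sketch: batch-apply Draco mesh compression to every .glb in a folder.
// Assumes the Cesium gltf-pipeline package; paths are placeholders.
const gltfPipeline = require("gltf-pipeline");
const fsExtra = require("fs-extra");
const path = require("path");

async function compressAll(inputDir, outputDir) {
  for (const file of fsExtra.readdirSync(inputDir)) {
    if (path.extname(file) !== ".glb") continue;
    const glb = fsExtra.readFileSync(path.join(inputDir, file));
    // processGlb re-emits the binary with Draco applied to the meshes.
    const results = await gltfPipeline.processGlb(glb, {
      dracoOptions: { compressionLevel: 10 },
    });
    fsExtra.outputFileSync(path.join(outputDir, file), results.glb);
  }
}

compressAll("assets", "assets-draco").catch(console.error);
```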

Side question, also about the models

I have seen that some of the assets have PNG textures while others have JPG. Which format should be used for all of them? Using PNG could help keep the quality consistent, without artifacts due to lossy compression.

riccardogiorato commented 4 years ago

Taken from @toji's comment on my closed first PR here: https://github.com/immersive-web/webxr-input-profiles/pull/156 "I tried reading up on tinypng, but I'm still fuzzy on a couple of things. It is a lossy compression algorithm, right? While that is probably OK for base color maps, I'm concerned that it would introduce noticeable artifacts into non-sRGB inputs, such as normal maps or metallic-roughness maps. Also, does repeatedly running tinypng on an image cause stacking artifacts? And just to verify, the compressed files are still just standard PNGs, right?

To answer a question from the related issue: I would be totally fine with changing any JPG textures to PNG as long as it doesn't significantly increase file size. Many of the textures have large sections of flat or similar colors, so I suspect that we should be OK on that front with the tinypng compression."
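On the "standard PNGs" question: as far as I can tell, yes. TinyPNG quantizes the image to a smaller palette but writes out a plain PNG. It also has an official Node client ("tinify" on npm) that could slot into an automated pipeline; a minimal sketch, with the API key and file names as placeholders:

```js
// Sketch: compress one texture through TinyPNG's official Node client
// ("tinify" on npm). The key and file names are placeholders.
const tinify = require("tinify");
tinify.key = "YOUR_TINYPNG_API_KEY";

// The output is still a standard PNG, just quantized to fewer colors.
tinify
  .fromFile("textures/basecolor.png")
  .toFile("textures/basecolor.min.png")
  .then(() => console.log("compressed"))
  .catch((err) => console.error(err.message));
```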

I'll post more options as comments here so we can discuss them before I open any new PR for this issue.

AdaRoseCannon commented 4 years ago

Compressing the textures will help with download size, but they will still be loaded uncompressed in memory on the end user's hardware.

Once Basis support is in glTF, it may be better to use that format to get the compression benefits both on download and in graphics memory.

An additional form of compression that could be applied is Draco compression for the geometry, which I think is already supported in glTF, though I haven't tried it myself yet.

riccardogiorato commented 4 years ago

> Compressing the textures will help with download size, but they will still be loaded uncompressed in memory on the end user's hardware.

In some cases this won't happen if you simply reduce the number of colors in the textures. But it will definitely change with Draco, which will require some decompression time.

toji commented 4 years ago

glTF already supports Draco (it's used for all the meshes on https://xrdinosaurs.com, for example), and while Basis isn't currently embeddable, it could be distributed separately alongside the mesh. But that's not the real holdup with using those two libraries.

The primary issue is that we want to ensure the assets are accessible to as many pages as possible. While implementing Draco and Basis isn't overly difficult, we still don't want it to be a barrier to entry, and both libraries come with download and startup overhead that may not make sense if these controller assets are the only thing you're using them for.
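To make that overhead concrete, here's roughly the extra setup a page using three.js would need before it could consume Draco-compressed assets. This is just a sketch: three.js, the decoder path, and the asset file name are assumptions, and pages not using three.js would have to wire up the raw Draco decoder themselves.

```js
// Sketch of the consumer-side cost of Draco: the page has to ship and
// initialize a separate decoder before any compressed mesh will load.
// The decoder path and asset file name are placeholders.
import * as THREE from "three";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader.js";
import { DRACOLoader } from "three/examples/jsm/loaders/DRACOLoader.js";

const scene = new THREE.Scene();

const dracoLoader = new DRACOLoader();
dracoLoader.setDecoderPath("/libs/draco/"); // the page must also host the decoder files

const loader = new GLTFLoader();
loader.setDRACOLoader(dracoLoader);

loader.load("controller-draco.glb", (gltf) => {
  scene.add(gltf.scene);
});
```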

It is absolutely possible that we could make Draco/Basis compression part of the build process and host separate assets in a different location for pages that do want to opt in to using them, but I don't have the time to put into that tooling at the moment. Also, given that we'd then be hosting multiple copies of each asset, I'd prefer to have a dedicated CDN available before making that leap. In the meantime, a compression method like this introduces no new overhead on the part of the page, which makes it an ideal first step toward shrinking our asset sizes.