KhronosGroup / glTF-Blender-IO

Blender glTF 2.0 importer and exporter
https://docs.blender.org/manual/en/latest/addons/import_export/scene_gltf2.html
Apache License 2.0

Export textures as WebP #1308

Closed makidoll closed 1 year ago

makidoll commented 3 years ago

Is your feature request related to a problem? Please describe. We're working on virtual world software where many 3D models are loaded from different HTTP servers. Features such as Draco and Brotli compression improve file sizes significantly, but for textures we've been using WebP. Our own Blender addon tries to convert textures to WebP and modify the glTF files afterward, but it isn't intuitive and doesn't work with .glb files. Most people avoid it because the result doesn't import back in either.

Describe the solution you'd like When exporting glTF from Blender, under textures you could select WebP. A new set of settings would appear where you can toggle lossless mode, or otherwise select a quality between 0 and 100% (possibly show this for JPEG as well). This should work with .glb files too.

There's an extension for WebP here: https://github.com/KhronosGroup/glTF/blob/master/extensions/2.0/Vendor/EXT_texture_webp/README.md Though we haven't been using it, since WebP just works.

I'm not sure how we could compress/decompress WebP within Blender; for my addon I've packaged the cwebp executable into the addon for Linux, Windows, and macOS.

Describe alternatives you've considered Manually using cwebp and modifying the glTF files to reference the correct textures, then using a tool to package them into a .glb.
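
Roughly, that manual workflow looks like this (a minimal sketch, assuming cwebp is on the PATH and a .gltf with external, non-embedded images; the file names are hypothetical):

    import json
    import subprocess
    from pathlib import Path

    gltf_path = Path("model.gltf")  # hypothetical input file
    gltf = json.loads(gltf_path.read_text())

    for image in gltf.get("images", []):
        uri = image.get("uri")
        if not uri or uri.startswith("data:"):
            continue  # this sketch skips embedded images
        src = gltf_path.parent / uri
        dst = src.with_suffix(".webp")
        # convert the referenced PNG/JPEG to WebP at quality 90
        subprocess.run(["cwebp", "-q", "90", str(src), "-o", str(dst)], check=True)
        # point the glTF at the new file (note: without EXT_texture_webp this
        # relies on the viewer accepting WebP images directly)
        image["uri"] = str(Path(uri).with_suffix(".webp"))
        image["mimeType"] = "image/webp"

    gltf_path.with_name("model.webp.gltf").write_text(json.dumps(gltf, indent=2))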

I hope this can be looked into someday! I can help, since I've touched the addon before, and I might look into this myself.

donmccurdy commented 3 years ago

I'm not sure about adding WebP support to this addon directly ... there are probably two concerns:

  1. We aim to support all official (KHR) extensions that make sense in Blender, but not vendor-specific extensions, and decide on a case-by-case basis for multi-vendor (EXT) extensions. So this would need to be discussed, particularly given the more recent releases of WebP2 and AVIF.
  2. We can't put arbitrary dependencies into this Blender addon. I don't know the exact restrictions, but my understanding is that native code (or Python bindings for native code) has to be added to Blender directly. I believe Blender already depends on OpenImageIO, which has WebP plugins, so that might be the most natural option. I'm hoping we'll be able to contribute KTX2/Basis support to OIIO at some point in the future.

As a workaround, I'm also planning to add WebP compression support to glTF-Transform's CLI (usage: gltf-transform webp in.glb out.glb), along with optimization codecs for PNG and JPEG. You can find my work-in-progress here: https://github.com/donmccurdy/glTF-Transform/pull/148. That PR is currently blocked on an issue in an upstream dependency (https://github.com/GoogleChromeLabs/squoosh/issues/898).

Zingam commented 3 years ago

Wouldn't it make sense to add KTX2 support directly into Blender to open/import/export KTX2 textures? Especially if you are working with game assets.

donmccurdy commented 3 years ago

It would be a very helpful feature, but it's beyond my abilities. It would be great to propose this to Blender and gauge their interest, and whether it would be a direct addition or need to go through OpenImageIO. In the meantime you can convert a glTF model's textures to KTX2 after export with https://gltf-transform.donmccurdy.com/cli.html.

Zingam commented 3 years ago

Thank you very much! Is this tool also available as a standalone executable: https://gltf-transform.donmccurdy.com/cli.html? I've been researching this topic in my spare time and have played with https://github.com/zeux/meshoptimizer to great satisfaction. I am very positive about adopting a Blender/glTF/KTX2 workflow. One topic that still eludes me is shared textures. Exporting a model appears to assume the glTF is a self-contained unit together with its textures. What is the situation if a texture needs to be shared between multiple models in separate files, or something fancier like "mega-textures"? I guess additional custom post-processing would be required for these.

donmccurdy commented 3 years ago

Is this tool also available as standalone executable...

Not at the moment; it relies on having a Node.js and KTX-Software installation. In the future I think it might be possible to make a standalone executable with https://github.com/vercel/pkg.

meshoptimizer's gltfpack is also a great option 👍

What is the situation if a texture needs to be shared between multiple models in separate files

glTF-Transform should handle that fine if you've got the textures in a common folder. Worst case, it would write the same image more than once and take longer than necessary to finish processing. If you're hosting things on the web, libraries like three.js are smart enough to download the image only once when caching is enabled. I think mega-textures / texture atlases would work the same way, but I don't know what tools people use to create them.

It's also possible to bundle many glTF models into one glTF file with multiple scenes, several textures, and binary resources split across multiple .bin files. Just copy/pasting the output of gltf-transform partition -h for details here:

USAGE — partition

    ▸ gltf-transform partition <input> <output> [OPTIONS...]

  Partition binary data for meshes or animations into separate .bin files. In
  engines that support lazy-loading resources within glTF files, this allows
  restructuring the data to minimize initial load time, fetching additional
  resources as needed. Partitioning is supported only for .gltf, not .glb, files.

  ARGUMENTS

    <input>                              Path to read glTF 2.0 (.glb, .gltf) model
    <output>                             Path to write output

  OPTIONS

    --animations                         Partition each animation into a separate .bin file
                                         boolean
    --meshes                             Partition each mesh into a separate .bin file
                                         boolean

  GLOBAL OPTIONS

    -h, --help                           Display global help or command-related help.
    -V, --version                        Display version.
    -v, --verbose                        Verbose mode: will also output debug messages.
    --vertex-layout <layout>             Vertex layout method
                                         one of "interleaved","separate", default:
                                         "interleaved"

atteneder commented 2 years ago

Corresponding OpenImageIO feature issue

gernotziegler commented 2 years ago

Now that Blender 3.2 is out with WebP support, it would be great to integrate support for the EXT_texture_webp extension into the Blender exporter script, maybe with a choice of whether WebP is an optional or required part of the exported glTF (which decides whether JPEGs are exported in addition). Support for WebP-based glTF benefits all applications where storage needs are critical, most often web applications. :-)
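
For reference, this is roughly how the two options look in the glTF JSON per the EXT_texture_webp spec, written here as Python dicts; the image indices and file names are purely illustrative:

    # Variant A: WebP only -- the extension is required, no fallback image.
    webp_required = {
        "extensionsUsed": ["EXT_texture_webp"],
        "extensionsRequired": ["EXT_texture_webp"],
        "images": [{"uri": "albedo.webp", "mimeType": "image/webp"}],
        "textures": [{"extensions": {"EXT_texture_webp": {"source": 0}}}],
    }

    # Variant B: WebP plus a PNG/JPEG fallback -- loaders without WebP support
    # use the texture's regular "source"; the extension stays optional.
    webp_optional = {
        "extensionsUsed": ["EXT_texture_webp"],
        "images": [
            {"uri": "albedo.png", "mimeType": "image/png"},    # fallback
            {"uri": "albedo.webp", "mimeType": "image/webp"},  # preferred
        ],
        "textures": [
            {"source": 0, "extensions": {"EXT_texture_webp": {"source": 1}}},
        ],
    }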

qroolik commented 2 years ago

Yes, an option for textures in WebP format in the glTF exporter would be greatly appreciated :)

donmccurdy commented 2 years ago

Personally I'd be glad to see EXT_texture_webp supported here if Blender's image APIs now allow it. I would advise against including fallback JPEG or PNG textures. As a practical matter, I think that (while glTF allows this) including .webp alongside .jpeg or .png backups will mostly confuse people, and it certainly has bad outcomes when the textures are all embedded in a .glb. For web developers, WebP is widely supported, and exporting two versions of the glTF is a simpler solution if PNG or JPEG is needed.

Support for WebP compression has since been added to gltf-transform as well:

gltf-transform webp input.glb output.glb

MarkCallow commented 1 year ago

Support for WEBP-based GLTF benefits all applications where storage needs are critical, most often: web applications. :-)

To @gernotziegler and all those who've expressed similar opinions: if you have so many or such large images that storage, and presumably transmission, is an issue, how do you avoid running out of GPU memory? WebP images, like JPEG and PNG images, have to be fully decompressed before being uploaded to the GPU as textures.

donmccurdy commented 1 year ago

... if you have so many or such large images that storage, and transmission presumably, is an issue, how do you avoid running out of GPU memory?

I expect that any sufficiently large graphics application — particularly at the scale and complexity of a game — is best served by GPU texture formats like KTX2. And often in much smaller applications too, for exactly the reasons you mention. In the kind of "virtual world software" described by the OP that may well be the best choice.

However, the web has a wide range of other content patterns. Consider VFX on a fancy brand's landing page, or 3D graphic illustrations on a news article. Visitors quickly leave a site if it doesn't load almost instantly, time to first-contentful-paint should be <2s, and page size budgets are commonly measured in kilobytes, not megabytes. Under those constraints I think web developers often want the smallest transmission size they can get, and would expect to use relatively little of their VRAM budget.

Another factor is the comparatively easy configuration of good lossless or near-lossless compression in WebP and AVIF, compared with the more careful tuning often required for KTX2. Raising awareness of the VRAM differences between these formats is an important, ongoing challenge too.

tdw46 commented 1 year ago

I'd like to bump this issue as high priority. Another very common use case for WebP is simply reducing file size and load times for 3D web viewers. I do a lot of 3D client work, and the main complaint about the glTF exports I provide from Blender is their large size; but if I use JPEG instead of PNG, the compression is too lossy compared to WebP.

If, for example, I have two 4K images used as atlas textures in a scene export, together they end up being about 100MB, while the geometry and other file data total another 15-20MB for a 200k-polygon scene. WebP can achieve similar visual fidelity with the same two images at about 8MB apiece, for a total of 31-36MB, roughly a 70% reduction in download size and load time.

Another format that should be considered is OpenEXR or some other HDR image format. Mozilla Hubs has a glTF exporter and extension for Blender that uses HDR images for cubemaps, and it works wonderfully. I have specific use cases for many clients who want emissive materials atlased into their textures with brightness greater than 1.

MarkCallow commented 1 year ago

You cannot assume a smaller file size on the server is better; you need to weigh all the tradeoffs. For example, if you use WebP, those two 4K textures will end up taking about 100MB of GPU memory, because they have to be fully expanded, and loading that into the GPU will take some time. That time can be traded off against the increased transmission time of a larger, GPU-friendly block-compressed format. Currently the only such formats supported by glTF are those offered by the KHR_texture_basisu extension. See this page for more information.
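
As a rough back-of-the-envelope illustration of that difference (approximate numbers; assumes decoded images are uploaded as RGBA8, and a block-compressed GPU format at 4 bits per texel such as those available via KHR_texture_basisu):

    texels = 4096 * 4096                      # one 4K texture
    rgba8_mib = texels * 4 / 2**20            # decoded WebP/PNG/JPEG: 4 bytes per texel
    bc4bpp_mib = texels * 0.5 / 2**20         # block-compressed at 4 bits per texel
    print(f"RGBA8:            ~{rgba8_mib:.0f} MiB per texture")    # ~64 MiB
    print(f"4 bpp GPU format: ~{bc4bpp_mib:.0f} MiB per texture")   # ~8 MiB
    # A full mipmap chain adds roughly another third on top of either figure.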

When comparing quality you need to view the images on the relevant models. Since much of the data will be viewed at less than full size, you may be able to use 2K images instead of 4K without any significant loss of quality in the final rendering.

Networks are getting faster. (I've had multiple companies vying to offer me gigabit FTTH which I now have. 10 gigabit is also available but overkill for me.) GPU memory use will quickly become the more pressing issue.

For HDR textures, the glTF WG will need to create an extension or adopt one from the community.

I don't know what the plans are for the Blender exporter.

I expect a new glTF compression tool soon which will support WebP as well as the current compression formats. Also, we will very soon release a new set of KTX tools which, among many other things, support creating HDR textures from EXR images and exporting HDR texture images to EXR.

gernotziegler commented 1 year ago

You cannot assume smaller file size on the server is better. You need to weigh all the tradeoffs.

For my application case, I am fully aware of the tradeoffs. Having worked with OpenGL for 20 years, I know the GPU memory implications of KTX2 versus uncompressed textures (I have tried KTX2 as well).

In my case, the loading time into the GPU (the bus transfer) is negligible compared to what the download of 3 additional megabytes can take over a mobile connection. And it is these download times and data volumes for mobile users that were the concern for the Art History Museum of Vienna, Austria (the page in question is https://ironmen.khm.at ).

I must correct you on WebP texture support inside glTF. There is support for WebP; it is just the Blender export plugin that does not support it yet. https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/EXT_texture_webp/README.md

julienduroure commented 1 year ago

About Blender and Blender-glTF-IO:

MarkCallow commented 1 year ago

In my case, the loading time into the GPU (the bus transfer) is negligible in comparison to what the download of 3 additional Megabytes can take over a mobile connection. And it is these download times and amounts for mobile users that were the concern for the Art History Museum of Vienna, Austria (the page of concern was https://ironmen.khm.at/ ).

Thank you for the insight. If the use case is mobile devices I recommend you consider 2k images instead of 4k.

I must correct you on WebP texture support inside GLTF. There is support for WebP

I did not say there wasn't. I said HDR, meaning EXR, OpenEXR, or GPU formats for HDR, requires glTF extensions. If WebP supports HDR, something I am not aware of, then I apologize for my careless wording.

gernotziegler commented 1 year ago

Thank you for the insight. If the use case is mobile devices I recommend you consider 2k images instead of 4k.

The page has a dynamic choice of texture resolution (4K, 2K) as well, of course. In fact, the asset has four possible sources of texture data: 2K-JPEG, 4K-JPEG, 2K-WebP, 4K-WebP. The page's JavaScript chooses 2K or 4K depending on what three.js has determined to be the maximum texture size, and the glTF contains the WebP as an additional but optional source of texture data. Example of the resolution choice: https://www.geofront.eu/hosted/khm/wexhibit140/mainray.js

let chosen_variant = '4Kwebp';
if (renderer.capabilities.maxTextureSize < 4096) chosen_variant = '2Kwebp';

(one could also choose 2K depending on download bandwidth, but we found that most mobile viewers only allow 2K as the maximum texture resolution anyway)

JPEG or WebP (chosen implicitly by three.js' glTF loader): https://www.geofront.eu/hosted/khm/wexhibit140/models/harnisch/4Kwebp/model.gltf See the sections "extensions" and "images".

I must correct you on WebP texture support inside GLTF. There is support for WebP

I did not say there wasn't. I said HDR, meaning EXR, OpenEXR, or GPU formats for HDR, requires glTF extensions. If WebP supports HDR, something I am not aware of, then I apologize for my careless wording.

I also miss support for HDR in the glTF format, and in the browser itself. One could use the alpha channel of a lossless WebP to transmit an exponent for each pixel, or transmit the channels through two WebPs, where one file carries the exponent for each pixel's color channels (this WebP must of course be lossless). Also, three.js can read deep-channel OpenEXRs, and Blender can export them; thus it is easy to side-channel glTF for a start along the modeling->viewer path. What is left, then, is to come up with an experimental extension that can later become part of a standardization process.
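
As a rough sketch of the alpha-channel-exponent idea (essentially a shared-exponent, RGBE-style encoding; the function names are just for illustration, and the resulting 8-bit RGBA array would need to be written to a lossless WebP):

    import numpy as np

    def encode_rgbe(hdr):
        # hdr: float32 array of shape (H, W, 3) with linear radiance values
        max_c = hdr.max(axis=-1)
        exp = np.where(max_c > 0, np.ceil(np.log2(np.maximum(max_c, 1e-32))), 0.0)
        scale = np.where(max_c > 0, 255.0 / np.exp2(exp), 0.0)[..., None]
        rgb = np.clip(hdr * scale, 0, 255).round().astype(np.uint8)
        alpha = np.clip(exp + 128, 0, 255).astype(np.uint8)[..., None]  # biased exponent
        return np.concatenate([rgb, alpha], axis=-1)  # (H, W, 4) uint8

    def decode_rgbe(rgbe):
        rgb = rgbe[..., :3].astype(np.float32)
        exp = rgbe[..., 3:].astype(np.float32) - 128.0
        return rgb / 255.0 * np.exp2(exp)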

donmccurdy commented 1 year ago

@gernotziegler I expect there would be substantial objections to standardizing use of OpenEXR textures within glTF files. What @MarkCallow describes above, "creating HDR [KTX2] textures from EXR images" is far more runtime-friendly for engines like three.js.

vis-prime commented 1 year ago

Importing WebP will also be part of this, right? Or should I make a separate issue?

julienduroure commented 1 year ago

Importing WebP will also be part of this, right? Or should I make a separate issue?

No need for a new ticket, I will manage import/export at the same time