playcanvas / editor

Issue tracker for the PlayCanvas Editor
https://playcanvas.com/

glTF asset import followup work #1000

Open slimbuck opened 1 year ago

slimbuck commented 1 year ago

Following on from #963, these are the remaining work items:

tims-realityi commented 6 days ago

I am interested in "Support importing gltf files (as opposed to glb) with embedded data", if I'm understanding it correctly, for the following purpose. I have a large project in Blender, and a full GLB export exceeds the PlayCanvas Editor's 340 MB asset import limit. I would like to run the textures through an external texture compression script before importing them into the project. So I want a way to import the images separately, yet have PlayCanvas automatically map the textures to the materials for the model. Blender can export glTF as separate files: the mesh binary, the .gltf file with the mapping and materials, and the individual texture files. If I could import the .gltf and images separately from the mesh binary, and PlayCanvas could map it all together just like it does when everything is packaged in a GLB, then all would be right in the world.
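For context, the "separate files" layout described here is just the .gltf JSON referencing the mesh binary and textures by relative URI, so any import tool can enumerate those references. A minimal sketch in Python (the document fragment and file names are made up for illustration; embedded resources would instead use `data:` URIs):

```python
# A minimal, hypothetical glTF document of the kind Blender's
# "glTF Separate" export produces: external files referenced by URI.
gltf = {
    "asset": {"version": "2.0"},
    "buffers": [{"uri": "scene.bin", "byteLength": 1024}],
    "images": [
        {"uri": "textures/wall_diffuse.png"},
        {"uri": "textures/wall_normal.png"},
    ],
}

def external_uris(doc):
    """Collect every external file a glTF document references.

    Embedded resources use data: URIs, which are skipped here.
    """
    uris = []
    for section in ("buffers", "images"):
        for entry in doc.get(section, []):
            uri = entry.get("uri", "")
            if uri and not uri.startswith("data:"):
                uris.append(uri)
    return uris

print(external_uris(gltf))
# ['scene.bin', 'textures/wall_diffuse.png', 'textures/wall_normal.png']
```

An importer that resolves these URIs against already-uploaded assets would get the automatic texture-to-material mapping described above.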

Maksims commented 5 days ago

> I am interested in "Support importing gltf files (as opposed to glb) with embedded data", if I'm understanding it correctly, for the following purpose. I have a large project in blender, and a full export GLB is larger than PlayCanvas editor's 340 MB asset import limit. I would like to be able to run the textures through an external texture compression script, before importing the textures into the project. So, I want a way to be able to import the images separately, yet have PlayCanvas able to automatically map the textures to materials for the "model". Blender has an export feature to export glTF as separate files, including mesh binary, glTF for mapping & materials etc, and separate texture files. So, if I could separately import the glTF & images separately from the mesh binary, and PlayCanvas could map it all together just like does when packaged together in a GLB, then all would be right in the world.

This is not directly about GLB import, but it is related.

It can be tempting to do level editing in a modeling tool (e.g. Blender), but it is strongly discouraged for several reasons:

  1. While GLB is a runtime format, it is not aware of application logic or loading processes, so you don't want a single uber file.
  2. The application should be in charge of how things are loaded, and loading multiple files in parallel is better than loading one large file. This is also beneficial for caching and for pushing updates.
  3. The settings on engine materials and textures do not map 1:1 to their GLB counterparts, so you will want to adjust and specify more precise properties in the engine, ensuring full control over how things work and look.
  4. Publicly facing applications that load more than 20 MB of data usually appear "failed to load" to users, so it is strongly encouraged to avoid large files and use lazy loading as much as possible, even in a professional-tools context.
  5. Browsers will not cache large files, making subsequent loads slow.
  6. CDNs cost money, and not using one leads to extremely slow download times.

That is why decomposing a GLB into separate assets (materials, textures, etc.) is very beneficial in the Editor, and from a workflow point of view, the same applies to splitting large models into smaller pieces.

tims-realityi commented 5 days ago

Thanks for the great and informative response, Maksims!

I don't really want to import a 340 MB scene. I was thinking I would try overwriting the images afterward (though I'm realizing now that might not work). Anyway, I just need the mapping between all the textures and the materials/meshes.

I'm definitely keeping page download size and file sizes down. That's part of why I want this ability: it would let me compress many images from source quality down to a web target quality optimized for performance. This seems like a valuable pattern, maintaining high-quality 3D model source assets that can be down-sized as needed for different applications.

So, I could export from Blender, run a script (call it compressTexturesAccordingToFilename.sh), set Project Settings > Asset Tasks to compress meshes as Draco, import into PlayCanvas, and the source files would be optimized for web page download performance. I could then apply Basis, etc., for further optimization. But the ability to fine-tune the source images, and to run the full set through a compression script that compresses to specifications determined by a naming pattern, is a significant performance-related workflow.
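The "compress according to a naming pattern" idea could be sketched as follows. Everything here is hypothetical (the suffix convention, the helper name, and the default quality are invented for illustration, not an existing PlayCanvas or Blender feature); the actual recompression step would shell out to an external tool such as ImageMagick, and only the naming-pattern logic is shown:

```python
import re

# Hypothetical naming convention: a suffix in the texture filename selects
# the target compression quality, e.g.
#   wall_diffuse.q80.png  -> recompress at quality 80
#   ui_icons.lossless.png -> keep untouched
DEFAULT_QUALITY = 75  # assumed fallback for files with no suffix

def quality_for(filename):
    """Decide a target compression quality from the filename alone.

    Returns None for files marked lossless, an int quality otherwise.
    """
    if ".lossless." in filename:
        return None
    m = re.search(r"\.q(\d{1,3})\.", filename)
    if m:
        return min(int(m.group(1)), 100)  # clamp nonsense like .q250.
    return DEFAULT_QUALITY
```

A wrapper script would walk the exported texture folder, call `quality_for` on each file, and invoke the compressor with that setting before the assets are uploaded to the Editor.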

If the Editor can import a GLB and convert the connective tissue into a "template" object (if I have that right), then I would imagine a similar workflow could apply when a user imports a glTF after having already imported mesh and texture assets whose filenames match those referenced in the glTF data. Maybe only look in the same asset folder? And display a notification that importing this way will look for assets matching the names in the glTF?
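The matching step being proposed might look something like this sketch (none of it is existing Editor behavior; the helper and the folder layout are hypothetical, with a throwaway temp directory standing in for the asset folder):

```python
import tempfile
from pathlib import Path

def match_assets(image_uris, asset_folder):
    """Pair the image URIs listed in a .gltf with files already present
    in the given folder, matching by base filename only (the glTF may
    use subfolder-relative URIs like 'textures/wall_diffuse.png')."""
    folder = Path(asset_folder)
    existing = {p.name: p for p in folder.iterdir() if p.is_file()}
    matched, missing = {}, []
    for uri in image_uris:
        name = Path(uri).name
        if name in existing:
            matched[uri] = existing[name]
        else:
            missing.append(uri)
    return matched, missing

# Usage: one of the two referenced textures exists in the folder.
with tempfile.TemporaryDirectory() as d:
    (Path(d) / "wall_diffuse.png").touch()
    matched, missing = match_assets(
        ["textures/wall_diffuse.png", "textures/wall_normal.png"], d)
    print(sorted(matched), missing)
    # ['textures/wall_diffuse.png'] ['textures/wall_normal.png']
```

The `missing` list is what the notification mentioned above could surface to the user.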

I agree that the workflow I'm suggesting has the tradeoff that some PlayCanvas properties would need to be manually adjusted afterward to fine-tune things, and that this would be a repetitive task for full glTF re-imports.

I'm realizing that the item slimbuck listed may refer to what I'm describing: "Support importing gltf files with external references (as zip archive perhaps?)"