Closed bennlich closed 3 years ago
Hm, I suppose it has that limitation to avoid DoS attacks... but surprised it needs that much memory for an 8K texture, I would think that would be closer to 250 MB...
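As a rough sanity check (pure arithmetic, nothing the library reports), a decoded RGBA buffer costs width × height × 4 bytes; the helper name here is illustrative:

```javascript
// Rough memory estimate for a decoded RGBA image (pure arithmetic).
function rgbaMegabytes(width, height) {
  return (width * height * 4) / (1024 * 1024);
}

console.log(rgbaMegabytes(8192, 8192)); // 256
```

So a single 8K texture is about 256 MB decoded, before counting any working buffers the decoder allocates on top of the output.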
I've "wrapped" the get-pixels dependency already (https://github.com/donmccurdy/ndarray-pixels/) to provide web browser support; maybe there's some workaround there if the get-pixels maintainer doesn't have other suggestions.
Are you able to successfully run the script below for your model at https://gltf.report/? It's a pretty big memory ask for a web browser, but worth a try; in this case it relies on Canvas 2D instead of get-pixels (which is Node.js-only).
import { textureResize } from '@gltf-transform/functions';
await document.transform(textureResize({size: [16, 16]}));
Unfortunately there doesn't seem to be much to be done about this; it may be necessary to resize these images separately with a project like sharp. I'll keep an eye on the upstream bug in get-pixels.
Hi @donmccurdy, I'm still encountering this issue (8K texture, memory exceeded) with the 2.1.5 release. I wonder, shouldn't the @squoosh/lib integration have fixed this? Also, I tried to run texture resize on https://gltf.report/ with the same file and it worked just fine. Can you give me some hints here?
Hi @proto-ziii! No changes have been made here; the summary is:
- I can't control the memory limits in browsers or Node.js, so https://gltf.report/ may be able to access more memory than your Node.js environment, or may have a lower memory footprint for other reasons.
- More efficient resizers include @squoosh/lib and sharp, and either could be used for a custom version of the textureResize() function. I'll likely switch to @squoosh/lib for resizing only if a web build is published (https://github.com/GoogleChromeLabs/squoosh/issues/1084).
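A sketch of what such a custom version might look like, assuming `sharp` and `@gltf-transform/core` are installed; `resizeTextures`, `fitWithin`, and `MAX_SIZE` are illustrative names, not library API:

```javascript
const MAX_SIZE = 2048;

// Compute [width, height] scaled to fit within `max`, preserving aspect ratio.
function fitWithin(width, height, max) {
  const scale = Math.min(1, max / Math.max(width, height));
  return [Math.round(width * scale), Math.round(height * scale)];
}

// Wiring sketch (untested): re-encode each texture through sharp instead of
// get-pixels. sharp is loaded lazily so the helper above stays dependency-free.
async function resizeTextures(document) {
  const sharp = (await import('sharp')).default;
  for (const texture of document.getRoot().listTextures()) {
    const image = texture.getImage();
    if (!image) continue;
    const meta = await sharp(Buffer.from(image)).metadata();
    const [w, h] = fitWithin(meta.width, meta.height, MAX_SIZE);
    const resized = await sharp(Buffer.from(image)).resize(w, h).toBuffer();
    texture.setImage(new Uint8Array(resized));
  }
}
```

The `document` argument here would be a glTF-Transform `Document` loaded via `NodeIO`; sharp streams the decode/resize natively, which avoids get-pixels' in-memory decode entirely.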
I see. Will try to implement sharp. Thanks for the details and this awesome project!
Just wanted to chime in that I've also run into this now, found this issue, and am hoping for @squoosh/lib integration at some point :)
/cc https://github.com/donmccurdy/glTF-Transform/issues/647
For folks running into this issue, the manual workaround for glTF files hitting this error would be to unpack the textures (if they aren't already separate) and resize large textures manually.
gltf-transform cp input.glb output.gltf
The number and size of textures both contribute to memory use, and keeping textures ≤4K is generally recommended. Aim for powers of two on each side, like 512px, 1024px, or 2048px; the textures do not need to be square.
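Picking those dimensions can be done mechanically; a small helper (illustrative, not part of the gltf-transform CLI) that snaps a dimension down to the nearest power of two and clamps it to 4K:

```javascript
// Snap a dimension down to the nearest power of two, clamped to `max`.
// Illustrative helper, not part of the gltf-transform CLI.
function floorPowerOfTwo(size, max = 4096) {
  return Math.min(max, 2 ** Math.floor(Math.log2(size)));
}

console.log(floorPowerOfTwo(8192)); // 4096
console.log(floorPowerOfTwo(1000)); // 512
```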
Yes, this worked for me, thanks for the response on Stack Overflow @donmccurdy. What I did, for anyone else interested, was:
npx gltfjsx ./myfile.gltf --types --shadows --transform --aggressive --draco https://www.gstatic.com/draco/versioned/decoders/1.4.1/
to squish it into a nicely sized glb file. When you do 5., the generated tsx might include a `.` in the file name, so either rename them in Blender in step 3., or write TypeScript like so in the generated tsx:
type GLTFResult = GLTF & {
  nodes: {
    Mesh_0001: THREE.Mesh
  }
  materials: {
    'material_0.001': THREE.MeshBasicMaterial
  }
}
<mesh castShadow receiveShadow geometry={nodes.Mesh_0001.geometry} material={materials['material_0.001']} />
Describe the bug
Resizing large JPEG textures fails with error: maxMemoryUsageInMB limit exceeded.

To Reproduce
Run gltf-transform resize --width 512 --height 512 on a model with an 8192x8192 JPEG texture.

Additional context
This seems to be an upstream bug in the scijs/get-pixels dependency. I opened an issue here: https://github.com/scijs/get-pixels/issues/57.
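For context, the `maxMemoryUsageInMB limit exceeded` message originates in jpeg-js (the JPEG decoder used by get-pixels), which as I understand it caps decode memory at 512 MB by default. Decoding with jpeg-js directly lets you raise that cap; a sketch, assuming `jpeg-js` is installed (the function and helper names are illustrative):

```javascript
// Illustrative option builder for jpeg-js decode (pure data, no dependencies).
function jpegDecodeOptions(maxMB = 2048) {
  return { useTArray: true, maxMemoryUsageInMB: maxMB };
}

// Sketch (untested): decode a large JPEG with a raised memory cap.
// Requires `npm install jpeg-js`; loaded lazily inside the function.
function decodeLargeJpeg(buffer, maxMB = 2048) {
  const jpeg = require('jpeg-js');
  return jpeg.decode(buffer, jpegDecodeOptions(maxMB));
}
```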