Open schlegelp opened 1 year ago
Glad it's useful! Oh interesting, yeah we should probably raise on exports that overflow the size buffer (PRs welcome :). A "zipped glTF bundle" might be the way to go. It should result in a much smaller file size (glTF doesn't support compression other than Draco), and trimesh at least is happy to load straight from the zip:
In [15]: m = trimesh.load('bigmodel/tray-assembled.3DXML')
In [16]: e = m.export(file_type='gltf')
In [17]: type(e)
Out[17]: dict
In [18]: z = trimesh.util.compress(e)
In [19]: with open('tray.gltf.zip', 'wb') as f:
...: f.write(z)
...:
In [20]: _ = m.export(file_obj='tray.glb')
In [21]: type(m)
Out[21]: trimesh.scene.scene.Scene
In [22]: r = trimesh.load('tray.gltf.zip')
In [23]: type(r)
Out[23]: trimesh.scene.scene.Scene
Results in:
mikedh@luna:trimesh$ ls -altrsh
620K -rw-rw-r-- 1 mikedh mikedh 619K Nov 30 11:05 tray.gltf.zip
2.4M -rw-rw-r-- 1 mikedh mikedh 2.4M Nov 30 11:05 tray.glb
8.0K drwxr-xr-x 1 mikedh mikedh 7.6K Nov 30 11:05 .
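For reference, the size limit that triggered this report lives in the fixed 12-byte GLB header, which is easy to inspect directly. A minimal sketch of reading it, assuming plain struct parsing per the glTF 2.0 container spec (glb_total_length is a hypothetical helper, not a trimesh function):

```python
import struct


def glb_total_length(header: bytes) -> int:
    """Return the total file length recorded in a GLB header.

    The first 12 bytes of a GLB file are three little-endian uint32s:
    magic ('glTF'), container version, and total file length in
    bytes -- the '<u4' field that caps the format at 2**32 - 1 bytes.
    """
    magic, version, length = struct.unpack("<III", header[:12])
    if magic != 0x46546C67:  # ASCII 'glTF'
        raise ValueError("not a GLB file")
    return length
```

Running it on the first 12 bytes of the tray.glb above should report the same size ls shows, since the header field records the whole file's length.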
Hi! Let me start by saying thanks for this great library!
I ran into an issue with the export to GLB files where re-import (into Blender in my case) raises an exception about the file size not matching. Turns out the issue is that for this format the size of the content is encoded in the header as an unsigned 32-bit integer (<u4, link), which maxes out at 4,294,967,295 bytes, and the file I was trying to write was well over that (7,620,571,812 bytes). This is very much to spec and I don't see a way around it, but I was wondering if it would be worth including a check when exporting the file? Perhaps trimesh could throw a warning when the file will go over that limit?
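Such a check could be as simple as comparing the exported byte count against the uint32 maximum before writing. A minimal sketch, assuming a warning rather than a hard error; exceeds_glb_limit and warn_if_too_large are hypothetical helpers, not existing trimesh API:

```python
import warnings

# GLB stores the total file length in an unsigned 32-bit ('<u4') field
UINT32_MAX = 2**32 - 1  # 4,294,967,295


def exceeds_glb_limit(num_bytes: int) -> bool:
    # True if an export of this size would overflow the header field
    return num_bytes > UINT32_MAX


def warn_if_too_large(data: bytes) -> bytes:
    # Warn (could equally raise) before writing a GLB that downstream
    # importers such as Blender will reject as corrupt.
    if exceeds_glb_limit(len(data)):
        warnings.warn(
            f"GLB export is {len(data):,} bytes; the uint32 length "
            f"field in the header maxes out at {UINT32_MAX:,}"
        )
    return data
```

With the sizes from this report, exceeds_glb_limit(7_620_571_812) is True while a file at exactly 4,294,967,295 bytes would still pass.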
Finally, I was wondering if you had a recommendation for an alternative file format that allows combining many meshes into a single file?