Closed: torrinworx closed this issue 1 year ago
Did you try to clear the stored data after each export?
import bpy

# Remove every datablock of these types to free memory between exports.
# Iterate over a copy (list(...)) because removing items while iterating
# the live collection is unsafe.
for bpy_data_iter in (
    bpy.data.objects,
    bpy.data.meshes,
    bpy.data.lights,
    bpy.data.cameras,
    bpy.data.images,
    bpy.data.textures,
    bpy.data.materials,
):
    for id_data in list(bpy_data_iter):
        bpy_data_iter.remove(id_data)

for collection in list(bpy.data.collections):
    bpy.data.collections.remove(collection)

for action in list(bpy.data.actions):
    bpy.data.actions.remove(action)

for rig in list(bpy.data.armatures):
    bpy.data.armatures.remove(rig)
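A more compact variant (a sketch, assuming Blender 2.81+ where bpy.data.batch_remove is available) collects all the datablocks first and removes them in a single call:

import bpy

# Sketch, assuming Blender 2.81+: batch_remove deletes many datablocks
# in one call, which avoids per-item removal overhead.
blocks = []
for data in (bpy.data.objects, bpy.data.meshes, bpy.data.lights,
             bpy.data.cameras, bpy.data.images, bpy.data.textures,
             bpy.data.materials, bpy.data.collections,
             bpy.data.actions, bpy.data.armatures):
    blocks.extend(data)
bpy.data.batch_remove(blocks)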
@optimus007 Wouldn't this just remove every asset in the Blender file itself?
Yup. If you clear this data after each glTF export, before loading the next file, would it help with the memory usage issue?
This is just a suggestion; I'm not sure it'll solve your issue, though.
Ok, I just tested this with the Blender 3.3.1 release candidate and the issue seems solved: no memory leak. I want to wait until the version is officially released before closing this ticket, though.
Ok, thanks, great news :) Let's keep this ticket open until 3.3.1 is officially released. I will close the duplicate dev.blender.org ticket. Thanks!
Awesome, thank you @julienduroure and @optimus007
3.3.1 is now officially released. Can you please confirm that everything is good now?
@torrinworx Any news? Did you get time to check with 3.3.1 that the problem is solved?
@julienduroure All good on my end! Thank you for the reminder, been a bit busy with work and forgot about this ticket. Have a nice day!
Describe the bug
When exporting large numbers of large .gltf/.glb files from Blender, system memory constantly increases until Blender crashes.
I have a script that exports thousands of models from Blender to .gltf files. These models are big, about 200 MB each, and the goal is to export about 6,000 of them. The script selects the objects in the scene, then exports them. After about 17 loops, exporting 17 models, Blender crashes with an error saying it has run out of VRAM:
RuntimeError: Error: System is out of GPU and shared host memory
I've run multiple tests with this setup, and it crashes every time. As the script runs, system memory slowly creeps up to 100% in Task Manager, then Blender crashes. I've tried GPU + CPU compute and GPU-only compute using just my 3080 Ti, but nothing seems to change.
My team and I believe the textures are the source of the issue, but we aren't sure. We suspect the textures because the external texture folder is about 11.5 GB.
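One rough way to check that suspicion (a diagnostic sketch, not part of the original report) is to total the pixel data held by the image datablocks currently loaded in bpy.data:

import bpy

# Rough diagnostic: count resident image datablocks and estimate their
# combined pixel-buffer size, to see whether textures accumulate across
# exports. Float images use 4 bytes per channel, byte images 1.
total_bytes = 0
for img in bpy.data.images:
    depth = 4 if img.is_float else 1
    total_bytes += img.size[0] * img.size[1] * img.channels * depth
print(f"{len(bpy.data.images)} images, ~{total_bytes / 1024**3:.2f} GiB of pixel data")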
To Reproduce
Steps to reproduce the behavior:
Use the following script to export a large number of .gltf/.glb models from Blender:
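(The script itself is missing here. Below is a minimal sketch of the kind of export loop described, not the reporter's actual script; the collection names and output path are hypothetical, and it assumes each model's objects live in their own collection.)

import bpy

# Hypothetical export loop: one collection per model, each exported to
# its own .glb via the glTF exporter's use_selection option.
for i in range(6000):
    name = f"model_{i:04d}"                      # hypothetical naming scheme
    collection = bpy.data.collections.get(name)
    if collection is None:
        continue

    # Select only this model's objects.
    bpy.ops.object.select_all(action='DESELECT')
    for obj in collection.objects:
        obj.select_set(True)

    # Export the current selection to its own file.
    bpy.ops.export_scene.gltf(
        filepath=f"/tmp/exports/{name}.glb",     # hypothetical output path
        use_selection=True,
    )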
Expected behavior
Blender maintains constant memory use and doesn't crash when continually exporting many .gltf/.glb files.
Screenshots
Blender crash log:
.blend file / .gltf
I cannot include files here due to my client's confidentiality. But any Blender file 200-300 MB in size with at least 10 GB of external textures will do.
Version
Additional context
System specs:
GPU: RTX 3080 Ti + GeForce GTX 1660
RAM: 4x16 GB = 64 GB
CPU: Intel Core i7-7700 @ 3.60 GHz