vpenades / SharpGLTF

glTF reader and writer for .NET Standard
MIT License

merge glb including skeletons and animations #118

Closed rdpeake closed 2 years ago

rdpeake commented 2 years ago

I have successfully used this library to merge GLBs such that every scene is reflected. Is it possible to use this library to also merge the skinning rigs and animations? Is it also possible to create morph-based animation tracks?

I'm currently using the following code (so I can work on memory streams) to merge arrays of GLBs, some of which include animations:

    var final = new SharpGLTF.Scenes.SceneBuilder();

    Array.ForEach(models, model => {
        var passed = SharpGLTF.Schema2.ReadContext
            .Create((f) => File.ReadAllBytes(Uri.UnescapeDataString(f)))
            .ReadBinarySchema2(model);
        final.AddScene(passed.DefaultScene.ToSceneBuilder(), System.Numerics.Matrix4x4.Identity);
    });

    final.ToGltf2().SaveGLB("test.glb");

All of the GLBs share a rig.

vpenades commented 2 years ago

Yes, both skinning and morphing animations are possible.

rdpeake commented 2 years ago

So, how would I go about merging two scenes, but using the skinning bones from scene 1 on the skinned meshes in scene 2? Similarly for copying animations: how do I retarget them to the bones in scene 1?

vpenades commented 2 years ago

Oh, if you want to mix parts from two different scenes, then it's a completely different ball game.

There's no built-in functionality to do that in one method call; you have to fully construct the final SceneBuilder from the parts of the other scenes.

Notice that SharpGLTF is a library designed to export models to glTF; it is not a general-purpose model utility library, and motion retargeting is completely out of scope.
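To illustrate what "constructing the final SceneBuilder from parts" can look like, here is a minimal sketch. `meshA` and `meshB` are placeholders for skinned MeshBuilders extracted from the two source files, and the two-node rig is invented for the example; the point is that both meshes are skinned against the same set of NodeBuilder joints, which is effectively the retargeting step when the rigs share joint names and order:

```csharp
using System.Numerics;
using SharpGLTF.Scenes;

// Build (or reuse) a single skeleton as NodeBuilders.
var root  = new NodeBuilder("root");
var spine = root.CreateNode("spine");

var scene = new SceneBuilder();

// meshA / meshB stand in for skinned MeshBuilders taken from the two files.
scene.AddSkinnedMesh(meshA, Matrix4x4.Identity, root, spine);
scene.AddSkinnedMesh(meshB, Matrix4x4.Identity, root, spine);

// Animation curves are authored on the NodeBuilders themselves, e.g.:
// spine.UseRotation("walk").WithPoint(0, Quaternion.Identity);

scene.ToGltf2().SaveGLB("merged.glb");
```

Since the joints live on the NodeBuilder hierarchy rather than on the meshes, animation tracks authored on (or copied to) those NodeBuilders apply to everything skinned against them.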

rdpeake commented 2 years ago

Thanks for the help so far. I've opted to continue the discussion here unless you feel this question needs a new issue:

When I try to load some of my GLBs I get a validation error that an array element is empty. I am able to open these GLBs in tools like Babylon.js without any errors, though. Having decoded the glTF JSON out of the breaking files, it seems they have an empty 'samplers' property on the root of the schema.

Is there anything I can do to get around the empty array? The check at https://github.com/vpenades/SharpGLTF/blob/3c611c63f6f1cb2c9cd988faa4e01e53b11cc095/src/SharpGLTF.Core/IO/JsonSerializable.cs#L416 doesn't seem to be preventable via any settings I can see.

vpenades commented 2 years ago

In general, the glTF specification states that there shouldn't be any empty arrays in the JSON: either the array is non-empty, or there is no array at all. A glTF/GLB with an empty array is considered malformed.
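For example, the difference between a malformed and a valid document here is only whether the property is present at all. Malformed (empty array present):

```json
{ "asset": { "version": "2.0" }, "samplers": [] }
```

Valid (the property is simply omitted):

```json
{ "asset": { "version": "2.0" } }
```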

Before anything else, check whether your GLBs are malformed with https://github.khronos.org/glTF-Validator/

If they're malformed, it's a problem with the exporter. On my side, I can mitigate issues on a case-by-case basis, but it's impossible to fix everything.

If the GLBs were exported by SharpGLTF, then it's an issue with the library, and I need to know which array is being reported as empty.

There was a known issue with the "Extras" field, which is essentially a free-use field. It allowed some users to fill in empty arrays which were silently exported to JSON, so maybe that's the issue.

rdpeake commented 2 years ago

These were converted from FBX via Facebook's fbx2gltf, so this would most likely be an issue on that exporter's side. Still, it would be nice to have a way to mitigate it so the library can work with legacy files while I correct the future pathway to avoid this issue.

That validator does show the file as having errors, the main one being the empty samplers array. Thanks for the tool link.

vpenades commented 2 years ago

You should report the malformation issue to Facebook's fbx2gltf so they can fix it on their end.

On my side, I will try to handle the empty-array issue in the "tryfix" validation mode, but I'm not sure when I'll be able to do so.

rdpeake commented 2 years ago

I was able to use reflection to get the binary chunks, pre-process the JSON text to remove the empty array, reconstruct a ReadContext with the binary chunk assigned, and then process the raw text.

Not the greatest way of doing it (it's never a good idea to use reflection to access private fields), but it does allow me to process the malformed legacy GLBs while I work out resolving them for future use.

    // Use reflection to reach SharpGLTF's internal GLB chunk reader.
    var internalType = readContext.GetType().Assembly.GetType("SharpGLTF.Schema2._BinarySerialization");
    var internalMethod = internalType.GetMethod("ReadBinaryFile", System.Reflection.BindingFlags.Static | System.Reflection.BindingFlags.Public);
    var result = internalMethod.Invoke(null, new object[] { model.stream });
    var chunks = (IReadOnlyDictionary<uint, byte[]>)result;

    // GLB chunk identifiers: "JSON" and "BIN\0" as little-endian uints.
    uint jsonChunk = 0x4E4F534A;
    var json = System.Text.Encoding.UTF8.GetString(chunks[jsonChunk]);

    uint binChunk = 0x004E4942;
    var bin = chunks[binChunk];

    // Drop the offending empty "samplers" array from the root object.
    var obj = Newtonsoft.Json.JsonConvert.DeserializeObject<ExpandoObject>(json);
    ((IDictionary<string, object>)obj).Remove("samplers");

    var jsonBytes = System.Text.Encoding.UTF8.GetBytes(Newtonsoft.Json.JsonConvert.SerializeObject(obj));

    // Rebuild a ReadContext, inject the binary chunk, and parse the patched JSON.
    readContext = ReadContext.Create((f) => { Console.WriteLine(f); return Array.Empty<byte>(); });
    readContext.Validation = SharpGLTF.Validation.ValidationMode.Skip;
    readContext.GetType()
        .GetField("_BinaryChunk", System.Reflection.BindingFlags.NonPublic | System.Reflection.BindingFlags.Instance)
        .SetValue(readContext, bin);

    passed = readContext.ReadTextSchema2(new MemoryStream(jsonBytes));

Given there are no textures in this file, the lack of a valid return from the ReadContext's file reader isn't an issue.
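For anyone who wants to avoid reflection entirely, a similar fix can be done at the byte level before the file ever reaches the library, since the GLB container format is simple: a 12-byte header followed by (length, type, payload) chunks, with the JSON chunk first. This is a rough sketch, not part of SharpGLTF; the class and method names are made up, and the string-level patch assumes minified JSON with no whitespace around the `samplers` property:

```csharp
using System;
using System.IO;
using System.Text;

static class GlbPatcher
{
    // Strips a root-level empty "samplers" array from a GLB's JSON chunk.
    public static byte[] StripEmptySamplers(byte[] glb)
    {
        // GLB layout: 12-byte header, then (uint length, uint type, payload) chunks.
        uint jsonLen  = BitConverter.ToUInt32(glb, 12);
        uint jsonType = BitConverter.ToUInt32(glb, 16);
        if (jsonType != 0x4E4F534A) throw new InvalidDataException("expected JSON chunk first");

        var json = Encoding.UTF8.GetString(glb, 20, (int)jsonLen);

        // Crude string-level patch; assumes minified JSON.
        json = json.Replace("\"samplers\":[],", string.Empty)
                   .Replace(",\"samplers\":[]", string.Empty);

        // The spec requires the JSON chunk to be padded to 4 bytes with spaces.
        var jsonBytes = Encoding.UTF8.GetBytes(json);
        int padded = (jsonBytes.Length + 3) & ~3;

        using var ms = new MemoryStream();
        using var w  = new BinaryWriter(ms);
        w.Write(glb, 0, 12);            // original header; total length fixed below
        w.Write((uint)padded);          // new JSON chunk length
        w.Write(jsonType);
        w.Write(jsonBytes);
        for (int i = jsonBytes.Length; i < padded; i++) w.Write((byte)0x20);
        w.Write(glb, 20 + (int)jsonLen, glb.Length - 20 - (int)jsonLen); // BIN chunk etc.
        w.Flush();

        var result = ms.ToArray();
        BitConverter.GetBytes((uint)result.Length).CopyTo(result, 8);   // fix total length
        return result;
    }
}
```

The patched byte array can then be loaded through the library's normal read path, with no private fields involved.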

vpenades commented 2 years ago

Nice trick!

I will look into ways of fixing this kind of issue, but it will not be soon.

Ultimately, it's the exporters that should fix their exported files, but I've noticed Facebook's fbx2gltf has been dead for two years, so we'll have to live with that.

rdpeake commented 2 years ago

Yeah, I should have mentioned that as well; that was why I was looking for a workaround. I'm hoping to be able to eliminate the Facebook tool going forward, but even if I do, I still need to be able to use what I already have in place with the new toolchain. Thanks for the help and advice here. Also, any chance of getting a .NET 6 compatible release, or is that a pain on your end?

vpenades commented 2 years ago

SharpGLTF 1.0.0-alpha0025 is already .NET 6.

rdpeake commented 2 years ago

Female_Animation_TEST.zip: this is a GLB file that reports a faulty samplers entry when put through the glTF validator you linked before.

rdpeake commented 2 years ago

Ah, I see you went with a simpler option for your test case. I like the pre/post-processor options (and the animation demo is handy as well).

vpenades commented 2 years ago

Yeah, the issue of invalid glTFs is a delicate thing; I want the library to be as useful as possible, but I don't want to take responsibility for fixing every single bug that other glTF exporters have and should fix on their end.

Also, I always thought that having a strict requirement on glTFs would push libraries to fix their bugs if they want their exported models to be loaded by other importers. It's a shame fbx2gltf has been abandoned; maybe an existing fork has fixed these issues?

rdpeake commented 2 years ago

Even if it did fix the issues, the idea at the moment is to avoid changes to existing production items while expanding the new toolchain into a much better system. As it stands, it looks like we can use the Blender exporter for GLBs, which wasn't an option for us back when we started, so we can cut out the FBX tool completely and use this library to merge multiple GLBs together on request, making the system much more robust to changes. It just means we need to support the existing items long enough for clients to update out of using them.