gdquest-demos / godot-3d-mannequin

An open-source 3D character and character controller for the Godot game engine

Make the rig easier to export for everyone #40

Closed NathanLovato closed 4 years ago

NathanLovato commented 4 years ago

Problem: Mannequiny was rigged via a Blender add-on, Auto-Rig Pro.

The tool is great and saved us a lot of time, but it has one limitation: you need the add-on, or at least its export code, to export an optimized version of the character for games.

We would like to find a way to make it easier for everyone to use the rig and export for games, without losing the animations.

How should we go about that? Is it necessary to redo the rig? Would it be possible to keep the current rig and separate the armature we need to export to the game engine from the control bones?

We would appreciate the insights of a professional game animator / rigger on the matter. What would be the best approach?

You can download the .blend file with the animated character here: https://github.com/GDQuest/godot-3d-mannequin/releases/tag/v0.3.0

fire commented 4 years ago

Here is a discussion log from Blender Vancouver discord.

&lt;propersquid&gt; So, to make sure I understand it correctly, the problem with the rig is you need the add-on for the rig to work, right?

&lt;iFire&gt; at 11:26 So I think it's a problem because the "internal rig" is baked out. They don't feel comfortable rigging with that. Not sure how to advise them.

&lt;propersquid&gt; at 11:28 In my personal opinion, they'd probably need to rebuild the rig from scratch without any paid add-ons, or rip out the parts that require the add-on. Assuming that they want a free rig that people can download and use.

&lt;iFire&gt; at 11:29 Why can't they build on top of the exported file? It's already had the "internal paid add-on" rig removed. How important are controllers?

&lt;propersquid&gt; at 11:30 Depends on what's in the exported file. For animators: very important.

&lt;TwoPercentCool&gt; at 11:30 They are pretty important :p

&lt;iFire&gt; at 11:31 You can use locators to fake it? I remember this was a thing with Maya, not sure what it's called in Blender.

&lt;propersquid&gt; at 11:31 Empties.

&lt;TwoPercentCool&gt; at 11:31 Not the best option in Blender; better to leave the animation to pose mode. Empties are fixes, not a generality.

&lt;propersquid&gt; at 11:32 Yeah, it's best to have that as an armature. But you could probably build a controller armature on top of the deformer armature.

&lt;TwoPercentCool&gt; at 11:32 What he said --^

&lt;iFire&gt; at 11:32 The main problem is an armature that gets lost, and animators depend on controllers (as mentioned).

&lt;TwoPercentCool&gt; at 11:32 What do you mean?

&lt;propersquid&gt; at 11:33 How is the animation being passed from animator to animator?

&lt;iFire&gt; at 11:33 Like, my proposed solution to them is to take the exported glTF, import it back into Blender, add controllers, and call it a day.

&lt;propersquid&gt; at 11:34 Depending on how the rig was made, you may be able to re-pose the controller armature to get the exact (or close enough) pose from the deforming rig. Generally the best workflow is going from DCC -> Engine. That way you go from artist friendly to engine friendly. Depending on what's happening, engine friendly is rarely ever artist friendly (for example, mesh triangulation). If people are expecting to go from Engine -> DCC, then that's going to require a lot of work to make sure the rig supports that. Like, that'd be "hire a senior+ rigger for games" to sort that out, in my opinion. So, let's back up to the beginning of the issue. What's the workflow people are expecting? DCC -> Engine or Engine -> DCC? Also, if they were to create a version 2 of the rig, are they willing or able to break backwards compatibility?

&lt;iFire&gt; at 11:40 The issue only mentioned DCC -> Engine. I believe they're willing to break compatibility because the task says redo the rig.

&lt;propersquid&gt; at 11:42 Yeah, in that case, they can either do a full rebuild from scratch, or take the deformer rig and build the controllers on top of that.

&lt;iFire&gt; at 11:43 Do you mind if I add your nicknames and summarize the conversation?

&lt;propersquid&gt; at 11:43 Go for it! At least, you have my permission to use mine. I'm also https://github.com/scott-wilson/ on GitHub.

NathanLovato commented 4 years ago

Thanks, but that's stuff I already know. What I'd like to know is what would be most efficient and sustainable in the long run. In any case, you need to either:

I need people to recommend the best option to us and maybe give us a cost estimate.

Note that we might be able to transfer the animations to the new rig using Auto-Rig Pro.

scott-wilson commented 4 years ago

Here's my two cents on the 3 points:

I'm going to pass this thread to my rigger friends to give their feedback. I'm a pipeline TD at an animation studio, so while I can answer things from one perspective, I wouldn't be able to give you a definitive "this is the absolute best option" nor a cost estimate. Also, I'm likely missing some key information to make a suggestion.

verbal007 commented 4 years ago

Hey Nathan. I was asked to pop over here and drop in my thoughts.

Intro/context: I've built rigging systems from scratch that had to (for example) work both in-game via MotionBuilder, with animations directly on the skeleton, and in cinematics via a layered Maya rig. I've also built a few rigging pipelines for fully animated features and TV series.

I feel there are many issues with the way rigging is done on many productions, such as the overuse of the "uber" rig. This is the traditional character rig that tries to account for every situation an animator may encounter. Of course, this is impossible, especially if the characters have to behave in a more acrobatic way. There are commonly used setups, such as IK/FK arms or the ability to have a limb follow or free itself from the orientation of its parent, but since many of these setup processes are scripted anyway... can this not be given to the animator?

I think some game studios (e.g. Bungie) have the right idea. In 2009, they called it "animating without a rig", meaning that the animation is applied directly to the bones that constrain/deform the mesh. The animators then have a set of tools that let them create control constraints on the bones, customized case by case. I'm just now realizing that it's David Hunt who has been pushing this method.

E.g. in one shot, the legs are not seen, so they don't bother adding controls to the legs. In another shot, there is a lot of detail in the finger animation, so IK is placed on the fingers of one hand. In yet another, the character's motions are driven primarily by mocap data, but then blend into some hand-keyed animation.

If you tried to have all of that functionality built into a single uber rig, the rig would be difficult to support and terribly slow to animate with.

Some of my favorite presentations on this subject include:


2020 - Freeform Animation Rigging: Evolving the Animation Pipeline. A GDC talk that just came out a few days ago. David Hunt - Unity. https://youtu.be/XjMKbElVNmg

2015 - Tools-Based Rigging in Bungie's Destiny David Hunt again. https://www.youtube.com/watch?v=U_4u0kbf-JE

2009 - Modular Procedural Rigging David Hunt yet again https://halo.bungie.net/Inside/publications.aspx

2019 - SIGGRAPH 2019: Fast, interpolationless character animation through "ephemeral" rigging. I feel this "rigless" approach has lots of parallels with Raf Anzovin's work. https://www.youtube.com/watch?v=J48b0GKI4RM

Though Richard Lico is not a rigger, his animation examples often demonstrate the advantages of being able to create custom rigs per situation.

Unfortunately, I have been mostly working independently since 2009, so I have not had a chance to implement these practices in a larger production pipeline, but much of my earlier work also led me to these conclusions.

Nowadays, the majority of my work is not in Maya. I typically animate in Blender, directly on a deformation skeleton, and add "control rig" components only as needed. This ensures I maintain flexibility and keeps everything running as close to real time as possible.

That all being said, if development time is low and you would prefer to stay within the "uber rig" paradigm while removing the reliance on an external add-on, perhaps it's best to use Blender's built-in Rigify, then build a tool that exports the animation to a game-friendly deformation skeleton with the animation applied/baked directly onto it. E.g. https://docs.unity3d.com/560/Documentation/Manual/BlenderAndRigify.html
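The "bake onto a deformation skeleton" step described above can be sketched in Blender's Python API. This is a minimal sketch, not the project's actual export code: it assumes it runs inside Blender (the `bpy` module only exists there), and the object name `DeformArmature` and output path are placeholders for illustration.

```python
# Sketch: bake control-rig animation onto the deform bones, then export glTF.
# Assumes a hypothetical armature object named "DeformArmature"; runs inside Blender.
import bpy

arm = bpy.data.objects["DeformArmature"]  # placeholder name
bpy.context.view_layer.objects.active = arm
bpy.ops.object.mode_set(mode='POSE')
bpy.ops.pose.select_all(action='SELECT')

scene = bpy.context.scene
# visual_keying=True records the final evaluated pose each frame, so
# constraints targeting control bones can be cleared after baking.
bpy.ops.nla.bake(
    frame_start=scene.frame_start,
    frame_end=scene.frame_end,
    only_selected=True,
    visual_keying=True,
    clear_constraints=True,
    bake_types={'POSE'},
)

bpy.ops.object.mode_set(mode='OBJECT')
bpy.ops.export_scene.gltf(filepath="mannequiny_baked.glb")  # placeholder path
```

After the bake, the control bones contribute nothing, so a deform-only armature can be exported cleanly without the add-on present.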

I hope this helps, and isn't too "theoretical".

NathanLovato commented 4 years ago

Thanks much for your feedback @scott-wilson and @verbal007.

if development time is low

That's our situation. We don't even have an indie gamedev-level budget: we're free software developers sharing content with the community, and we make $0 from something like Mannequiny. We also have dozens of free software repositories like this one. Investing thousands of dollars in such a character, for example, is a lot for us. We have to work with the most efficient techniques.

perhaps it's best to use Blender's built-in Rigify, and then construct a tool that allows the animation to be exported over to a games-friendly deformation skeleton that has the animation directly applied/baked.

That's exactly what Auto-Rig Pro does for us. We moved away from Rigify because it's not designed with games in mind at all and would require quite a bit of export code. Auto-Rig Pro has this export code, available under the GPL-v3 license. It exports game-ready rigs, and we could technically extract that part and embed it in the Blender file. But it's not as simple as a copy-paste; it still involves quite a bit of work.

Freeform Animation Rigging: Evolving the Animation Pipeline

Regarding other workflows, we saw that and already discussed it with a teammate. I actually had Richard Lico as an animation instructor years ago, and he showed us the tools they used on... Halo 4, I think, in Maya. That's all great, but it's beyond the scope of what we can do. We're not a company producing commercial games, so we're quite limited.

scott-wilson commented 4 years ago

I just stumbled upon this. Not sure how useful it is, but it may address some of the points that @verbal007 raised: https://gitlab.com/dypsloom/rigonthefly/-/blob/master/README.md

NathanLovato commented 4 years ago

That's interesting! This might allow us to do what @fire mentioned: using the exported armature and working from it. 🙂

If it works, that would be an excellent solution.

NathanLovato commented 4 years ago

I just tried it, and it works great! I already love that workflow!

NathanLovato commented 4 years ago

Closed with 625dc34abd96761b774ebae312ca0bda992aa38b

I'll set up the release in a second with the new blend file.

verbal007 commented 4 years ago

Closed with 625dc34

I'll set up the release in a second with the new blend file.

Down with the "uber" rig! So fast! HAH! That's Awesome.

AnasASK commented 4 years ago

@NathanLovato did you create the original animations with the Auto-Rig Pro add-on? If so, how do you export them to glTF? Doesn't Auto-Rig Pro only export to FBX? I'm interested in this because of the great retargeting tool in Auto-Rig Pro, which can help add animations from Mixamo.

NathanLovato commented 4 years ago

@anasofgo I exported to FBX and converted the FBX to glTF.
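One way to do that FBX-to-glTF conversion is with Blender itself in background mode; this is a sketch, not necessarily how Nathan did it. The file names are placeholders, and it assumes Blender 2.8+ with the bundled FBX importer and glTF 2.0 exporter enabled.

```shell
# Headless FBX -> glTF conversion via Blender (file names are placeholders).
blender --background --python-expr "
import bpy
bpy.ops.wm.read_factory_settings(use_empty=True)  # start from an empty scene
bpy.ops.import_scene.fbx(filepath='mannequiny.fbx')
bpy.ops.export_scene.gltf(filepath='mannequiny.glb')
"
```

Dedicated converters such as FBX2glTF exist as well; going through Blender has the advantage of using the same importer/exporter pair as the rest of the pipeline.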