ValveSoftware / openvr

OpenVR SDK
http://steamvr.com
BSD 3-Clause "New" or "Revised" License

[Question] How to reference skeleton animation files for pose and finger tracking #1552

Closed: badaaim closed this issue 2 years ago

badaaim commented 3 years ago

In the Index controller's driver folder, there are skeleton animations for the close/squeeze gesture, as well as poses for trackpad movements and, similarly, button presses. I used Blender to open and visualise these. (Screenshot: anims folder for the Index controller)

Context

We have our own custom hardware which is working, and we would like to provide finger tracking and animations for it. I have used Blender to export fbx files containing a close/squeeze skeleton animation, keeping the following in mind for our specific hardware and its form factor:

  a. Valve's default VR glove: Steam\steamapps\common\SteamVR\resources\rendermodels\vr_glove\vr_glove_right_model.fbx
  b. Used the default skeleton provided in the SteamVR resources folder: Steam\steamapps\common\SteamVR\resources\skeletons\vr_glove_right_skeleton.fbx
  c. Used SteamVR's coordinate space system

Questions

  1. Is my understanding correct that the bonemask_xxxx.txt files are used to enable/disable specific individual bones based on which ones are relevant for the output animation? Does anyone have details or documentation on this?
  2. Are the hand skeletal animations in the fbx files used for default skeleton finger tracking, or only for game-specific poses?
  3. I understand that when creating a custom driver, we can attach device analog input to skeletal transforms for finger tracking. How does this differ from the animations in the skeleton fbx files? Or can we reference the fbx files for the default skeleton attachment and finger tracking?
  4. Are there any references, examples, or documentation on how to use the different pose animations in the fbx for thumb "tracking"? For example, on a Vive controller, when the finger moves across the trackpad in a clockwise manner, the virtual thumb follows the same trajectory; or when the system button is pressed, the virtual thumb translates to the corresponding position inside the VR environment.
joevdh commented 3 years ago

Yes, these are the files that we use to create the Skeletal Input animations for the drivers that Valve maintains (Vive Wand, Oculus Touch, and Valve Index controllers). We do so by reading the state of the controllers and using that information to blend and combine these animations to create poses that match the pose of the user's hand as closely as possible.

We don't provide any documentation on this system because it is not exposed through the driver API for third party developers to use. You are free to use the data files in your own drivers, though I recommend that you create copies of them in your own driver folder rather than reference them where they are, as I cannot guarantee that they won't be changed or removed as part of future updates.

Regarding your questions:

  1. Yes, the bonemask files are settings for per-bone blends. The order of the values matches the order of the bones in the hand skeleton defined in the docs on the OpenVR GitHub wiki: https://github.com/ValveSoftware/openvr/wiki/Hand-Skeleton
  2. These are used by the driver to convert raw controller input into hand poses. The poses are then provided to the game through the Skeletal Input API.
  3. The driver API is very simple: you just give it a list of transforms, one for each bone. How you come up with those transforms is up to you. If your input device provides a fully posed hand skeleton, you can just pass those transforms along (after converting them to OpenVR coordinates). If the input device provides more abstract information, like finger curl values or buttons & joysticks, then you'll need to create the bone transforms based on that information. We do this by blending and masking the set of animations in the files you found (see the sketch below). You're free to use those in your own driver if it helps.
  4. No direct documentation per se. Blending animations to create performances is a fairly common practice in video games, though, and there's a lot of literature available on how to do that. You can also get a feel for what's involved by playing around with some examples in existing game engines like Unity, Unreal, Godot, or even our own Source 2 engine, which is available through the Half-Life: Alyx workshop.
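A minimal sketch of that blend-and-mask idea, assuming you have already sampled an "open hand" and a "closed hand" keyframe from the animation files and have a per-bone mask weight; the pose arrays, mask values, and curl input are hypothetical stand-ins you would supply yourself, and only VRBoneTransform_t comes from the OpenVR headers:

```cpp
// Sketch: blend per bone between an "open hand" and a "closed hand" keyframe,
// driven by a 0..1 curl value and a per-bone mask weight.
#include <openvr_driver.h>
#include <cmath>
#include <cstdint>

// Normalized lerp of quaternions: a cheap stand-in for slerp that works well
// for the small rotation differences between neighbouring hand poses.
static vr::HmdQuaternionf_t NLerp(const vr::HmdQuaternionf_t& a,
                                  vr::HmdQuaternionf_t b, float t)
{
    // Take the shortest arc.
    float dot = a.w * b.w + a.x * b.x + a.y * b.y + a.z * b.z;
    if (dot < 0.0f) { b.w = -b.w; b.x = -b.x; b.y = -b.y; b.z = -b.z; }

    vr::HmdQuaternionf_t q;
    q.w = a.w + (b.w - a.w) * t;
    q.x = a.x + (b.x - a.x) * t;
    q.y = a.y + (b.y - a.y) * t;
    q.z = a.z + (b.z - a.z) * t;
    const float len = std::sqrt(q.w * q.w + q.x * q.x + q.y * q.y + q.z * q.z);
    q.w /= len; q.x /= len; q.y /= len; q.z /= len;
    return q;
}

// Blend openPose -> closedPose per bone. 'curl' is 0 (open) to 1 (fist),
// 'boneMask' scales how strongly each bone follows the blend (for example,
// weights read from something like the bonemask_*.txt files).
static void BlendHandPose(const vr::VRBoneTransform_t* openPose,
                          const vr::VRBoneTransform_t* closedPose,
                          const float* boneMask,
                          uint32_t boneCount,
                          float curl,
                          vr::VRBoneTransform_t* outPose)
{
    for (uint32_t i = 0; i < boneCount; ++i)
    {
        const float t = curl * boneMask[i];
        for (int c = 0; c < 4; ++c)
        {
            outPose[i].position.v[c] = openPose[i].position.v[c] +
                (closedPose[i].position.v[c] - openPose[i].position.v[c]) * t;
        }
        outPose[i].orientation = NLerp(openPose[i].orientation,
                                       closedPose[i].orientation, t);
    }
}
```

In practice you would run a blend like this per finger (or per group of bones) with its own curl value and mask, then hand the combined result to the Skeletal Input API.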

badaaim commented 3 years ago

Appreciate your feedback and information, Joe. Thank you.

badaaim commented 2 years ago

Hi @joevdh,

We've stumbled upon information and resources that have helped us use the default anim_openclose.glb from the Index driver and map its bone transforms to finger curl values.

Q1. Below is our understanding so far of some aspects of working with this system. Could you confirm whether this is correct and whether we are on the right path?

  1. We need the FBX2glTF converter, specifically version 0.93/94; using any release newer than that gives us the WRONG output, because they seem to have changed how scaling and rotation factors are converted.
  2. The fbx files used for import/export and for storing these skeleton animations are specific to Maya 2018; using an alternative program like Blender is NOT compatible for various reasons (different core architecture, and Blender's fbx support being relatively new in comparison).

However, since we are using the skeleton pose/animation made for the Index, our controller inside the VR environment is misaligned and offset from the physical world. We've made many attempts to rotate and reposition the skeleton in Maya, and thus modify the animation so that it aligns with our controllers, but all attempts have been unsuccessful so far. (Attached video 1: Index animation on custom controllers)

Q2. Could you please confirm whether any of my hypotheses below is correct? Would you be able to provide some further pointers on what we should do to align our controllers to the skeleton, or the skeleton to the controllers?

  1. We are modifying the skeleton incorrectly, even though it may look right inside Maya.
  2. We are making the changes to the skeleton correctly, but we are outputting a corrupt fbx/glb file, and thus SteamVR defaults to the Index/Vive skeleton/animation file.
  3. Regardless of whether hypotheses 1 and 2 are true and "fixed", the root bone/origin of the skeleton file is "static" and cannot be changed, which is why all the controller render models are MATCHED or overlapped with the Vive controller for reference, as instructed in the "the_render_model" guide for making your own VR hardware. The root bone differs from the rest of the child bones, which are dynamic and CAN be moved/changed for the animation.

    I have a feeling that hypothesis 3 might be it. We have NOT matched our controllers to the Vive controller, as I found that, as long as all relevant files refer to the same and correct location coordinates and rotation values, we could get away with leaving our controllers at the origin (0,0,0) with a rotation of (0,0,0) degrees (which made some aspects of the driver development easier).

joevdh commented 2 years ago

Can you clarify how the animation data is being used? You have your own custom driver, right? Have you also made your own system for loading and blending animations to feed to the Skeletal Input System? Or are you somehow trying to use the animation blending code from the Index controller driver with your own driver?

badaaim commented 2 years ago

Hi @joevdh ,

Apologies for the late response.

Yes, we're doing this in our own driver. We've made our own system for loading the glb animation files and have been using the Index controller open/close animation as a reference for this.

Since our last response, we also spotted that there is an option to set the base pose path, which sets the origin of the skeleton. To apply the offset, we were wondering whether we should use that and define our own origin relative to the rendermodel, or apply the offset to the bones relative to the root bone in the animation file?

Thanks and looking forward to hearing from you.

joevdh commented 2 years ago

Ah, ok. So the expectation is that each controller would define a location (offset from the controller) that acts as the "origin" for the hand animation. On the app side, this is the location that SteamVR will return when the app calls GetPoseActionData() with the VRActionHandle_t of the skeleton action. On the driver side, this is the same 'pose' you set with the pchBasePosePath parameter passed to the IVRDriverInput::CreateSkeletonComponent() function. For the Index Controller, this pose location is a centimeter or two in front of the top of the controller.
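A minimal sketch of the driver side of this, assuming the usual ITrackedDeviceServerDriver structure; the component name, skeleton path, base pose path, and tracking level shown are typical example values rather than anything specific to this device, and error handling is omitted:

```cpp
#include <openvr_driver.h>
#include <cstdint>

// Hypothetical helpers called from a controller driver's Activate() and
// RunFrame(); 'container' would come from
// VRProperties()->TrackedDeviceToPropertyContainer(unObjectId).
class MyControllerDriver
{
public:
    void SetupSkeleton(vr::PropertyContainerHandle_t container)
    {
        vr::VRDriverInput()->CreateSkeletonComponent(
            container,
            "/input/skeleton/right",   // component name exposed to the input system
            "/skeleton/hand/right",    // which hand skeleton this component drives
            "/pose/raw",               // base pose the bone transforms are relative to
            vr::VRSkeletalTracking_Estimated,
            nullptr, 0,                // no grip-limit override transforms
            &m_skeleton);
    }

    // Called every frame with the blended bone transforms (31 bones in the
    // SteamVR hand skeleton), e.g. the output of a blend like the earlier sketch.
    void UpdateSkeleton(const vr::VRBoneTransform_t* bones, uint32_t count)
    {
        vr::VRDriverInput()->UpdateSkeletonComponent(
            m_skeleton, vr::VRSkeletalMotionRange_WithController, bones, count);
        vr::VRDriverInput()->UpdateSkeletonComponent(
            m_skeleton, vr::VRSkeletalMotionRange_WithoutController, bones, count);
    }

private:
    vr::VRInputComponentHandle_t m_skeleton = vr::k_ulInvalidInputComponentHandle;
};
```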

The app is responsible for taking the skeletal input you provide and placing it at this reference location. So that means that the animation data needs to also be relative to this location. The root bone of the skeletal animation data should have zero rotation or translation, and the wrist bone should be offset from the root such that it places the virtual hand at the same location as the user's real hand. The data should be authored this way in Maya or Blender, then exported and converted to the units and coordinate system of SteamVR to work properly. Also any scale needs to be baked down into the translation of the bones, since the system does not preserve any scale from the source data.
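As one illustration of the "bake scale into translation" step, here is a minimal sketch assuming a simple intermediate bone representation (parent index, local translation, uniform scale) pulled from the source animation data and ordered parent-before-child; none of these types come from OpenVR:

```cpp
#include <vector>
#include <cstddef>

// Hypothetical intermediate bone as read from the source animation data.
struct SourceBone
{
    int   parent;          // index of the parent bone, -1 for the root
    float translation[3];  // local translation in the source file's units
    float scale;           // local (uniform) scale from the source file
};

// Bake the accumulated parent scale into each bone's local translation and
// convert units (e.g. centimeters -> meters with unitScale = 0.01f), so the
// exported skeleton can drop scale entirely, since SteamVR does not preserve it.
// Assumes 'bones' is ordered so that parents appear before their children.
static void BakeScaleAndUnits(std::vector<SourceBone>& bones, float unitScale)
{
    std::vector<float> accumulated(bones.size(), 1.0f);
    for (std::size_t i = 0; i < bones.size(); ++i)
    {
        const float parentScale =
            bones[i].parent >= 0 ? accumulated[bones[i].parent] : 1.0f;

        // A bone's local translation is expressed in its parent's (scaled)
        // space, so multiply by the parent's accumulated scale.
        for (int c = 0; c < 3; ++c)
            bones[i].translation[c] *= parentScale * unitScale;

        accumulated[i] = parentScale * bones[i].scale;
        bones[i].scale = 1.0f;  // scale is now baked into the translations
    }
}
```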

So with all this in mind, I think you should definitely define your own offset (location, pose, whatever you want to call it) from your rendermodel, and use that throughout your pipeline (from Maya or Blender through to the driver). If you want to use the data from the Index controller, then you'll need to set your offset such that the hand lines up with your controller. But unless your controller is the same shape as the Index controller, you'll probably want to make your own animations anyway.

Does that help?

badaaim commented 2 years ago

Hi @joevdh

Thank you very much for the detailed feedback. That is very helpful indeed and confirms what we understand so far about the animation files and how they work.