Yes, these are files that we use to create the Skeletal Input animations for the drivers that Valve maintains (Vive Wand, Oculus Touch, and Valve Index controllers). We do so by reading the state of the controllers and using that information to blend and combine these animations to create poses that match the pose of the user's hand as closely as possible.
We don't provide any documentation for this system because it is not exposed through the driver API for third-party developers to use. You are free to use the data files in your own drivers, though I recommend that you create copies of them in your own driver folder rather than reference them where they are, as I cannot guarantee that they won't be changed or removed as part of future updates.
Regarding your questions:
1) Yes, the bonemask files contain settings for per-bone blends. The order of the values matches the order of the bones in the hand skeleton defined in the docs on the OpenVR GitHub (see the sketch after this list): https://github.com/ValveSoftware/openvr/wiki/Hand-Skeleton
2) These are used by the driver to convert raw controller input into hand poses. The poses are then provided to the game through the Skeletal Input API.
3) The driver API is very simple: you just give it a list of transforms, one for each bone (the sketch after this list shows the corresponding call). How you come up with those transforms is up to you. If your input device provides a fully posed hand skeleton, you can just pass those transforms along (after converting them to OpenVR coordinates). If the input device provides more abstract information, like finger curl values or buttons & joysticks, then you'll need to create the bone transforms based on that information. We do this by blending and masking the set of animations in the files you found. You're free to use those in your own driver if it helps.
4) No direct documentation per se. Blending animations to create performances is a fairly common practice in video games, though, and there's a lot of literature available on how to do it. You can also get a feel for what's involved by playing around with examples in existing game engines like Unity, Unreal, Godot, or even our own Source 2 engine, which is available through the Half-Life: Alyx workshop.
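To make 1) and 3) above more concrete, here is a minimal sketch of the driver side. The bone ordering follows the Hand Skeleton wiki page linked above, and the update call uses `IVRDriverInput::UpdateSkeletonComponent()` from `openvr_driver.h`; the identity pose values are placeholders, not the actual data a real driver would submit:

```cpp
#include <openvr_driver.h>
#include <array>

// Bone indices for the SteamVR hand skeleton, matching the order documented
// on the OpenVR wiki. Each value in a bonemask file lines up with one index.
enum HandSkeletonBone : int
{
	eBone_Root = 0,
	eBone_Wrist,
	eBone_Thumb0, eBone_Thumb1, eBone_Thumb2, eBone_Thumb3,
	eBone_IndexFinger0, eBone_IndexFinger1, eBone_IndexFinger2, eBone_IndexFinger3, eBone_IndexFinger4,
	eBone_MiddleFinger0, eBone_MiddleFinger1, eBone_MiddleFinger2, eBone_MiddleFinger3, eBone_MiddleFinger4,
	eBone_RingFinger0, eBone_RingFinger1, eBone_RingFinger2, eBone_RingFinger3, eBone_RingFinger4,
	eBone_PinkyFinger0, eBone_PinkyFinger1, eBone_PinkyFinger2, eBone_PinkyFinger3, eBone_PinkyFinger4,
	eBone_Aux_Thumb, eBone_Aux_IndexFinger, eBone_Aux_MiddleFinger, eBone_Aux_RingFinger, eBone_Aux_PinkyFinger,
	eBone_Count // 31 bones total
};

// Push one full set of bone transforms to a previously created skeleton
// component. skeletonHandle comes from IVRDriverInput::CreateSkeletonComponent().
void UpdateHandSkeleton( vr::VRInputComponentHandle_t skeletonHandle )
{
	std::array<vr::VRBoneTransform_t, eBone_Count> bones{};

	for ( auto &bone : bones )
	{
		// Placeholder identity pose; a real driver fills these from its
		// hand-pose model or blended animation frames.
		bone.position = { { 0.f, 0.f, 0.f, 1.f } }; // x, y, z, w
		bone.orientation = { 1.f, 0.f, 0.f, 0.f };  // w, x, y, z
	}

	// Provide the pose for both motion ranges the API supports.
	vr::VRDriverInput()->UpdateSkeletonComponent( skeletonHandle,
		vr::VRSkeletalMotionRange_WithController, bones.data(), eBone_Count );
	vr::VRDriverInput()->UpdateSkeletonComponent( skeletonHandle,
		vr::VRSkeletalMotionRange_WithoutController, bones.data(), eBone_Count );
}
```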
Appreciate your feedback and information, Joe. Thank you.
Hi @joevdh,
We've stumbled upon information and resources that have helped us use the default anim_openclose.glb from the Index controller and map the bone transforms to finger curl values.
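In case it's useful context, here is a rough sketch of the kind of mapping we mean: per-bone interpolation between the open and closed frames of the open/close animation, weighted by a 0-1 curl value. The `Slerp` helper and the pose inputs are our own stand-ins, not code from the Index driver:

```cpp
#include <openvr_driver.h>
#include <cmath>

// Spherical interpolation between two bone orientations (minimal, unoptimized).
static vr::HmdQuaternionf_t Slerp( const vr::HmdQuaternionf_t &a, vr::HmdQuaternionf_t b, float t )
{
	float dot = a.w * b.w + a.x * b.x + a.y * b.y + a.z * b.z;
	if ( dot < 0.f ) { b.w = -b.w; b.x = -b.x; b.y = -b.y; b.z = -b.z; dot = -dot; }
	if ( dot > 0.9995f ) // Nearly identical: fall back to lerp + normalize.
	{
		vr::HmdQuaternionf_t q = { a.w + t * ( b.w - a.w ), a.x + t * ( b.x - a.x ),
			a.y + t * ( b.y - a.y ), a.z + t * ( b.z - a.z ) };
		float len = std::sqrt( q.w * q.w + q.x * q.x + q.y * q.y + q.z * q.z );
		q.w /= len; q.x /= len; q.y /= len; q.z /= len;
		return q;
	}
	float theta = std::acos( dot );
	float sa = std::sin( ( 1.f - t ) * theta ) / std::sin( theta );
	float sb = std::sin( t * theta ) / std::sin( theta );
	return { sa * a.w + sb * b.w, sa * a.x + sb * b.x, sa * a.y + sb * b.y, sa * a.z + sb * b.z };
}

// Blend one bone between the "open" and "closed" keyframes by a 0-1 curl value.
// openPose/closedPose would come from the first/last frames of anim_openclose.glb.
vr::VRBoneTransform_t BlendBone( const vr::VRBoneTransform_t &openPose,
	const vr::VRBoneTransform_t &closedPose, float curl )
{
	vr::VRBoneTransform_t out;
	for ( int i = 0; i < 4; i++ )
		out.position.v[i] = openPose.position.v[i] + curl * ( closedPose.position.v[i] - openPose.position.v[i] );
	out.orientation = Slerp( openPose.orientation, closedPose.orientation, curl );
	return out;
}
```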
Q1. Below is our current understanding of some aspects of working with this system. Could you confirm whether this is correct and whether we are on the right path?
However, since we are using the skeleton pose/animation made for the Index controller, our controller inside the VR environment is misaligned and offset relative to the physical world. We've made many attempts to rotate and reposition the skeleton via Maya, and thus modify the animation so that it aligns with our controllers, but all attempts have been unsuccessful so far. Attached video 1: Index animation on custom controllers
Q2. Could you please confirm whether any of my hypotheses below is correct? Would you be able to provide some further pointers on what we should do to align our controllers to the skeleton, or the skeleton to the controllers?
I have a feeling that hypothesis 3 might be it. We have NOT matched the controllers to the Vive controller, as I found that as long as all relevant files refer to the same and correct location coordinates and rotation values, we could get away with leaving our controllers at the origin (0,0,0) with a rotation of (0,0,0) degrees (which made some aspects of the driver development easier).
Can you clarify how the animation data is being used? You have your own custom driver, right? Have you also made your own system for loading and blending animations to feed to the Skeletal Input System? Or are you somehow trying to use the animation blending code from the Index controller driver with your own driver?
Hi @joevdh,
Apologies for the late response.
Yes, we're doing this in our own driver. We've made our own system for loading the glb animation files and have been using the Index controller open/close animation as a reference.
Since our last response we also spotted the option to set the base pose path, which sets the origin of the skeleton. To apply the offset, should we use that and define our own origin relative to the rendermodel, or should we offset the bones relative to the root bone in the animation file?
Thanks and looking forward to hearing from you.
Ah, ok. So the expectation is that each controller defines a location (an offset from the controller) that acts as the "origin" for the hand animation. On the app side, this is the location that SteamVR will return when the app calls GetPoseActionData() with the VRActionHandle_t of the skeleton action. On the driver side, this is the same pose you set with the pchBasePosePath parameter passed to the IVRDriverInput::CreateSkeletonComponent() function. For the Index controller, this pose location is a centimeter or two in front of the top of the controller.
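For illustration, here is a minimal sketch of registering a skeleton component with a base pose path. The component name, skeleton path, and pose path strings are just examples; only the `IVRDriverInput::CreateSkeletonComponent()` signature comes from `openvr_driver.h`:

```cpp
#include <openvr_driver.h>

// Register a right-hand skeleton component during device activation.
// ulContainer is the device's property container from Activate().
vr::VRInputComponentHandle_t RegisterSkeleton( vr::PropertyContainerHandle_t ulContainer )
{
	vr::VRInputComponentHandle_t skeletonHandle = vr::k_ulInvalidInputComponentHandle;

	vr::VRDriverInput()->CreateSkeletonComponent(
		ulContainer,
		"/input/skeleton/right",          // component name (example path)
		"/skeleton/hand/right",           // skeleton path (example)
		"/pose/raw",                      // pchBasePosePath: the reference "origin" pose
		vr::VRSkeletalTracking_Estimated, // tracking level for curl-driven poses
		nullptr, 0,                       // optional grip-limit transforms
		&skeletonHandle );

	return skeletonHandle;
}
```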
The app is responsible for taking the skeletal input you provide and placing it at this reference location. That means the animation data needs to also be relative to this location. The root bone of the skeletal animation data should have zero rotation and translation, and the wrist bone should be offset from the root such that it places the virtual hand at the same location as the user's real hand. The data should be authored this way in Maya or Blender, then exported and converted to the units and coordinate system of SteamVR to work properly. Also, any scale needs to be baked down into the translation of the bones, since the system does not preserve any scale from the source data.
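As one example of that conversion step, assuming a default Blender setup (right-handed, Z-up) with translations exported in centimeters, bone translations can be remapped into SteamVR's right-handed, Y-up, meters convention with any source scale baked directly into the values. Your authoring and export settings may differ, so treat the axis mapping here as an assumption to verify:

```cpp
#include <openvr_driver.h>

// Convert a bone translation authored in an assumed Blender setup
// (right-handed, Z-up, centimeters) to SteamVR conventions
// (right-handed, Y-up, -Z forward, meters). Any uniform scale from
// the source file is baked directly into the translation values.
vr::HmdVector4_t ToSteamVRTranslation( float x, float y, float z, float sourceScale = 1.f )
{
	const float cmToMeters = 0.01f;
	const float s = cmToMeters * sourceScale;

	vr::HmdVector4_t out;
	out.v[0] = x * s;  // X stays X
	out.v[1] = z * s;  // Blender's up (Z) becomes SteamVR's up (Y)
	out.v[2] = -y * s; // Blender's forward (Y) becomes SteamVR's -Z
	out.v[3] = 1.f;    // homogeneous w
	return out;
}
```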
So with all this in mind, I think you should definitely define your own offset (location, pose, whatever you want to call it) from your rendermodel, and use that throughout your pipeline (from Maya or Blender, through to the driver). If you want to use the data from the index controller, then you'll need to set your offset such that the hand will line up with your controller. But unless your controller is the same shape as the Index controller then you'll probably want to make your own animations anyway.
Does that help?
Hi @joevdh
Thank you very much for the detailed feedback. That is very helpful indeed and confirms what we understand so far about the animation files and how they work.
In the Index controller's driver folder, there are skeleton animations for the close squeeze, and also poses for trackpad movements and, similarly, button presses. I used Blender to open and visualise these.

Context

We have our own custom hardware, which is working, and we would like to provide finger tracking and animations for it. I have used Blender to export fbx files with the close-squeeze skeleton animation, keeping the following in mind for our specific hardware and its form factor:

a. Valve's default VR glove model: Steam\steamapps\common\SteamVR\resources\rendermodels\vr_glove\vr_glove_right_model.fbx
b. Used the default skeleton provided in the SteamVR resources folder: Steam\steamapps\common\SteamVR\resources\skeletons\vr_glove_right_skeleton.fbx
c. Used SteamVR's coordinate space system

Question