UPC-ViRVIG / MMVR

Repository for the SCA 2022 paper "Combining Motion Matching and Orientation Prediction to Animate Avatars for Consumer-Grade VR Devices"
https://upc-virvig.github.io/MMVR/

Few doubts? #2

Closed · tejaswiniiitm closed this issue 2 years ago

tejaswiniiitm commented 2 years ago

Hi @JLPM22 ,

Thanks for the wonderful project! I have gone through the paper and the code to understand the project. I am trying to train and generate databases for new motion data, and I have some doubts regarding the project (I'm asking all my doubts at once to save both of our time! 🙂):

  1. Can we record animations (i.e., prepare .bvh files, as far as I understand) with Perception Neuron motion capture instead of Xsens, and use them as data for this project?

  2. Does this project work with just one controller + HMD instead of two controllers + HMD? My scenario is a game where both hands hold a single controller. If not, what major changes would need to be made to work with a single controller + HMD?

  3. The training data for the MLP (2.5 hours of motion capture) and the motion matching databases (5 minutes of mocap data) are different. In what order do they need to be prepared, and which scripts are used to generate these two datasets?

    • I found a process to generate the pose and feature datasets here, but I did not quite understand how to prepare the final motion matching database that is searched during inference.
  4. The Animations folder in the given data contains .bvh files. If there are two animations, such as dancing step 1 and dancing step 2, do they need to be recorded as two animation files, i.e., two .bvh files? Your data has fewer .bvh files than I expected, so I am a bit confused!

Thank you!

JLPM22 commented 2 years ago

Hello @tejaswiniiitm ! Thanks for your interest in the project :) I'll reply question by question:

  1. As long as that motion capture system (or any other software) can produce a .bvh file, it should be fine. In some cases, if the Y and Z axes (or other axes) are flipped, the character may appear to walk on a plane other than the ground, which will require preprocessing with other software (a quick way to check the up axis is sketched below, after this list).

  2. It should work with minor changes. In the GameObject VRCharacterController, in its VRDirectionPredictor component, point both the Left and Right controller references to the same controller. Then, in the GameObject Avatar, update the ArmIK components from FinalIK so that both use that controller as the target.

  3. Sorry, I did not document this properly due to lack of time :(. Data is defined using Unity ScriptableObjects (you can create them by right-clicking in Unity's project window, Create/MotionMatching/MotionMatchingData or Create/MotionSynthesis/MotionSynthesisData, or just duplicate the examples). For the motion matching data, you can follow the process in the other repository. Once you press Generate Databases, reference the "MotionMatchingData" file in the MotionMatchingController as in the demo scene; the MotionMatchingController then automatically finds the generated databases. The training data is defined similarly, but using the MotionSynthesisData ScriptableObject. The Python training code uses two of them (lines 43 and 44 of train_direction.py), one for training and another for testing; you can find them in Unity under the names TrainingMSData and TestMSData. You can modify them and press Generate Databases, or create new ones and update the reference in the VRDirectionPredictor component (a hypothetical way to make the two references in the training script configurable is sketched at the end of this reply).

  4. I'm not sure I fully understand this question, but the way Motion Matching typically works, each database contains many animations (one or multiple .bvh files) of the same style, and Motion Matching chooses the best poses at runtime. However, you can also have two databases, each using a different .bvh file, and switch between them at runtime to change the dancing style.
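Regarding point 1, here is a minimal diagnostic sketch (not part of this repository; the heuristic and the helper name are my own) to estimate which axis a .bvh file treats as "up" for the root joint, so you can spot flipped Y/Z axes before importing:

```python
# Hypothetical helper, not part of MMVR: guess the "up" axis of a BVH file
# by looking at the root joint's position channels. For a character standing
# on the ground, the up-axis channel usually has the largest mean value
# (roughly the hip height), while the ground-plane channels stay near zero.
import sys

def root_up_axis(path):
    with open(path) as f:
        lines = [ln.strip() for ln in f]

    # Channel order declared for the ROOT joint, e.g.
    # "CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation"
    channels = next(ln.split()[2:] for ln in lines if ln.startswith("CHANNELS"))

    # Motion data starts right after the "Frame Time:" line.
    start = next(i for i, ln in enumerate(lines) if ln.startswith("Frame Time")) + 1
    frames = [list(map(float, ln.split())) for ln in lines[start:] if ln]

    # Mean value of each root position channel over all frames (heuristic).
    means = {c: sum(f[i] for f in frames) / len(frames)
             for i, c in enumerate(channels) if c.endswith("position")}
    return max(means, key=means.get)

if __name__ == "__main__":
    print("Root 'up' axis looks like:", root_up_axis(sys.argv[1]))
```

If this prints Zposition for a character that should stand on the XZ plane, the file likely needs the axis conversion mentioned above.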

I hope it's clearer now! If you need me to clarify any step in more depth, let me know!
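Regarding point 3, purely as a hypothetical sketch (this is not the actual content of train_direction.py, and the argument names are illustrative), one convenient option is to expose the two database references as command-line arguments instead of editing lines 43–44 by hand:

```python
# Hypothetical sketch, not the actual train_direction.py: expose the two
# MotionSynthesisData exports (training and test) as command-line arguments
# so new databases can be swapped in without editing the script.
import argparse

def parse_args():
    parser = argparse.ArgumentParser(description="Train the direction predictor")
    parser.add_argument("--train-db", required=True,
                        help="Path to the database generated from TrainingMSData")
    parser.add_argument("--test-db", required=True,
                        help="Path to the database generated from TestMSData")
    return parser.parse_args()

if __name__ == "__main__":
    args = parse_args()
    # The rest of the training code would load args.train_db / args.test_db
    # instead of the hard-coded references around lines 43-44.
    print("Training with:", args.train_db)
    print("Evaluating with:", args.test_db)
```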

tejaswiniiitm commented 2 years ago

Hi @JLPM22 Thanks for such a detailed and quick reply! I will try training and running on new data, following the steps above.

tejaswiniiitm commented 2 years ago

Hi @JLPM22 , I am trying to follow the Quick Start part mentioned in the README.

  1. When I run the Demo scene, only the lower body animates correctly; the upper body does not. I have imported FinalIK into my project. When I checked the scene, the Avatar object had two missing scripts. Are they related to FinalIK? What should I do (or which components should I attach) so that the upper body also animates correctly? You can reply with screenshots if that's easier for you!

[screenshot]

  2. In the lower body, walking and turning around work fine, but squatting and crouching do not. I think the database for a different height is not referenced in the project. I found in the paper that different databases are prepared for different height ratios, and I also found MotionMatchingSquatData in the MMData/Data folder, but it is not referenced/used. I found the option to attach a squat dataset as in the picture below, but no squat .asset was generated! So, how do I make it work for different heights?

[screenshot]

Thank you!

JLPM22 commented 2 years ago

Hello @tejaswiniiitm !

  1. It seems to be an issue that depends on the version of Final IK. I updated the Readme with pictures of FinalIK's ArmIK scripts, click here.

  2. I have updated the project to include leg bending (squat, crouching, tiptoes) as in the paper; you may need to pull the project or clone it again. I have added additional instructions in the Quick Start section. I have also updated the Data (.bvh files and MMData) with the leg bending data, so you may need to download it again.

Thank you for the feedback! Hope my answers help :)

tejaswiniiitm commented 2 years ago

Hi @JLPM22 OK, I followed your instructions and it is working fine now. Thanks!