PeizhuoLi / walk-the-dog-unity

Unity framework for motion alignment across different morphologies with no supervision [SIGGRAPH 2024]
https://peizhuoli.github.io/walkthedog/

Setup for Custom Datasets #1

Open kai-lan opened 3 days ago

kai-lan commented 3 days ago

Can I have more instructions on how to visualize motion matching results for two custom datasets?

So I trained with two custom datasets (human A and human B). In the processed data folders, I have Sequences.txt, Description.txt, and Data.bin. In the output folder of the training script, I have Manifolds_0_*.npz, Manifolds_1_*.npz, VQ.npz, and yles100_*_*.onnx. From running offline_motion_matching.py, I have replay_sequence.npz.

I can visualize the pretrained human-dog motion matching by following the instructions. What should I change to visualize my custom data? I tried importing HumanA/Sequences.txt and its corresponding Manifold_0_final.npz into editor-human in the Assets/Projects/DeepPhase/Demos/Retargeting/mm_human2dog scene, and the same for HumanB into editor-dog. Although Human A is running, I don't see HumanB.

Any advice is appreciated!

PeizhuoLi commented 2 days ago

Visualizing motion matching should be possible even without importing any manifold, because replay_sequence.npz simply records which frame from Human B's dataset should be played at which frame. Does any error message show up in Unity when you use your custom data?
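Since replay_sequence.npz is a standard NumPy archive, a quick way to sanity-check it before loading it in Unity is to list its arrays and their shapes. This is only a sketch: the key name `matched_frames` below is an assumption used to make the snippet self-contained, not the repository's actual array name, so list the keys first to see what your file really stores.

```python
import numpy as np

# Stand-in file so the snippet runs on its own; the array name
# "matched_frames" is hypothetical -- point this at your real
# replay_sequence.npz instead.
np.savez("replay_sequence.npz", matched_frames=np.arange(10))

with np.load("replay_sequence.npz") as data:
    # Discover what arrays the file actually contains.
    print(list(data.keys()))
    for key in data.keys():
        print(key, data[key].shape, data[key].dtype)
```

If the frame-index array is shorter than you expect, it may cover only the matched subsequence rather than the whole dataset, which would explain a character not appearing for part of the playback.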

kai-lan commented 1 day ago

Thank you for your reply.

So now I am able to visualize the motion matching results. A few more questions: when choosing a target_id for offline motion matching, does it match a certain subsequence or one entire .bvh file from the input dataset? Does replay_sequence.npz contain motion matching data for one subsequence, one .bvh file, or all sequences of the input dataset? Is it possible to visualize motion matching results for all sequences?