icedwater opened 1 month ago
Hey @icedwater, the indexing seems to be broken; I fixed it in my fork a while ago (https://github.com/JasonNero/diffusion-motion-inbetweening/commit/8689999828c8d1f7ef6f19d1c15fe16b4dadcaae).
If you prefer BVH files, I recommend looking at `joints2bvh.py` from MoMask (originally from "A Deep Learning Framework For Character Motion Synthesis and Editing"). It takes the SMPL joint positions, converts them to joint rotations via inverse kinematics, and saves those to a BVH file.
Thanks @JasonNero! I had a comment saved on my other machine about the indexing, but other tasks got in the way :)
I'm not sure if something in the generation process added an extra dimension to the ndarray somehow, but I was able to make a little progress by replacing `['motion']` with `['motion'][0]` in lines 15 and 26 (probably also needed in line 33). However, this led to another issue later on in render_mesh, where `npy2obj.real_num_frames` was no longer an int/int array, so I amended `['length']` to `['length'][0]` in line 34 as well.
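The extra-dimension problem can be reproduced with a synthetic stand-in for the saved dict. The shapes below are illustrative, not the exact output of the generate script, but they show why indexing with `[0]` fixes both `'motion'` and `'length'`:

```python
import numpy as np

# Synthetic stand-in for the dict stored in results.npy.
# Illustrative shapes: a singleton leading axis, 2 samples,
# 22 joints, 3 coordinates, 60 frames.
results = {
    "motion": np.zeros((1, 2, 22, 3, 60)),  # extra leading axis of size 1
    "length": np.array([60, 60]),
}

# Indexing with ['motion'] keeps the extra axis ...
print(results["motion"].shape)     # (1, 2, 22, 3, 60)
# ... while ['motion'][0] drops it.
print(results["motion"][0].shape)  # (2, 22, 3, 60)

# Same idea for 'length': ['length'][0] yields a scalar again.
real_num_frames = results["length"][0]
print(int(real_num_frames))        # 60
```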
> If you prefer BVH files, I recommend looking at `joints2bvh.py` from MoMask ...
I'm not sure how `joints2bvh.py` can be used with the `results.npy` or `sample00_rep00_smpl_params.npy` formats, because the joints sequence isn't obviously represented in either of those outputs. Do you have an idea of what needs to be extracted or converted from there?
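A quick way to see what such a `.npy` file actually contains is to load it and print each key's shape. The snippet below writes a small stand-in file first so it is self-contained; the key names and shapes are illustrative, not the exact generate-script output:

```python
import tempfile
from pathlib import Path

import numpy as np

# Write a small stand-in results.npy (keys and shapes are illustrative).
tmp = Path(tempfile.mkdtemp()) / "results.npy"
np.save(tmp, {
    "motion": np.zeros((1, 3, 22, 3, 120)),
    "text": ["a person walks"] * 3,
    "lengths": np.array([120, 120, 120]),
})

# A .npy file holding a dict must be loaded with allow_pickle=True
# and unwrapped with .item() (it comes back as a 0-d object array).
results = np.load(tmp, allow_pickle=True).item()
for key, value in results.items():
    print(key, getattr(value, "shape", len(value)))
```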
Heyhey,

you can just load `results.npy` and pass the joint positions from `results["motion"]` to the `converter.convert(...)` function. Both projects use the SMPL skeleton, so they have the same joint order.

I might have slightly different shapes than you, but I hope this helps you anyway:
```python
from pathlib import Path

import numpy as np

# Import path assumes the MoMask repo layout.
from visualization.joints2bvh import Joint2BVHConvertor

path = Path("./path/to/your/results.npy")
results = np.load(path, allow_pickle=True).item()
motion = results["motion"]

converter = Joint2BVHConvertor()
for i_rep in range(results["num_repetitions"]):
    for i_sample in range(results["num_samples"]):
        joints = motion[i_rep, i_sample].transpose(2, 0, 1)  # (J, 3, F) -> (F, J, 3)
        file_name = f"sample{i_sample:02d}_rep{i_rep:02d}"
        bvh_path = path.parent / (file_name + ".bvh")
        converter.convert(joints, bvh_path, foot_ik=False)
```
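The transpose in that loop can be checked in isolation: the converter wants frames-first positions `(F, J, 3)`, while each sample in `motion` is stored joints-first as `(J, 3, F)`. The joint/frame counts below are illustrative:

```python
import numpy as np

J, F = 22, 40  # illustrative joint and frame counts
joints_jcf = np.arange(J * 3 * F).reshape(J, 3, F)  # (J, 3, F)

# Permute the axes so frames come first: (J, 3, F) -> (F, J, 3).
joints_fjc = joints_jcf.transpose(2, 0, 1)
print(joints_fjc.shape)  # (40, 22, 3)

# Sanity check: coordinate 1 of joint 7 at frame 5 is preserved.
assert joints_fjc[5, 7, 1] == joints_jcf[7, 1, 5]
```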
Thanks! I might have taken much longer to figure out the transposition of the motion on my own. I was finally able to export a usable BVH. I've cleaned up the `main()` bit of `joints2bvh.py` in my fork of momask, and `convert_one_result()` works fine for me.
Hi @JasonNero, I see that you have forked this repo and I was wondering if you support custom inputs? (P.S. your repo doesn't allow creating issues.)
Hi @kbrodt, what specifically do you mean by custom inputs? I'm trying to figure out how to train on new rigs which are not AMASS-based (e.g., 30-joint or 50-joint skeletons).
I have two (or more) poses in SMPL format (21 joint rotations + 1 global orientation) and I want to inbetween them.
I'm trying to generate some animation-friendly data from the inferred .npy, such as .obj, but preferably .bvh or .fbx.
After following the suggested command, I get the error below. Is there a problem with the way the data is stored in the numpy ndarray?
I checked the shape of the output and it has 5 dimensions, but if I add a `_` in front of line 18 to ignore the first variable, I run into a different error. Is there something I'm missing about the dimensions of the data?
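One quick way to chase down a stray dimension like that is `np.squeeze`, which drops size-1 axes without touching the data. The 5-D shape below is illustrative, assuming the extra axis is a singleton batch dimension in front of the usual layout:

```python
import numpy as np

# Illustrative 5-D output: a singleton batch axis in front of
# a (samples, joints, coords, frames) layout.
motion = np.zeros((1, 3, 22, 3, 196))
print(motion.ndim)     # 5

# axis=0 removes just the leading size-1 axis (and raises if it
# isn't actually size 1, which makes the assumption explicit).
squeezed = np.squeeze(motion, axis=0)
print(squeezed.shape)  # (3, 22, 3, 196)

# Unpacking then works if the number of remaining axes is known:
n_samples, n_joints, n_coords, n_frames = squeezed.shape
print(n_samples, n_frames)  # 3 196
```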