When calling the model with batched inputs (batch_size > 1), it fails with the error shown below.
File "/home/schaefer/.miniconda3/envs/hpe3d/lib/python3.7/site-packages/smplx/body_models.py", line 376, in forward
self.lbs_weights, pose2rot=pose2rot, dtype=self.dtype)
File "/home/schaefer/.miniconda3/envs/hpe3d/lib/python3.7/site-packages/smplx/lbs.py", line 205, in lbs
J_transformed, A = batch_rigid_transform(rot_mats, J, parents, dtype=dtype)
File "/home/schaefer/.miniconda3/envs/hpe3d/lib/python3.7/site-packages/smplx/lbs.py", line 347, in batch_rigid_transform
rel_joints.view(-1, 3, 1)).view(-1, joints.shape[1], 4, 4)
RuntimeError: view size is not compatible with input tensor's size and stride (at least one dimension spans across two contiguous subspaces). Use .reshape(...) instead.
I think an easy fix would be to call .contiguous() on the joint tensor J before the failing .view() call, something like the sketch below.
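(Rough sketch only, assuming batch_rigid_transform in smplx/lbs.py still contains the line shown in the traceback; rot_mats, rel_joints, transform_mat and joints are the names used there.)

```python
# smplx/lbs.py, batch_rigid_transform():
# make the relative joint tensor contiguous so the subsequent .view()
# has a compatible memory layout
rel_joints = rel_joints.contiguous()
transforms_mat = transform_mat(
    rot_mats.view(-1, 3, 3),
    rel_joints.view(-1, 3, 1)).view(-1, joints.shape[1], 4, 4)
```

Alternatively, doing what the error message itself suggests and replacing .view(...) with .reshape(...) on that line should also work.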
Thank you very much :)