isl-org / Open3D

Open3D: A Modern Library for 3D Data Processing
http://www.open3d.org

OBB #4935

Open mjm522 opened 2 years ago

mjm522 commented 2 years ago

My Question

Hi all, I am trying to get a specific orientation for a mesh. For that, once the mesh is created, I transform it by the inverse of its current OBB pose so that the resulting orientation becomes identity. The position does get centred, but the orientation that is returned afterwards is still not identity. Is there a way to force or retain the orientation? This does not happen if the box is created with symmetric dimensions. Is it because get_oriented_bounding_box recomputes the convex hull and then redoes the whole computation to obtain the orientation? Below is a code snippet that exhibits the problem. The mesh is probably being rotated; the question is then how to get its current orientation.

import numpy as np
import open3d as o3d

def get_mesh_pose(mesh):
    pose = np.eye(4)
    bb = mesh.get_oriented_bounding_box()
    pose[:3, :3] = bb.R
    pose[:3, 3] = bb.center 
    return pose

def create_mesh(width, height, depth):
    return o3d.geometry.TriangleMesh.create_box(width=width,
                                                height=height,
                                                depth=depth)

mesh = create_mesh(4, 3, 4)

curr_pose = get_mesh_pose(mesh)

print("current pose of the mesh")
print(repr(curr_pose))

print("product of the inverse")
print(np.linalg.inv(curr_pose).dot(curr_pose))

mesh.transform(np.linalg.inv(curr_pose))

print("after applying transformation")
print(repr(get_mesh_pose(mesh)))

new_pose = np.array([[ 0.33, -0.94, -0.  ,  0.2  ],
                    [-0.94, -0.33, -0.  , -1.  ],
                    [ 0.  ,  0.  , -1.  ,  0.  ],
                    [ 0.  ,  0.  ,  0.  ,  1.  ]])

mesh.transform(new_pose)

print("after applying new transformation")
print(repr(get_mesh_pose(mesh)))

The above code produces the following output.

current pose of the mesh
array([[ 1. ,  0. ,  0. ,  2. ],
       [ 0. ,  0. , -1. ,  1.5],
       [ 0. ,  1. ,  0. ,  2. ],
       [ 0. ,  0. ,  0. ,  1. ]])
product of the inverse
[[1. 0. 0. 0.]
 [0. 1. 0. 0.]
 [0. 0. 1. 0.]
 [0. 0. 0. 1.]]
after applying transformation
array([[ 0.,  1.,  0.,  0.],
       [ 1.,  0.,  0.,  0.],
       [ 0.,  0., -1.,  0.],
       [ 0.,  0.,  0.,  1.]])
after applying new transformation
array([[ 0. ,  1. ,  0. ,  0.2],
       [ 1. ,  0. ,  0. , -1. ],
       [ 0. ,  0. , -1. ,  0. ],
       [ 0. ,  0. ,  0. ,  1. ]])
yuecideng commented 2 years ago

Is it because get_oriented_bounding_box recomputes the convex hull and then redoes the whole computation to obtain the orientation?

Yes, see this C++ implementation:

    Eigen::Vector3d mean;
    Eigen::Matrix3d cov;
    std::tie(mean, cov) = hull_pcd.ComputeMeanAndCovariance();

    Eigen::SelfAdjointEigenSolver<Eigen::Matrix3d> es(cov);
    Eigen::Vector3d evals = es.eigenvalues();
    Eigen::Matrix3d R = es.eigenvectors();

    if (evals(1) > evals(0)) {
        std::swap(evals(1), evals(0));
        Eigen::Vector3d tmp = R.col(1);
        R.col(1) = R.col(0);
        R.col(0) = tmp;
    }
    if (evals(2) > evals(0)) {
        std::swap(evals(2), evals(0));
        Eigen::Vector3d tmp = R.col(2);
        R.col(2) = R.col(0);
        R.col(0) = tmp;
    }
    if (evals(2) > evals(1)) {
        std::swap(evals(2), evals(1));
        Eigen::Vector3d tmp = R.col(2);
        R.col(2) = R.col(1);
        R.col(1) = tmp;
    }
    R.col(0) /= R.col(0).norm();
    R.col(1) /= R.col(1).norm();
    R.col(2) = R.col(0).cross(R.col(1));

The rotation matrix depends on the eigenvectors of the covariance matrix of the convex-hull points, and the columns are reordered by decreasing eigenvalue. I suggest you define a rotation (or transformation) variable (for example, a 4x4 numpy.ndarray) in the initial state and then keep updating this variable every time you apply a transformation to the mesh, so that it records and represents the current pose.
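
For example, here is a minimal sketch of that bookkeeping (apply_transform is a hypothetical helper, not an Open3D function): seed the pose from the OBB once, then accumulate every transform you apply yourself instead of querying the OBB again.

import numpy as np
import open3d as o3d

mesh = o3d.geometry.TriangleMesh.create_box(width=4, height=3, depth=4)

# Seed the tracked pose from the OBB once.
bb = mesh.get_oriented_bounding_box()
pose = np.eye(4)
pose[:3, :3] = bb.R
pose[:3, 3] = bb.center

def apply_transform(mesh, pose, T):
    # Apply T to the mesh and accumulate it into the tracked pose.
    mesh.transform(T)
    return T @ pose

# Bring the mesh to an identity pose; the tracked pose stays consistent
# even though get_oriented_bounding_box() may report a reordered rotation.
pose = apply_transform(mesh, pose, np.linalg.inv(pose))
print(pose)  # identity, by construction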