robotology / whole-body-controllers

Simulink-based whole body controllers for humanoid robots.

Apply constraints on Kronecker product #69

Closed gabrielenava closed 4 years ago

gabrielenava commented 4 years ago

I found this theory I developed some months ago about applying constraints when computing the Kronecker product. In particular, we would like to find the matrix $X$ that satisfies the matrix equation:

$AXB = C$

when $X$ is subject to some constraints, e.g. when $X$ is required to be diagonal or symmetric.

Adding the theory here to avoid losing it.

gabrielenava commented 4 years ago

Consider a square matrix $X \in \mathbb{R}^{n \times n}$. Define the square matrix $X_d$ such that:

\begin{equation} X_d(i,j) = \begin{cases} X(i,j) & \text{if } i = j \\ 0 & \text{if } i \neq j \end{cases} \end{equation}

Define also the square matrix $X_t = X^\top$.

The question is: how can the mappings $X \to X_d$ and $X \to X_t$ be written analytically?

Answer:

The matrix $X_d$ can be obtained from $X$ as follows:

\begin{equation} X_d = \sum_{i=1}^n E_i X E_i \end{equation}

where $E_i$ is a matrix of zeros except for the element $E_i(i,i)$, which is equal to one.
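
As a quick numerical check (not part of the original derivation), here is a minimal MATLAB sketch that verifies the identity against `diag(diag(X))`:

```matlab
% Verify that X_d = sum_i E_i * X * E_i extracts the diagonal of X
n  = 4;
X  = rand(n);
Xd = zeros(n);
for i = 1:n
    Ei = zeros(n);
    Ei(i,i) = 1;                      % E_i: zeros except E_i(i,i) = 1
    Xd = Xd + Ei * X * Ei;            % keeps only the (i,i) entry of X
end
assert(isequal(Xd, diag(diag(X))));   % X_d is the diagonal part of X
```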

The matrix $X_t$ can be obtained from $X$ as follows:

\begin{equation} X_t = X_d + \sum_{k=1}^{\frac{n(n-1)}{2}} E_k (X - X_d) E_k = \sum_{i=1}^n E_i X E_i + \sum_{k=1}^{\frac{n(n-1)}{2}} E_k \left( X - \sum_{i=1}^n E_i X E_i \right) E_k \end{equation}

where $E_{k}$ is a matrix of zeros except for the $k$-th element of the upper triangular part and its corresponding element of the lower triangular part, which are equal to $1$.
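
A minimal MATLAB sketch of this identity, assuming the $E_k$ are enumerated over the pairs $(i,j)$ with $i < j$ (the enumeration order does not matter):

```matlab
% Verify that X_t = X_d + sum_k E_k * (X - X_d) * E_k equals X'
n  = 4;
X  = rand(n);
Xd = diag(diag(X));                   % diagonal part of X
Xt = Xd;
for i = 1:n-1
    for j = i+1:n
        Ek = zeros(n);
        Ek(i,j) = 1;                  % k-th upper-triangular element ...
        Ek(j,i) = 1;                  % ... and its lower-triangular mirror
        Xt = Xt + Ek * (X - Xd) * Ek; % swaps the (i,j) and (j,i) entries
    end
end
assert(norm(Xt - X.') < 1e-12);       % X_t matches the transpose of X
```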

Use case:

Imagine one is given the following system: $AXB = C$, where $A \in \mathbb{R}^{p \times n}$, $B \in \mathbb{R}^{n \times m}$, $C \in \mathbb{R}^{p \times m}$. The matrix $X \in \mathbb{R}^{n \times n}$ is unknown.

The equation can be solved by applying the Kronecker product, which reduces the matrix equation to a linear system:

\begin{equation} Mx = c \end{equation}

where $M = (B^\top \otimes A)$, $x = \mathrm{vec}(X)$, $c = \mathrm{vec}(C)$. The solution $x = M^\dagger c$ is unconstrained, meaning that the resulting matrix $X$ can be any square matrix that satisfies the equation. However, there might be situations in which we would like the matrix $X$ to have specific properties, e.g. we may want $X$ to be diagonal or symmetric.
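
A minimal MATLAB sketch of the unconstrained solution (dimensions and data are illustrative):

```matlab
% Solve A*X*B = C for X via vectorization: vec(A*X*B) = kron(B.', A) * vec(X)
n = 3; p = 4; m = 5;
A = rand(p, n);  B = rand(n, m);
Xtrue = rand(n);                      % ground truth used to generate C
C = A * Xtrue * B;

M = kron(B.', A);                     % M = B^T (kron) A
c = C(:);                             % c = vec(C), column-major in MATLAB
x = pinv(M) * c;                      % x = M^+ c (minimum-norm solution)
X = reshape(x, n, n);                 % undo the vectorization

disp(norm(A * X * B - C));            % residual should be ~0
```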

In case we would like to have a diagonal matrix as the result, we can modify the original equation as follows:

\begin{equation} A X_d B = A \left( \sum_{i=1}^n E_i X E_i \right) B = C \end{equation}

The problem can still be solved as before by inverting the linear relation $Mx = c$. However, $M$ must be defined as follows:

\begin{equation} M = \sum_{i=1}^n(E_iB)^\top \otimes (AE_i) \end{equation}
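
A minimal MATLAB sketch of the diagonal-constrained case (illustrative dimensions; note that the minimum-norm solution $M^\dagger c$ automatically zeros the off-diagonal entries of $X$, because the corresponding columns of $M$ are zero):

```matlab
% Diagonal-constrained solve: M = sum_i kron((E_i*B).', A*E_i)
n = 3; p = 4; m = 5;
A = rand(p, n);  B = rand(n, m);
Xtrue = diag(rand(n, 1));             % diagonal ground truth
C = A * Xtrue * B;

M = zeros(p * m, n * n);
for i = 1:n
    Ei = zeros(n);
    Ei(i,i) = 1;
    M = M + kron((Ei * B).', A * Ei);
end
x = pinv(M) * C(:);
X = reshape(x, n, n);                 % comes out diagonal (zero columns of M)

disp(norm(A * X * B - C));            % residual should be ~0
```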

The case of a symmetric matrix is a bit more complicated. First of all, a symmetric matrix must satisfy $X = \frac{X+X^\top}{2}$. Recall that the transpose can be written as $X_t = \sum_{i=1}^n E_i X E_i + \sum_{k=1}^{\frac{n(n-1)}{2}} E_k \left( X - \sum_{i=1}^n E_i X E_i \right) E_k$. Substituting the symmetry constraint into the original equation gives:

\begin{equation} C = A \frac{X+X^\top}{2} B = \frac{AXB}{2} + \frac{1}{2} A \left( \sum_{i=1}^n E_i X E_i + \sum_{k=1}^{\frac{n(n-1)}{2}} E_k \left( X - \sum_{i=1}^n E_i X E_i \right) E_k \right) B \end{equation}

This complicated system can again be reduced to $Mx = c$, provided that $M$ is defined as follows:

\begin{equation} M = \frac{B^\top \otimes A}{2} + \frac{1}{2} \sum_{i=1}^n (E_i B)^\top \otimes (A E_i) + \frac{1}{2} \sum_{k=1}^{\frac{n(n-1)}{2}} (E_k B)^\top \otimes (A E_k) - \frac{1}{2} \sum_{k=1}^{\frac{n(n-1)}{2}} \sum_{i=1}^n (E_i E_k B)^\top \otimes (A E_k E_i) \end{equation}
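
A minimal MATLAB sketch assembling this $M$ term by term and checking that the recovered $X$ solves the symmetric problem (again with illustrative dimensions):

```matlab
% Symmetric-constrained solve: assemble the four terms of M above
n = 3; p = 4; m = 5;
A = rand(p, n);  B = rand(n, m);
S = rand(n);  Xtrue = (S + S.') / 2;  % symmetric ground truth
C = A * Xtrue * B;

M = kron(B.', A) / 2;                 % first term: (B^T kron A)/2
for i = 1:n
    Ei = zeros(n);  Ei(i,i) = 1;
    M = M + kron((Ei * B).', A * Ei) / 2;
end
for i = 1:n-1
    for j = i+1:n
        Ek = zeros(n);  Ek(i,j) = 1;  Ek(j,i) = 1;
        M = M + kron((Ek * B).', A * Ek) / 2;
        for d = 1:n                   % subtract the E_i E_k cross terms
            Ed = zeros(n);  Ed(d,d) = 1;
            M = M - kron((Ed * Ek * B).', A * Ek * Ed) / 2;
        end
    end
end

x  = pinv(M) * C(:);
X  = reshape(x, n, n);
Xs = (X + X.') / 2;                   % M maps X through its symmetric part
disp(norm(A * Xs * B - C));           % residual should be ~0
```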

Example with a $2 \times 2$ matrix

Consider the following matrix $X = \begin{bmatrix} 5 & 4 \\ 6 & 13 \end{bmatrix}$.