
Inverse matrix

Published on 2024/02/24

Let A \in \R^{N \times N} be a square matrix, let A_n denote its n-th column, and let e_n denote the n-th standard basis vector. If A^{-1} exists, then

\begin{align*} A^{-1} A_n &= e_n \\ A^{-1} \begin{pmatrix} A_1 & A_2 & \dots & A_N \end{pmatrix} &= \begin{pmatrix} e_1 & e_2 & \dots & e_N \end{pmatrix} \end{align*}
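A quick numerical check of this column-wise identity, sketched with NumPy (the concrete 3×3 matrix below is an arbitrary invertible example, not taken from the article):

```python
import numpy as np

# Arbitrary invertible 3x3 example (not from the article).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 4.0]])
A_inv = np.linalg.inv(A)
N = A.shape[0]

for n in range(N):
    col = A[:, n]        # n-th column A_n
    e_n = np.zeros(N)
    e_n[n] = 1.0         # n-th standard basis vector e_n
    assert np.allclose(A_inv @ col, e_n)

# Stacking all columns gives A itself, so A^{-1} A = I.
assert np.allclose(A_inv @ A, np.eye(N))
```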

More generally, for any x \in \R^N,

\begin{align*} A^{-1} \left( \sum_{n=1}^N x_n A_n \right) = x . \end{align*}

This is because

\begin{align*} A x &= \sum_{n=1}^N x_n A_n , \\ A^{-1} A x &= A^{-1} \left( \sum_{n=1}^N x_n A_n \right) , \\ x &= A^{-1} \left( \sum_{n=1}^N x_n A_n \right) . \end{align*}
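The same step can be verified numerically (a small NumPy sketch; the matrix A and the weights x below are arbitrary choices):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 4.0]])   # arbitrary invertible example
x = np.array([1.0, -2.0, 0.5])    # arbitrary weights

# Linear combination of the columns of A with weights x_n (equals A @ x).
y = sum(x[n] * A[:, n] for n in range(A.shape[0]))

# Applying A^{-1} recovers the original weight vector x.
assert np.allclose(np.linalg.inv(A) @ y, x)
```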

If A^{-1} exists, then any y \in \R^N can be expressed as a linear combination of the columns of A, y = x_1 A_1 + \dots + x_N A_N = A x.

The weights x = \{ x_n \}_{n=1}^N can be obtained as x = A^{-1} y.

This is because

\begin{align*} y &= A x , \\ A^{-1} y &= A^{-1} A x , \\ A^{-1} y &= x . \end{align*}
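In code, the weight recovery looks like this (again a NumPy sketch with an arbitrary invertible A and an arbitrary y):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 4.0]])   # arbitrary invertible example
y = np.array([3.0, 1.0, -2.0])    # arbitrary target vector

# Weights of y with respect to the columns of A: x = A^{-1} y.
x = np.linalg.inv(A) @ y

# Reconstructing y from the weighted columns confirms y = A x.
assert np.allclose(A @ x, y)

# Same result without forming the inverse (numerically preferable).
assert np.allclose(np.linalg.solve(A, y), x)
```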
