
The properties of matrices

For what follows, it is important to establish the following definitions:

  1. Operations on matrices
    1. Matrix addition
    2. Let \((A,B) \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K})^2\) be two matrices of the same size.

      $$ \forall (i, j) \in [\![1, n]\!] \times [\![1, p]\!],$$
      $$(A + B)_{i,j} = a_{i,j} + b_{i,j} $$

      In other words, we add each element of the left matrix to the element at the same position in the right one:

      $$ A + B = \begin{pmatrix} a_{1,1} & a_{1,2} & a_{1,3} & \dots & a_{1, p} \\ a_{2,1} & a_{2,2} & a_{2,3} & \dots & a_{2, p} \\ \hspace{0.5em} \vdots & \hspace{0.5em} \vdots & \hspace{0.5em} \vdots & \ddots & \hspace{0.5em} \vdots \\ a_{n,1} & a_{n,2} & a_{n,3} & \dots & a_{n, p} \end{pmatrix} + \begin{pmatrix} b_{1,1} & b_{1,2} & b_{1,3} & \dots & b_{1, p} \\ b_{2,1} & b_{2,2} & b_{2,3} & \dots & b_{2, p} \\ \hspace{0.5em} \vdots & \hspace{0.5em} \vdots & \hspace{0.5em} \vdots & \ddots & \hspace{0.5em} \vdots \\ b_{n,1} & b_{n,2} & b_{n,3} & \dots & b_{n, p} \end{pmatrix} $$
      $$ A + B = \begin{pmatrix} a_{1,1} + b_{1,1} & a_{1,2} + b_{1,2} & a_{1,3} + b_{1,3} & \dots & a_{1, p} + b_{1, p} \\ a_{2,1} + b_{2,1} & a_{2,2} + b_{2,2} & a_{2,3} + b_{2,3} & \dots & a_{2, p} + b_{2,p} \\ \hspace{2em} \vdots & \hspace{2em} \vdots & \hspace{2em} \vdots & \ddots & \hspace{2em} \vdots \\ a_{n,1} + b_{n,1} & a_{n,2} + b_{n,2} & a_{n,3} + b_{n,3} & \dots & a_{n, p} + b_{n,p} \end{pmatrix} $$
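As a quick illustration, the entrywise definition above can be sketched in Python with nested lists standing for matrices (the helper name `mat_add` is ours):

```python
# Entrywise addition of two matrices of the same size,
# following (A + B)_{i,j} = a_{i,j} + b_{i,j}.
def mat_add(A, B):
    n, p = len(A), len(A[0])
    assert n == len(B) and p == len(B[0]), "matrices must have the same size"
    return [[A[i][j] + B[i][j] for j in range(p)] for i in range(n)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
S = mat_add(A, B)  # [[6, 8], [10, 12]]
```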
    3. Matrix product
    4. Let \(A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K})\) and \(B \in \hspace{0.03em} \mathcal{M}_{p,q} (\mathbb{K})\) be two matrices.

      To multiply two matrices, the left matrix must have as many columns as the right one has rows (here \(p\)). As a result, we obtain a matrix \(AB \in \hspace{0.03em} \mathcal{M}_{n,q} (\mathbb{K})\), with \(n\) rows and \(q\) columns.

      $$ \forall (i, j) \in [\![1, n]\!] \times [\![1, q]\!],$$
      $$(A \times B)_{i,j} = \sum_{k = 1}^p a_{i,k} \times b_{k,j} $$

      For example:

      $$ A \times B = \begin{pmatrix} a_{1,1} & a_{1,2} & a_{1,3} & \dots & a_{1, p} \\ a_{2,1} & a_{2,2} & a_{2,3} & \dots & a_{2, p} \\ \hspace{0.5em} \vdots & \hspace{0.5em} \vdots & \hspace{0.5em} \vdots & \ddots & \hspace{0.5em} \vdots \\ a_{n,1} & a_{n,2} & a_{n,3} & \dots & a_{n, p} \end{pmatrix} \times \begin{pmatrix} b_{1,1} & b_{1,2} & b_{1,3} & \dots & b_{1, q} \\ b_{2,1} & b_{2,2} & b_{2,3} & \dots & b_{2, q} \\ \hspace{0.5em} \vdots & \hspace{0.5em} \vdots & \hspace{0.5em} \vdots & \ddots & \hspace{0.5em} \vdots \\ b_{p,1} & b_{p,2} & b_{p,3} & \dots & b_{p, q} \end{pmatrix} $$
      $$ A \times B = \begin{pmatrix} \Bigl[a_{1,1} b_{1,1} + a_{1,2} b_{2,1} \ + \ ... \ + \ a_{1,p} b_{p,1} \Bigr] & \Bigl[a_{1,1} b_{1,2} + a_{1,2} b_{2,2} \ + \ ... \ + \ a_{1,p} b_{p,2}\Bigr] & \hspace{1em} \dots \dots \dots \hspace{1em} & \Bigl[a_{1,1} b_{1,q} + a_{1,2} b_{2,q} \ + \ ... \ + \ a_{1,p} b_{p,q}\Bigr] \\ \Bigl[a_{2,1} b_{1,1} + a_{2,2} b_{2,1} \ + \ ... \ + \ a_{2,p} b_{p,1}\Bigr] & \Bigl[a_{2,1} b_{1,2} + a_{2,2} b_{2,2} \ + \ ... \ + \ a_{2,p} b_{p,2}\Bigr] & \hspace{1em} \dots \dots \dots \hspace{1em} & \Bigl[a_{2,1} b_{1,q} + a_{2,2} b_{2,q} \ + \ ... \ + \ a_{2,p} b_{p,q}\Bigr] \\ \hspace{8em} \vdots & \hspace{8em} \vdots & \hspace{1em} \ddots & \hspace{8em} \vdots \\ \hspace{8em} \vdots & \hspace{8em} \vdots & \hspace{1em} \ddots & \hspace{8em} \vdots \\ \Bigl[a_{n,1} b_{1,1} + a_{n,2} b_{2,1} \ + \ ... \ + \ a_{n,p} b_{p,1}\Bigr] & \Bigl[a_{n,1} b_{1,2} + a_{n,2} b_{2,2} \ + \ ... \ + \ a_{n,p} b_{p,2}\Bigr] & \hspace{1em} \dots \dots \dots \hspace{1em} & \Bigl[a_{n,1} b_{1,q} + a_{n,2} b_{2,q} \ + \ ... \ + \ a_{n,p} b_{p,q}\Bigr] \end{pmatrix} $$

      Be careful: in general, the matrix product is not commutative: \( A \times B \neq B \times A \).
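The sum-over-\(k\) definition, and the failure of commutativity, can be checked with a small Python sketch (`mat_mul` is an illustrative name of ours):

```python
# (A x B)_{i,j} = sum over k of a_{i,k} * b_{k,j};
# the number of columns of A must match the number of rows of B.
def mat_mul(A, B):
    n, p, q = len(A), len(B), len(B[0])
    assert len(A[0]) == p, "columns of A must equal rows of B"
    return [[sum(A[i][k] * B[k][j] for k in range(p)) for j in range(q)]
            for i in range(n)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
AB = mat_mul(A, B)  # [[2, 1], [4, 3]]
BA = mat_mul(B, A)  # [[3, 4], [1, 2]] -- different: AB != BA
```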

    5. Multiplication of a matrix by a scalar \(\lambda\)
    6. Let \(A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K})\) be a matrix.

      When a matrix is multiplied by a scalar, every one of its elements is multiplied by that scalar.

      $$ \forall (i, j) \in [\![1, n]\!] \times [\![1, p]\!],$$
      $$(\lambda A)_{i,j} = \lambda \ a_{i,j} $$

      For example:

      $$ A = \begin{pmatrix} a_{1,1} & a_{1,2} & a_{1,3} & \dots & a_{1, p} \\ a_{2,1} & a_{2,2} & a_{2,3} & \dots & a_{2, p} \\ \hspace{0.5em} \vdots & \hspace{0.5em} \vdots & \hspace{0.5em} \vdots & \ddots & \hspace{0.5em} \vdots \\ a_{n,1} & a_{n,2} & a_{n,3} & \dots & a_{n, p} \end{pmatrix} $$
      $$ \lambda A = \begin{pmatrix} \lambda \ a_{1,1} & \lambda \ a_{1,2} & \lambda \ a_{1,3} & \dots & \lambda \ a_{1, p} \\ \lambda \ a_{2,1} & \lambda \ a_{2,2} & \lambda \ a_{2,3} & \dots & \lambda \ a_{2, p} \\ \hspace{0.5em} \vdots & \hspace{0.5em} \vdots & \hspace{0.5em} \vdots & \ddots & \hspace{0.5em} \vdots \\ \lambda \ a_{n,1} & \lambda \ a_{n,2} & \lambda \ a_{n,3} & \dots & \lambda \ a_{n, p} \end{pmatrix} $$
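In Python this is a one-line comprehension over nested lists (a sketch; the name `scalar_mul` is ours):

```python
# (lambda * A)_{i,j} = lambda * a_{i,j}: every element is scaled.
def scalar_mul(lam, A):
    return [[lam * a for a in row] for row in A]

A = [[1, 2], [3, 4]]
twoA = scalar_mul(2, A)  # [[2, 4], [6, 8]]
```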
    7. Linear combination of matrices
    8. Let \((A,B) \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K})^2\) be two matrices of the same size and \((\lambda, \mu) \in \hspace{0.04em} \mathbb{R}^2\).

      With the previous properties of addition and multiplication by a scalar, we can form linear combinations:

      $$(\lambda A + \mu B)_{i,j} = \lambda \ a_{i,j} + \mu \ b_{i,j} $$
    9. Matrix transposition
    10. Let \(A \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})\) be a square matrix of size \(n\).

      Matrix transposition consists in swapping the row and column indices of each element. We write \(A^T\) (sometimes \(^t A\)) for the transpose of the matrix \(A\).

      $$ \forall (i, j) \in [\![1, n]\!]^2,$$
      $$ \left(A^T \right)_{i,j} \hspace{0.03em} = a_{j,i} $$

      For example:

      $$ A = \begin{pmatrix} a_{1,1} & \textcolor{#8A5757}{a_{1,2}} & \textcolor{#8A5757}{a_{1,3}} & \textcolor{#8A5757}{\dots} & \textcolor{#8A5757}{a_{1, n}} \\ \textcolor{#58814B}{a_{2,1}} & a_{2,2} & \textcolor{#8A5757}{a_{2,3}} & \textcolor{#8A5757}{\dots} & \textcolor{#8A5757}{a_{2, n}} \\ \textcolor{#58814B}{a_{3,1}} & \textcolor{#58814B}{a_{3,2}} & a_{3,3} & \textcolor{#8A5757}{\dots} & \textcolor{#8A5757}{a_{3, n}} \\ \hspace{1em} \textcolor{#58814B}{\vdots} & \hspace{1em} \textcolor{#58814B}{\vdots} & \hspace{1em} \textcolor{#58814B}{\vdots} & \ddots & \hspace{1em} \textcolor{#8A5757}{\vdots} \\ \textcolor{#58814B}{a_{n,1}} & \textcolor{#58814B}{a_{n,2}} & \textcolor{#58814B}{a_{n,3}} & \textcolor{#58814B}{\dots} & a_{n, n} \\ \end{pmatrix} $$

      So, its transpose is:

      $$ A^T = \begin{pmatrix} a_{1,1} & \textcolor{#58814B}{a_{2,1}} & \textcolor{#58814B}{a_{3,1}} & \textcolor{#58814B}{\dots} & \textcolor{#58814B}{a_{n, 1}} \\ \textcolor{#8A5757}{a_{1,2}} & a_{2,2} & \textcolor{#58814B}{a_{3,2}} & \textcolor{#58814B}{\dots} & \textcolor{#58814B}{a_{n, 2}} \\ \textcolor{#8A5757}{a_{1,3}} & \textcolor{#8A5757}{a_{2,3}} & a_{3,3} & \textcolor{#58814B}{\dots} & \textcolor{#58814B}{a_{n, 3}} \\ \hspace{0.8em} \textcolor{#8A5757}{\vdots} & \hspace{0.8em} \textcolor{#8A5757}{\vdots} & \hspace{0.8em} \textcolor{#8A5757}{\vdots} & \ddots & \hspace{0.8em} \textcolor{#58814B}{\vdots} \\ \textcolor{#8A5757}{a_{1,n}} & \textcolor{#8A5757}{a_{2,n}} & \textcolor{#8A5757}{a_{3,n}} & \textcolor{#8A5757}{\dots} & a_{n, n} \\ \end{pmatrix} $$

      Only the diagonal remains intact, because when \(i = j\), then \(a_{i,j} = a_{j,i}\).
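Swapping the row and column indices can be sketched as follows (nested lists; the helper name `transpose` is ours):

```python
# (A^T)_{i,j} = a_{j,i}: rows become columns and vice versa.
def transpose(A):
    n, p = len(A), len(A[0])
    return [[A[j][i] for j in range(n)] for i in range(p)]

A = [[1, 2], [3, 4]]
T = transpose(A)  # [[1, 3], [2, 4]]: the diagonal 1, 4 is unchanged
```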

    11. Inversion of a matrix
    12. Let \(A \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})\) be a square matrix of size \(n\).

      The inverse of the matrix \(A\) is the matrix written \(A^{-1}\), of the same size, such that: \(A A^{-1} = A^{-1} A = I_n\).


      To check whether a matrix can be inverted, we compute its determinant:

      $$ A \text{ is invertible } \Longleftrightarrow det(A) \neq 0 $$
    13. Comatrix
    14. Let \(A \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})\) be a square matrix of size \(n\).

      The comatrix of the matrix \(A\) is the matrix noted \(com(A)\), such that:

      $$ \forall (i, j) \in [\![1, n]\!]^2,$$
      $$ com(A)_{i,j} \hspace{0.03em} = C_{i,j} $$
      $$ \text{where } \ \left \{ \begin{gather*} C_{i,j} : \text{cofactor of the element } a_{i, j} \\ M_{i, j} : \text{minor of the element } a_{i, j} \end{gather*} \right \} $$

      \(C_{i, j}\) : cofactor of the element \(a_{i, j}\)

      $$ C_{i,j} = (-1)^{i + j} \times det(M_{i, j}) $$

      \(M_{i, j}\) : minor of the element \(a_{i, j}\)

      The minor of \(a_{i, j}\) is the submatrix of \(A\) obtained by removing row \(i\) and column \(j\).

      For example, starting from the following matrix \(A\), the minor \(\textcolor{#6F79AB}{M_{1,1}}\) appears in blue:

      $$ A = \begin{pmatrix} a_{1,1} & a_{1,2} & a_{1,3} & \dots & a_{1, n} \\ a_{2,1} & \textcolor{#6F79AB}{a_{2,2}} & \textcolor{#6F79AB}{a_{2,3}} & \dots & \textcolor{#6F79AB}{a_{2, n}} \\ \hspace{0.5em} \vdots & \hspace{0.5em} \textcolor{#6F79AB}{\vdots} & \hspace{0.5em} \textcolor{#6F79AB}{\vdots} & \textcolor{#6F79AB}{\ddots} & \hspace{0.5em} \textcolor{#6F79AB}{\vdots} \\ a_{n,1} & \textcolor{#6F79AB}{a_{n,2}} & \textcolor{#6F79AB}{a_{n,3}} & \dots & \textcolor{#6F79AB}{a_{n, n}} \end{pmatrix} $$

      So,

      $$ \textcolor{#6F79AB}{ M_{1,1} = \begin{pmatrix} a_{2,2} & a_{2,3} & \dots & a_{2, n} \\ \hspace{0.5em} \vdots & \hspace{0.5em} \vdots & \ddots & \hspace{0.5em} \vdots \\ a_{n,2} & a_{n,3} & \dots & a_{n, n} \end{pmatrix} } $$

      For example, starting from the following matrix \(A\):

      $$ A = \begin{pmatrix} a_{1,1} & a_{1,2} & a_{1,3} \\ a_{2,1} & a_{2,2} & a_{2,3} \\ a_{3,1} & a_{3,2} & a_{3,3} \end{pmatrix} $$

      Its comatrix is:

      $$ com(A) = \begin{pmatrix} \textcolor{#58814B}{+}\begin{vmatrix} a_{2,2} & a_{2,3} \\ a_{3,2} & a_{3,3} \end{vmatrix} & \textcolor{#8A5757}{-}\begin{vmatrix} a_{2,1} & a_{2,3} \\ a_{3,1} & a_{3,3} \end{vmatrix} & \textcolor{#58814B}{+}\begin{vmatrix} a_{2,1} & a_{2,2} \\ a_{3,1} & a_{3,2} \end{vmatrix} \\ \textcolor{#8A5757}{-}\begin{vmatrix} a_{1,2} & a_{1,3} \\ a_{3,2} & a_{3,3} \end{vmatrix} & \textcolor{#58814B}{+}\begin{vmatrix} a_{1,1} & a_{1,3} \\ a_{3,1} & a_{3,3} \end{vmatrix} & \textcolor{#8A5757}{-}\begin{vmatrix} a_{1,1} & a_{1,2} \\ a_{3,1} & a_{3,2} \end{vmatrix} \\ \textcolor{#58814B}{+}\begin{vmatrix} a_{1,2} & a_{1,3} \\ a_{2,2} & a_{2,3} \end{vmatrix} & \textcolor{#8A5757}{-}\begin{vmatrix} a_{1,1} & a_{1,3} \\ a_{2,1} & a_{2,3} \end{vmatrix} & \textcolor{#58814B}{+}\begin{vmatrix} a_{1,1} & a_{1,2} \\ a_{2,1} & a_{2,2} \end{vmatrix} \end{pmatrix} $$
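For a \(3 \times 3\) matrix, the minors, cofactors and comatrix above can be sketched numerically (helper names are ours; `det2` is the \(2 \times 2\) determinant \(ad - bc\)):

```python
# Minor M_{i,j}: the submatrix of A without row i and column j.
def minor(A, i, j):
    return [[A[r][c] for c in range(len(A)) if c != j]
            for r in range(len(A)) if r != i]

# 2x2 determinant, enough for the minors of a 3x3 matrix.
def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

# Cofactor C_{i,j} = (-1)^(i+j) * det(M_{i,j}); com(A)_{i,j} = C_{i,j}.
def comatrix3(A):
    return [[(-1) ** (i + j) * det2(minor(A, i, j)) for j in range(3)]
            for i in range(3)]

A = [[1, 2, 3], [0, 1, 4], [5, 6, 0]]
C = comatrix3(A)  # [[-24, 20, -5], [18, -15, 4], [5, -4, 1]]
```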
    15. Matrix form of a system of linear equations
    16. A system of linear equations \( (S)\), where the unknowns are the variables \(x_1, x_2, \ ..., x_p\), can be written as a matrix product:

      $$ (S) \enspace \left \{ \begin{gather*} a_{1,1} x_1 + a_{1,2} x_2 + a_{1,3} x_3 + \hspace{0.1em}... \hspace{0.1em}+ a_{1,p} x_p = b_1 \\ a_{2,1} x_1 + a_{2,2} x_2 + a_{2,3} x_3 + \hspace{0.1em}... \hspace{0.1em}+ a_{2,p} x_p = b_2 \\ \vdots \\ a_{n,1} x_1 + a_{n,2} x_2 + a_{n,3} x_3 + \hspace{0.1em}... \hspace{0.1em}+ a_{n,p} x_p = b_n \\ \end{gather*} \right \} $$
      $$ \Longleftrightarrow$$
      $$ \underbrace{ \begin{pmatrix} a_{1,1} & a_{1,2} & a_{1,3} & \dots & a_{1, p} \\ a_{2,1} & a_{2,2} & a_{2,3} & \dots & a_{2, p} \\ \hspace{0.8em} \vdots & \hspace{0.8em} \vdots & \hspace{0.8em} \vdots & \ddots & \hspace{0.8em} \vdots \\ a_{n,1} & a_{n,2} & a_{n,3} & \dots & a_{n, p} \\ \end{pmatrix} } _\text{A} \times \underbrace{ \begin{pmatrix} x_1 \\ x_2 \\ \hspace{0.3em}\vdots \\ x_p \end{pmatrix} } _\text{X} = \underbrace{ \begin{pmatrix} b_1 \\ b_2 \\ \hspace{0.3em}\vdots \\ b_n \end{pmatrix} } _\text{B} \ \Longleftrightarrow \ AX = B, \ \text{with} \enspace \left \{ \begin{gather*} A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K}) \\ X \in \hspace{0.03em} \mathcal{M}_{p,1} (\mathbb{K}) \\ B \in \hspace{0.03em} \mathcal{M}_{n,1} (\mathbb{K}) \end{gather*} \right \} $$
    17. Trace of a matrix
    18. Let \(A \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})\) be a square matrix of size \(n\).

      The trace of a matrix is the sum of all its diagonal elements:

      $$ A = \begin{pmatrix} \textcolor{#6F79AB}{a_{1,1}} & a_{1,2} & a_{1,3} & \dots & a_{1,n} \\ a_{2,1} & \textcolor{#6F79AB}{a_{2,2}} & a_{2,3} & \dots & a_{2,n} \\ a_{3,1} & a_{3,2} & \textcolor{#6F79AB}{a_{3,3}} & \dots & a_{3,n} \\ \hspace{0.1em}\vdots & \hspace{0.1em} \vdots & \hspace{0.1em} \vdots & \textcolor{#6F79AB}{\ddots} & \hspace{0.1em} \vdots \\ a_{n,1} & a_{n,2} & a_{n,3} & \dots & \textcolor{#6F79AB}{a_{n,n}} \end{pmatrix} $$
      $$Tr(A) = \sum_{k = 1}^n a_{k,k} = a_{1,1} + a_{2,2} \ + \ ... \ + a_{n,n}$$
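As a one-line sketch (`trace` is our helper name):

```python
# Tr(A) = sum of the diagonal elements a_{k,k}.
def trace(A):
    return sum(A[k][k] for k in range(len(A)))

A = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
t = trace(A)  # 1 + 5 + 9 = 15
```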
  2. Specific matrices
    1. Diagonal matrix
    2. A diagonal matrix is a square matrix in which all the elements are \(0\) except on the main diagonal:

      $$ D_n = \begin{pmatrix} \textcolor{#6F79AB}{d_{1,1}} & 0 & 0 & \dots & 0 \\ 0 & \textcolor{#6F79AB}{d_{2,2}} & 0 & \dots & 0 \\ 0 & 0 & \textcolor{#6F79AB}{d_{3,3}} & \dots & 0 \\ \hspace{0.1em}\vdots & \hspace{0.1em} \vdots & \hspace{0.1em} \vdots & \textcolor{#6F79AB}{\ddots} & \hspace{0.1em} \vdots \\ 0 & 0 & 0 & \dots & \textcolor{#6F79AB}{d_{n,n}} \end{pmatrix} $$
      $$ \forall (i, j) \in [\![1, n]\!]^2, \ (i \neq j) \Longrightarrow d_{i,j} = 0$$

      We also write the diagonal matrix \(D_n\) using only its diagonal elements: \(D_n = diag(\lambda_1, \lambda_2, \ ..., \lambda_n)\).

    3. Identity matrices
    4. The identity matrices \(I_n\) are defined as follows:

      $$ I_n = \begin{pmatrix} \textcolor{#6F79AB}{1} & 0 & 0 & \dots & 0 \\ 0 & \textcolor{#6F79AB}{1} & 0 & \dots & 0 \\ 0 & 0 & \textcolor{#6F79AB}{1} & \dots & 0 \\ \vdots & \vdots & \vdots & \textcolor{#6F79AB}{\ddots} & \vdots \\ 0 & 0 & 0 & \dots & \textcolor{#6F79AB}{1} \\ \end{pmatrix} $$

      It is the matrix of size \(n\) with the value \(1\) on its main diagonal and \(0\) everywhere else. It is a special case of a diagonal matrix. For example,

      $$ I_3 = \begin{pmatrix} \textcolor{#6F79AB}{1} & 0 & 0 \\ 0 & \textcolor{#6F79AB}{1} & 0 \\ 0 & 0 & \textcolor{#6F79AB}{1} \end{pmatrix} $$
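A sketch of \(I_n\) as a special diagonal matrix (the helper name `identity` is ours):

```python
# I_n: 1 on the main diagonal, 0 everywhere else.
def identity(n):
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

I3 = identity(3)  # [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```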

Matrix product
Associativity
$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K}) , \ \forall B \in \hspace{0.03em} \mathcal{M}_{p,q} (\mathbb{K}), \ \forall C \in \hspace{0.03em} \mathcal{M}_{q,r} (\mathbb{K}), $$
$$ (A \times B) \times C = A \times (B \times C) $$
Distributivity
$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K}) , \ \forall (B, C) \in \hspace{0.03em} \mathcal{M}_{p,q} (\mathbb{K})^2, $$
$$ A \times (B + C) = A \times B + A \times C $$
$$ \forall (A,B) \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K})^2 , \ \forall C \in \hspace{0.03em} \mathcal{M}_{p,q} (\mathbb{K}), $$
$$ (A + B) \times C = A \times C + B \times C $$
Bilinearity
$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K}) , \ \forall B \in \hspace{0.03em} \mathcal{M}_{p,q} (\mathbb{K}), $$
$$ (\lambda A) \times B = A \times (\lambda B) = \lambda (A \times B) $$
Multiplication by the identity
$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K}),$$
$$ I_n \times A = A \times I_p = A $$
Products of diagonal matrices
  1. Product of two diagonal matrices
  2. $$ \forall \Bigl[ D_1 = diag(\lambda_1, \lambda_2, \ ..., \lambda_n), \ D_2 = diag(\mu_1, \mu_2, \ ..., \mu_n) \Bigr] \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})^2, $$
    $$ D_1 \times D_2 = D_2 \times D_1 = diag \left(\lambda_1 \mu_1, \lambda_2 \mu_2, \ ..., \lambda_n \mu_n \right) $$
  3. A diagonal matrix raised to the power of \(m\)
  4. $$ \forall \Bigl[ D = diag(\lambda_1, \lambda_2, \ ..., \lambda_n) \Bigr] \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K}), $$
    $$ D^m = diag \left(\lambda_1^m, \lambda_2^m, \ ..., \lambda_n^m \right) $$
Comatrix, transpose and determinant
$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K}), $$
$$ A \times com(A)^T = com(A)^T \times A = det(A) \times I_n \qquad (5) $$
Matrix transposition
Linearity of transposition
$$ \forall (\lambda, \mu) \in \hspace{0.04em} \mathbb{R}^2, \ \forall (A,B) \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K})^2, $$
$$ (\lambda A + \mu B)^T = \lambda A^T + \mu B^T $$
Transpose of a product
$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K}) , \ \forall B \in \hspace{0.03em} \mathcal{M}_{p,q} (\mathbb{K}), $$
$$ (A \times B)^T = B^T \times A^T \qquad (6) $$
Inversion of a matrix
Computation of the inverse
$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K}), \ det(A) \neq 0, $$
$$ A^{-1} = \frac{1}{det(A)} \times com(A)^T $$
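The formula \(A^{-1} = \frac{1}{det(A)} \times com(A)^T\) can be checked numerically on a \(3 \times 3\) example (a sketch with our own helper names, using `fractions.Fraction` for exact arithmetic):

```python
from fractions import Fraction

def minor(A, i, j):
    return [[A[r][c] for c in range(3) if c != j] for r in range(3) if r != i]

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

# Determinant of a 3x3 matrix by Laplace expansion along the first row.
def det3(A):
    return sum((-1) ** j * A[0][j] * det2(minor(A, 0, j)) for j in range(3))

def cofactor(A, i, j):
    return (-1) ** (i + j) * det2(minor(A, i, j))

# A^{-1} = com(A)^T / det(A): entry (i, j) is C_{j,i} / det(A).
def inv3(A):
    d = det3(A)
    assert d != 0, "A is invertible only if det(A) != 0"
    return [[Fraction(cofactor(A, j, i), d) for j in range(3)] for i in range(3)]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

A = [[1, 2, 3], [0, 1, 4], [5, 6, 0]]  # det(A) = 1
Ainv = inv3(A)
# mat_mul(A, Ainv) gives back the identity I_3.
```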
Inverse of the inverse
$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K}),$$
$$ A \text{ is invertible } \Longrightarrow A^{-1} \text{ is invertible and } (A^{-1})^{-1} = A $$
Inverse of a transposed matrix
$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K}),$$
$$ A \text{ is invertible } \Longrightarrow A^{T} \text{ is invertible and } \left(A^T \right)^{-1} = (A^{-1})^T$$
Inverse of a product
$$ \forall (A ,B) \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})^2,$$
$$ A \text{ and } B \text{ are invertible } \Longrightarrow (A \times B) \text{ is invertible and } \left(A \times B\right)^{-1} = B^{-1} \times A^{-1} \qquad (9) $$

Both expressions \((6)\) and \((9)\) have the same behaviour:

$$ \forall (A ,B) \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})^2, \enspace \Biggl \{ \begin{gather*} (A \times B)^T = B^T \times A^T \hspace{1em}\qquad (6) \\ \left(A \times B\right)^{-1} = B^{-1} \times A^{-1} \qquad (9) \end{gather*} $$

So, the order of transposition and inversion does not matter:

$$ \forall (A ,B) \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})^2,$$
$$ \left((A \times B)^T \right)^{-1} = \hspace{0.03em} \left((A \times B)^{-1} \right)^T = \hspace{0.03em} \left(A^T\right)^{-1} \times \hspace{0.04em} \left(B^T\right)^{-1} = \hspace{0.03em} \left(A^{-1}\right)^T \times \hspace{0.04em} \left(B^{-1}\right)^T $$
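This interchangeability can be verified on a \(2 \times 2\) example with exact arithmetic (a sketch; the helper names are ours):

```python
from fractions import Fraction

def mul2(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def t2(M):  # transpose of a 2x2 matrix
    return [[M[0][0], M[1][0]], [M[0][1], M[1][1]]]

def inv2(M):  # inverse of a 2x2 matrix via the cofactor formula
    d = Fraction(M[0][0] * M[1][1] - M[0][1] * M[1][0])
    return [[M[1][1] / d, -M[0][1] / d], [-M[1][0] / d, M[0][0] / d]]

A = [[1, 2], [3, 4]]
B = [[2, 0], [1, 1]]
AB = mul2(A, B)

# The order of transposition and inversion does not matter:
lhs = inv2(t2(AB))
rhs = t2(inv2(AB))
```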
Trace of a matrix
Linearity of the trace
$$ \forall (\lambda, \mu) \in \hspace{0.04em} \mathbb{R}^2, \ \forall (A,B) \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})^2,$$
$$ Tr(\lambda A + \mu B) = \lambda \ Tr(A) + \mu \ Tr(B) $$
Trace of a product
$$ \forall (A,B) \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})^2,$$
$$ Tr(A \times B) = Tr(B \times A)$$
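A quick numerical check of \(Tr(A \times B) = Tr(B \times A)\) (a sketch; the helper names are ours):

```python
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(A):
    return sum(A[k][k] for k in range(len(A)))

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
tAB = trace(mat_mul(A, B))  # 19 + 50 = 69
tBA = trace(mat_mul(B, A))  # 23 + 46 = 69
```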
Recap table of the properties of matrices

Demonstrations

Matrix product

Associativity

Let \(A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K})\), \(B \in \hspace{0.03em} \mathcal{M}_{p,q} (\mathbb{K})\) and \(C \in \hspace{0.03em} \mathcal{M}_{q,r} (\mathbb{K})\) be three matrices.

  1. Computation of \((A \times B) \times C\)
  2. By definition, we have:

    $$ \forall (i, j) \in [\![1, n]\!] \times [\![1, r]\!],$$
    $$\Bigl( (A \times B) \times C \Bigr)_{i,j} = \ \sum_{k = 1}^q (ab)_{i,k} \times c_{k,j} $$

    But the factor \((ab)_{i,k}\) equals:

    $$ \forall (i, k) \in [\![1, n]\!] \times [\![1, q]\!],$$
    $$ (ab)_{i,k} = \sum_{l = 1}^p a_{i,l} \times b_{l,k} $$

    So, we replace it in the main expression and:

    $$\Bigl( (A \times B) \times C \Bigr)_{i,j} = \ \sum_{k = 1}^q \left[ \sum_{l = 1}^p a_{i,l} \times b_{l,k} \right] \times c_{k,j} $$

    Since the factor \(c_{k,j}\) does not depend on \(l\), it can be treated as a constant and moved inside the inner sum.

    $$\Bigl( (A \times B) \times C \Bigr)_{i,j} = \ \sum_{k = 1}^q \sum_{l = 1}^p \Bigl[ a_{i,l} \times b_{l,k} \times c_{k,j} \Bigr] \qquad (1) $$
  3. Computation of \( A \times (B \times C)\)
  4. Let us now calculate the product \(A \times (B \times C)\).

    $$ \forall (i, j) \in [\![1, n]\!] \times [\![1, r]\!],$$
    $$\Bigl( A \times (B \times C) \Bigr)_{i,j} = \ \sum_{k = 1}^p a_{i,k} \times (bc)_{k,j} $$

    In the same way, we replace \((bc)_{k,j}\) by its expression and:

    $$\Bigl( A \times (B \times C) \Bigr)_{i,j} = \ \sum_{k = 1}^p a_{i,k} \times \left[ \sum_{l = 1}^q b_{k,l} \times c_{l,j} \right] $$
    $$\Bigl( A \times (B \times C) \Bigr)_{i,j} = \ \sum_{k = 1}^p \sum_{l = 1}^q \Bigl[ a_{i,k} \times b_{k,l} \times c_{l,j} \Bigr] \qquad (2) $$

    In both expressions \((1)\) and \((2)\), the variables \(k\) and \(l\) are dummy summation variables:

    $$\Bigl( (A \times B) \times C \Bigr)_{i,j} = \ \sum_{k = 1}^q \sum_{l = 1}^p \Bigl[ a_{i,l} \times b_{l,k} \times c_{k,j} \Bigr] \qquad (1) $$
    $$\Bigl( A \times (B \times C) \Bigr)_{i,j} = \ \sum_{k = 1}^p \sum_{l = 1}^q \Bigl[ a_{i,k} \times b_{k,l} \times c_{l,j} \Bigr] \qquad (2) $$

    Therefore, they can be interchanged, so \((1)\) and \((2)\) are equal:

    $$\Bigl( (A \times B) \times C \Bigr)_{i,j} = \Bigl( A \times (B \times C) \Bigr)_{i,j} = \ \sum_{k = 1}^p \sum_{l = 1}^q \Bigl[ a_{i,k} \times b_{k,l} \times c_{l,j} \Bigr] $$

And finally,

$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K}) , \ \forall B \in \hspace{0.03em} \mathcal{M}_{p,q} (\mathbb{K}), \ \forall C \in \hspace{0.03em} \mathcal{M}_{q,r} (\mathbb{K}), $$
$$ (A \times B) \times C = A \times (B \times C) $$
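The associativity just proved can be confirmed numerically with rectangular matrices (a sketch; `mat_mul` is our helper name):

```python
def mat_mul(A, B):
    n, p, q = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(p)) for j in range(q)]
            for i in range(n)]

A = [[1, 2, 3], [4, 5, 6]]        # 2x3
B = [[1, 0], [0, 1], [1, 1]]      # 3x2
C = [[2, 1], [1, 2]]              # 2x2

left = mat_mul(mat_mul(A, B), C)   # (A x B) x C
right = mat_mul(A, mat_mul(B, C))  # A x (B x C)
```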

Distributivity

  1. Left distributivity
  2. Let \(A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K})\) be a matrix and \((B, C) \in \hspace{0.03em} \mathcal{M}_{p,q} (\mathbb{K})^2\) two matrices.

    By the definition of the matrix product, we have:

    $$ \forall (i, j) \in [\![1, n]\!] \times [\![1, q]\!],$$
    $$\Bigl( A \times (B + C) \Bigr)_{i,j} = \ \sum_{k = 1}^p \Bigl[ a_{i,k} \times (b + c)_{k,j}\Bigr] $$
    $$\Bigl( A \times (B + C) \Bigr)_{i,j} = \ \sum_{k = 1}^p \Bigl[ a_{i,k} \times (b_{k,j} + c_{k,j})\Bigr]$$
    $$\Bigl( A \times (B + C) \Bigr)_{i,j} = \ \sum_{k = 1}^p \Bigl[ a_{i,k} \times b_{k,j} + a_{i,k} \times c_{k,j} \Bigr]$$
    $$\Bigl( A \times (B + C) \Bigr)_{i,j} = \ \sum_{k = 1}^p \Bigl[ a_{i,k} \times b_{k,j}\Bigr] + \sum_{k = 1}^p \Bigl[a_{i,k} \times c_{k,j}\Bigr] $$

    And finally,

    $$ \forall A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K}) , \ \forall (B, C) \in \hspace{0.03em} \mathcal{M}_{p,q} (\mathbb{K})^2, $$
    $$ A \times (B + C) = A \times B + A \times C $$
  3. Right distributivity
  4. Let \((A, B) \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K})^2 \) be two matrices and \( C \in \hspace{0.03em} \mathcal{M}_{p,q} (\mathbb{K})\) another matrix.

    As well as before:

    $$ \forall (i, j) \in [\![1, n]\!] \times [\![1, q]\!],$$
    $$\Bigl( (A + B) \times C \Bigr)_{i,j} = \ \sum_{k = 1}^p \Bigl[(a + b)_{i,k} \times c_{k,j}\Bigr] $$
    $$\Bigl( (A + B) \times C \Bigr)_{i,j} = \ \sum_{k = 1}^p \Bigl[(a_{i,k} + b_{i,k}) \times c_{k,j}\Bigr] $$
    $$\Bigl( (A + B) \times C \Bigr)_{i,j} = \ \sum_{k = 1}^p \Bigl[a_{i,k} \times c_{k,j} + b_{i,k} \times c_{k,j}\Bigr] $$
    $$\Bigl( (A + B) \times C \Bigr)_{i,j} = \ \sum_{k = 1}^p \Bigl[a_{i,k} \times c_{k,j}\Bigr] + \sum_{k = 1}^p \Bigl[ b_{i,k} \times c_{k,j}\Bigr] $$

    And finally,

    $$ \forall (A,B) \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K})^2 , \ \forall C \in \hspace{0.03em} \mathcal{M}_{p,q} (\mathbb{K}), $$
    $$ (A + B) \times C = A \times C + B \times C $$

Bilinearity

Let \(A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K})\) and \(B \in \hspace{0.03em} \mathcal{M}_{p,q} (\mathbb{K})\) be two matrices and \(\lambda \in \mathbb{R}\) a real number.

By the definition of the matrix product, we have:

$$ \forall (i, j) \in [\![1, n]\!] \times [\![1, q]\!],$$
$$(\lambda A \times B)_{i,j} = \sum_{k = 1}^p \lambda a_{i,k} \times b_{k,j} = \lambda \sum_{k = 1}^p a_{i,k} \times b_{k,j} $$

In the same way:

$$( A \times \lambda B)_{i,j} = \sum_{k = 1}^p a_{i,k} \times \lambda b_{k,j} = \lambda \sum_{k = 1}^p a_{i,k} \times b_{k,j} $$

And as a result,

$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K}) , \ \forall B \in \hspace{0.03em} \mathcal{M}_{p,q} (\mathbb{K}), $$
$$ (\lambda A) \times B = A \times (\lambda B) = \lambda (A \times B) $$

Multiplication by the identity

Let \(A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K})\) be a matrix.

  1. Computation of \(I_n \times A\)
  2. By the definition of the matrix product, we have:

    $$ \forall (i, j) \in [\![1, n]\!] \times [\![1, p]\!],$$
    $$(I_n \times A)_{i,j} = \sum_{k = 1}^n (I_n)_{i,k} \times a_{k,j} $$

    But the factor \((I_n)_{i,k}\) equals:

    $$ (I_n)_{i,k} = \Biggl \{ \begin{gather*} 1, \ \text{if} \ (i = k) \\ 0 \ \text{otherwise} \end{gather*} $$

    So,

    $$(I_n \times A)_{i,j} = a_{i,j} = (A)_{i,j} $$

    We recover the starting matrix unchanged.

  3. Computation of \(A \times I_p\)
  4. Similarly, on the other side:

    $$ \forall (i, j) \in [\![1, n]\!] \times [\![1, p]\!],$$
    $$(A \times I_p)_{i,j} = \sum_{k = 1}^p a_{i,k} \times (I_p)_{k,j} $$

    In the same way, for all \((i,j)\), in this sum of products the term with \(k = j\) gives \(a_{i,j}\), while all other terms are \(0\), and therefore:

    $$(A \times I_p)_{i,j} = a_{i,j} = (A)_{i,j} $$

And as a result,

$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K}),$$
$$ I_n \times A = A \times I_p = A $$

Diagonal product matrix

  1. Product of two diagonal matrices
  2. Let \(\Bigl[ D_1 = diag(\lambda_1, \lambda_2, \ ..., \lambda_n), \ D_2 = diag(\mu_1, \mu_2, \ ..., \mu_n) \Bigr] \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})^2 \) be two diagonal matrices.

    By the definition of the matrix product, we have:

    $$ \forall (i, j) \in [\![1, n]\!]^2,$$
    $$(D_1 \times D_2)_{i,j} = \sum_{k = 1}^n (d_1)_{i,k} \times (d_2)_{k,j} $$

    Recalling the definition of a diagonal matrix, for each product inside these sums:

    $$ \Biggl \{ \begin{gather*} \forall (i, k) \in [\![1, n]\!]^2, \ (i \neq k) \Longrightarrow (d_1)_{i,k} = 0 \\ \forall (k, j) \in [\![1, n]\!]^2, \ (k \neq j) \Longrightarrow (d_2)_{k,j} = 0 \end{gather*} $$

    So, for any \(k\), the product \( \Bigl[ (d_1)_{i,k} \times (d_2)_{k,j} \Bigr] \neq 0 \) only if:

    $$ \Bigl[ (i = k) \land (k = j) \Bigr] \Longleftrightarrow (i = k = j) $$

    We then have:

    $$ \forall (i, j) \in [\![1, n]\!]^2, \ (D_1 \times D_2)_{i,j} = \Biggl \{ \begin{gather*} (d_1)_{i,j} \times (d_2)_{i,j}, \ \text{if} \ (i = j) \\ 0 \ \text{otherwise} \end{gather*} $$

    So,

    $$ \forall (i, j) \in [\![1, n]\!]^2, \ (D_1 \times D_2)_{i,j} = \Biggl \{ \begin{gather*} (d_1)_{k,k} \times (d_2)_{k,k} = \lambda_k \ \mu_k, \ \text{if} \ (i = j = k) \\ 0 \ \text{otherwise} \end{gather*} $$

    $$ (D_1 \times D_2) = \begin{pmatrix} \textcolor{#6F79AB}{\lambda_1 \ \mu_1} & 0 & 0 & \dots & 0 \\ 0 & \textcolor{#6F79AB}{\lambda_2 \ \mu_2} & 0 & \dots & 0 \\ 0 & 0 & \textcolor{#6F79AB}{\lambda_3 \ \mu_3} & \dots & 0 \\ \hspace{0.1em}\vdots & \hspace{0.1em} \vdots & \hspace{0.1em} \vdots & \textcolor{#6F79AB}{\ddots} & \hspace{0.1em} \vdots \\ 0 & 0 & 0 & \dots & \textcolor{#6F79AB}{\lambda_n \ \mu_n} \end{pmatrix} $$

    In the same way, if we perform the product in the other order:

    $$ \forall (i, j) \in [\![1, n]\!]^2,$$
    $$(D_2 \times D_1)_{i,j} = \sum_{k = 1}^n (d_2)_{i,k} \times (d_1)_{k,j} $$

    This same reasoning leads us to the same result, that is to say that:

    $$ \forall (i, j) \in [\![1, n]\!]^2, \ (D_2 \times D_1)_{i,j} = \Biggl \{ \begin{gather*} (d_2)_{k,k} \times (d_1)_{k,k} = \mu_k \ \lambda_k, \ \text{if} \ (i = j = k) \\ 0 \ \text{otherwise} \end{gather*} $$

    Since multiplication in the field \(\mathbb{K}\) is commutative, the two products \((D_1 \times D_2)\) and \((D_2 \times D_1)\) are equal.

    And finally,

    $$ \forall \Bigl[ D_1 = diag(\lambda_1, \lambda_2, \ ..., \lambda_n), \ D_2 = diag(\mu_1, \mu_2, \ ..., \mu_n) \Bigr] \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})^2, $$
    $$ D_1 \times D_2 = D_2 \times D_1 = diag \left(\lambda_1 \mu_1, \lambda_2 \mu_2, \ ..., \lambda_n \mu_n \right) $$
  3. A diagonal matrix raised to the power of \(m\)
  4. Let \(\Bigl[ D = diag(\lambda_1, \lambda_2, \ ..., \lambda_n) \Bigr] \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K}) \) be a diagonal matrix.

    Following the same reasoning as above, a direct induction yields:

    $$ \forall \Bigl[ D = diag(\lambda_1, \lambda_2, \ ..., \lambda_n) \Bigr] \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K}), $$
    $$ D^m = diag \left(\lambda_1^m, \lambda_2^m, \ ..., \lambda_n^m \right) $$

Comatrix, transpose and determinant

Let \(A \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})\) be a square matrix of size \(n\).

  1. Computation of \( \bigl[A \times com(A)^T \bigr]\)
  2. If we perform the product \(A \times com(A)^T\), we obtain:

    $$ \forall (i, j) \in [\![1, n]\!]^2,$$
    $$ \left(A \times com(A)^T \right)_{i,j} = \sum_{k = 1}^n a_{i,k} \times C_{j,k} $$

    By the Laplace expansion of the determinant, this sum equals \(det(A)\) when \(i = j\), and \(0\) when \(i \neq j\) (it is then the expansion of a determinant with two identical rows). As a result:

    $$ A \times com(A)^T = det(A) \times I_n $$
  3. Computation of \( \bigl[com(A)^T \times A\bigr]\)
  4. If we now swap the two factors of the previous product:

    $$ \left(com(A)^T \times A \right)_{i,j} = \sum_{k = 1}^n C_{k,i} \times a_{k,j} $$

    Then, in the same way as before, each diagonal entry equals \(det(A)\), and the product matrix is zero everywhere else:

    $$ com(A)^T \times A = \begin{pmatrix} \textcolor{#6F79AB}{det(A)} & 0 & 0 & \dots & 0 \\ 0 & \textcolor{#6F79AB}{det(A)} & 0 & \dots & 0 \\ 0 & 0 & \textcolor{#6F79AB}{det(A)} & \dots & 0 \\ \hspace{0.1em}\vdots & \hspace{0.1em} \vdots & \hspace{0.1em} \vdots & \textcolor{#6F79AB}{\ddots} & \hspace{0.1em} \vdots \\ 0 & 0 & 0 & \dots & \textcolor{#6F79AB}{det(A)} \end{pmatrix} $$

    And as a result:

    $$ com(A)^T \times A = det(A) \times I_n $$

We have finally proved that:

$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K}), $$
$$ A \times com(A)^T = com(A)^T \times A = det(A) \times I_n \qquad (5) $$

Matrix transposition

Linearity of transposition

Let \((A,B) \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K})^2\) be two matrices of the same size and \((\lambda, \mu) \in \hspace{0.04em} \mathbb{R}^2\) two real numbers.

We saw that multiplying a matrix by a scalar affects all of its elements.

Moreover, the sum of two matrices of the same size adds together the elements of both matrices with the same index \((i,j)\).

Now, with these two properties, we can build a linear combination:

$$ \forall (i, j) \in [\![1, n]\!] \times [\![1, p]\!],$$
$$(\lambda A + \mu B)_{i,j} = \lambda \ a_{i,j} + \mu \ b_{i,j}$$

Taking its transpose reverses the \(i\) and \(j\) indices:

$$(\lambda A + \mu B)^T_{i,j} = \lambda \ a_{j,i} + \mu \ b_{j,i}$$

And as a result,

$$ \forall (\lambda, \mu) \in \hspace{0.04em} \mathbb{R}^2, \ \forall (A,B) \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K})^2, $$
$$ (\lambda A + \mu B)^T = \lambda A^T + \mu B^T $$

Transpose of a product

Let \(A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K})\) and \(B \in \hspace{0.03em} \mathcal{M}_{p,q} (\mathbb{K})\) be two matrices.

By the definition of the matrix product, we have:

$$ \forall (i, j) \in [\![1, n]\!] \times [\![1, q]\!],$$
$$(A \times B)_{i,j} = \sum_{k = 1}^p a_{i,k} \times b_{k,j} $$

Taking its transpose reverses the \(i\) and \(j\) indices:

$$ \forall (i, j) \in [\![1, q]\!] \times [\![1, n]\!],$$
$$(A \times B)^T_{i,j} = \sum_{k = 1}^p a_{j,k} \times b_{k,i} $$

But the product of the transposes has the same value:

$$(B^T \times A^T)_{i,j} = \sum_{k = 1}^p b_{k,i} \times a_{j,k} $$

So,

$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K}) , \ \forall B \in \hspace{0.03em} \mathcal{M}_{p,q} (\mathbb{K}), $$
$$ (A \times B)^T = B^T \times A^T \qquad (6) $$

Inversion of a matrix

Computation of the inverse

Starting from the expression \((5)\):

$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K}), \ det(A) \neq 0, $$
$$ A \times com(A)^T = com(A)^T \times A = det(A) \times I_n \qquad (5) $$

Keeping only the right-hand part of this expression, we have:

$$ com(A)^T \times A = det(A) \times I_n $$

Then, multiplying each side on the right by \(A^{-1}\) (which exists as soon as \(det(A) \neq 0\)):

$$ com(A)^T \times A \textcolor{#8A5757}{\times A^{-1}} = det(A) \times I_n \textcolor{#8A5757}{\times A^{-1}} $$
$$ com(A)^T = det(A) \times I_n \times A^{-1} $$

Finally, by dividing each member by \(det(A)\), we obtain:

$$ \textcolor{#8A5757}{\frac{1}{det(A)}} \times com(A)^T = \frac{det(A)}{\textcolor{#8A5757}{det(A)}} \times I_n \times A^{-1} $$

And finally,

$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K}), \ det(A) \neq 0, $$
$$ A^{-1} = \frac{1}{det(A)} \times com(A)^T $$
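For a \(2 \times 2\) matrix this formula is easy to apply by hand, since the transposed comatrix of \(\begin{pmatrix} a & b \\ c & d \end{pmatrix}\) is \(\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}\). The sketch below (exact arithmetic via Python's `fractions`, with an arbitrary example matrix of our choosing) implements the formula and checks that the result is indeed an inverse:

```python
from fractions import Fraction

def inverse_2x2(A):
    """A^{-1} = com(A)^T / det(A) for a 2x2 matrix, in exact arithmetic."""
    (a, b), (c, d) = A
    det = a * d - b * c
    assert det != 0, "matrix is not invertible"
    # Transposed comatrix (adjugate) of [[a, b], [c, d]] is [[d, -b], [-c, a]]
    return [[Fraction(d, det), Fraction(-b, det)],
            [Fraction(-c, det), Fraction(a, det)]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[4, 7], [2, 6]]             # det(A) = 4*6 - 7*2 = 10
Ainv = inverse_2x2(A)
assert matmul(A, Ainv) == [[1, 0], [0, 1]]   # A * A^{-1} = I_2
```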

Inverse of the inverse

Let \(A \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})\) be a square matrix of size \(n\).

The relationship between an invertible matrix and its determinant is:

$$ A \text{ is invertible } \Longleftrightarrow det(A) \neq 0 $$

But we also have this property:

$$ det(A^{-1}) = det(A)^{-1}$$

So, if \(A\) is invertible, then \(A^{-1}\) is too. We now have the following relation:

$$ A A^{-1} = I_n$$

But also:

$$ A^{-1} (A^{-1})^{-1} = I_n$$

By multiplying each member of this expression by \(A\) from the left, we obtain:

$$ \underbrace {A A^{-1}} _\text{ \(= \ I_n\)} \ (A^{-1})^{-1} = \ \underbrace {A I_n} _\text{ \(= \ A\)}$$

And finally,

$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K}),$$
$$ A \text{ is invertible } \Longrightarrow A^{-1} \text{ is invertible } \Longrightarrow (A^{-1})^{-1} = A $$
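A small numerical check of this property (a sketch on a \(2 \times 2\) example; the `inverse_2x2` helper applies the adjugate formula \(A^{-1} = \frac{1}{det(A)} com(A)^T\) and the example matrix is our own choice):

```python
from fractions import Fraction

def inverse_2x2(A):
    """Inverse of a 2x2 matrix via the adjugate formula, in exact arithmetic."""
    (a, b), (c, d) = A
    det = a * d - b * c
    assert det != 0, "matrix is not invertible"
    return [[Fraction(d, det), Fraction(-b, det)],
            [Fraction(-c, det), Fraction(a, det)]]

A = [[2, 1], [5, 3]]             # det(A) = 1, so A is invertible
# Inverting twice gives back the original matrix: (A^{-1})^{-1} = A
assert inverse_2x2(inverse_2x2(A)) == A
```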

Inverse of a transposed matrix

Let \(A \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})\) be a square matrix of size \(n\).

The matrix \(A\) is invertible if and only if \(det(A) \neq 0\). But the matrices \(A\) and \(A^T\) have the same determinant:

$$ det(A) = det(A^T)$$

So, if \(A\) is invertible, then \(A^T\) is too.

Furthermore, we saw that:

$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n,p} (\mathbb{K}) , \ \forall B \in \hspace{0.03em} \mathcal{M}_{p,q} (\mathbb{K}), $$
$$ (A \times B)^T = B^T \times A^T $$

So, applied to our case:

$$ \left(A \times A^{-1}\right)^T = \ \left(A^{-1} \right)^T \times \ A^T $$
$$ I_n^T = \ \left(A^{-1} \right)^T \times \ A^T $$

Now, the identity matrix is invariant under transposition. Therefore:

$$ I_n = \ \left(A^{-1} \right)^T \times \ A^T $$

By multiplying each member of this expression by \(\left(A^T \right)^{-1}\) from the right, we obtain:

$$ I_n \times \left(A^T \right)^{-1} = \ \left(A^{-1} \right)^T \times \ \underbrace { A^T \left(A^T \right)^{-1}} _\text{ \(= \ I_n\)} \ $$

And as a result,

$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K}),$$
$$ A \text{ is invertible } \Longrightarrow A^{T} \text{ is invertible } \Longrightarrow \ \left(A^T \right)^{-1} = (A^{-1})^T$$
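Here too, a quick numerical sketch can illustrate the result (a \(2 \times 2\) example; helpers and matrix values are illustrative assumptions):

```python
from fractions import Fraction

def transpose(M):
    """Swap row and column indices."""
    return [[M[i][j] for i in range(len(M))] for j in range(len(M[0]))]

def inverse_2x2(A):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = A
    det = a * d - b * c
    assert det != 0, "matrix is not invertible"
    return [[Fraction(d, det), Fraction(-b, det)],
            [Fraction(-c, det), Fraction(a, det)]]

A = [[4, 7], [2, 6]]             # det(A) = 10 != 0
# (A^T)^{-1} == (A^{-1})^T
assert inverse_2x2(transpose(A)) == transpose(inverse_2x2(A))
```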

Inverse of a product

Let \((A,B) \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})^2\) be two square matrices of the same size \(n\).

Both matrices \(A\) and \(B\) are invertible if and only if their determinants are non-zero:

$$ A \text{ and } B \text{ are invertible } \Longleftrightarrow \Biggl \{ \begin{gather*} det(A) \neq 0 \\ det(B) \neq 0 \end{gather*} \qquad(7) $$

But, by the properties of the determinant, we know that:

$$ \forall (A,B) \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})^2, $$
$$ det(A \times B) = det(A) \times det(B) \qquad(8) $$

Combining both expressions \((7)\) and \((8)\), we now have:

$$ A \text{ and } B \text{ are invertible } \Longleftrightarrow det(A \times B) \neq 0$$

Therefore, the product \((A \times B)\) is also invertible, and:

$$ (AB) \times \ \left(AB\right)^{-1} \hspace{0.01em} = I_n$$

By multiplying each member of this expression by \((B^{-1} A^{-1})\) from the left, we obtain:

$$ (B^{-1} A^{-1}) \times (AB) \times \hspace{0.01em} \left(AB\right)^{-1} \hspace{0.01em} = (B^{-1} A^{-1}) \times I_n $$
$$ B^{-1} \times (A^{-1} A) \times B \times \hspace{0.01em} \left(AB\right)^{-1} \hspace{0.01em} = B^{-1} A^{-1}$$

Moreover, we saw above that:

$$ \forall A \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K}),$$
$$ A \text{ is invertible } \Longrightarrow A^{-1} \text{ is invertible } \Longrightarrow (A^{-1})^{-1} = A $$
$$ B^{-1} \times \ \underbrace{ \left(A^{-1} (A^{-1})^{-1}\right)} _\text{\(= \ I_n\)} \hspace{0.03em} \times B \times \hspace{0.03em} \left(AB\right)^{-1} = B^{-1} A^{-1}$$
$$ \underbrace{ \left(B^{-1} (B^{-1})^{-1}\right)} _\text{\(= \ I_n\)} \ \times \ \left(AB\right)^{-1} = B^{-1} A^{-1}$$

So finally,

$$ \forall (A ,B) \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})^2,$$
$$ A \text{ and } B \text{ are invertible } \Longrightarrow (A \times B) \text{ is invertible } \Longrightarrow \ \left(A \times B\right)^{-1} = B^{-1} \times A^{-1} $$
$$(9)$$
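Identity \((9)\) can also be checked numerically, reversed order included (a sketch on \(2 \times 2\) examples of our choosing, with the adjugate formula as inverse helper):

```python
from fractions import Fraction

def matmul(X, Y):
    """Matrix product: (X*Y)[i][j] = sum over k of X[i][k] * Y[k][j]."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def inverse_2x2(A):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = A
    det = a * d - b * c
    assert det != 0, "matrix is not invertible"
    return [[Fraction(d, det), Fraction(-b, det)],
            [Fraction(-c, det), Fraction(a, det)]]

A = [[1, 2], [3, 5]]             # det(A) = -1
B = [[2, 1], [1, 1]]             # det(B) = 1
# (A*B)^{-1} == B^{-1} * A^{-1}  (note the reversed order)
assert inverse_2x2(matmul(A, B)) == matmul(inverse_2x2(B), inverse_2x2(A))
```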

Both expressions \((6)\) and \((9)\) have the same behaviour:

$$ \forall (A ,B) \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})^2, \enspace \Biggl \{ \begin{gather*} (A \times B)^T = B^T \times A^T \hspace{1em}\qquad (6) \\ \left(A \times B\right)^{-1} = B^{-1} \times A^{-1} \qquad (9) \end{gather*} $$

So, the order in which transposition and inversion are applied does not matter.

We deduce from this that:

$$ \forall (A ,B) \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})^2,$$
$$ \left((A \times B)^T \right)^{-1} = \hspace{0.03em} \left((A \times B)^{-1} \right)^T = \hspace{0.03em} \left(A^T\right)^{-1} \times \hspace{0.04em} \left(B^T\right)^{-1} = \hspace{0.03em} \left(A^{-1}\right)^T \times \hspace{0.04em} \left(B^{-1}\right)^T $$

Trace of a matrix

Linearity of the trace

Let \((A,B) \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})^2\) be two square matrices of the same size \(n\) and \((\lambda, \mu) \in \hspace{0.04em} \mathbb{R}^2\) two real numbers.

By the definition of the trace and of a linear combination of matrices, we have:

$$Tr(\lambda A + \mu B) = \sum_{k = 1}^n (\lambda A + \mu B)_{k,k} = \sum_{k = 1}^n \bigl( \lambda \ a_{k,k} + \mu \ b_{k,k} \bigr)$$

So, we directly obtain two sums:

$$Tr(\lambda A + \mu B) = \sum_{k = 1}^n \lambda \ a_{k,k} + \sum_{k = 1}^n \mu \ b_{k,k}$$

And finally,

$$ \forall (\lambda, \mu) \in \hspace{0.04em} \mathbb{R}^2, \ \forall (A,B) \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})^2,$$
$$ Tr(\lambda A + \mu B) = \lambda \ Tr(A) + \mu \ Tr(B) $$
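The linearity of the trace can be illustrated with a small numerical sketch (plain Python lists; helpers and example values are illustrative, not part of the proof):

```python
def trace(M):
    """Sum of the diagonal elements of a square matrix."""
    return sum(M[k][k] for k in range(len(M)))

def lin_comb(lam, A, mu, B):
    """Entry-wise linear combination lam*A + mu*B."""
    return [[lam * a + mu * b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
lam, mu = 3, -2

# Tr(lam*A + mu*B) == lam*Tr(A) + mu*Tr(B)
assert trace(lin_comb(lam, A, mu, B)) == lam * trace(A) + mu * trace(B)
```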

Trace of a product

Let \((A,B) \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})^2\) be two square matrices of the same size \(n\).

By the definition of the trace, we have:

$$Tr( A \times B) = \sum_{k = 1}^n (A \times B)_{k,k}$$

Now, by the definition of the matrix product, we have:

$$ \forall k \in [\![1, n]\!],$$
$$(A \times B)_{k,k} = \sum_{l = 1}^n a_{k,l} \times b_{l,k} $$

So, substituting this value into the previous expression:

$$Tr( A \times B) = \sum_{k = 1}^n \sum_{l = 1}^n \Bigl[ a_{k,l} \times b_{l,k} \Bigr] \qquad (10)$$

The variables \(k\) and \(l\) are dummy summation indices, so they can be swapped:

$$Tr( A \times B) = \sum_{l = 1}^n \sum_{k = 1}^n \Bigl[ a_{l,k} \times b_{k,l} \Bigr] $$

The product of scalars in the field \(\mathbb{K}\) being commutative, we can rewrite this as:

$$Tr( A \times B) = \sum_{l = 1}^n \sum_{k = 1}^n \Bigl[ b_{k,l} \times a_{l,k} \Bigr] $$

But this is exactly \(Tr(B \times A)\), as we can see by taking expression \((10)\) and swapping the roles of \(A\) and \(B\).

As a result, we obtain:

$$ \forall (A,B) \in \hspace{0.03em} \mathcal{M}_{n}(\mathbb{K})^2,$$
$$ Tr(A \times B) = Tr(B \times A)$$
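This property is worth illustrating numerically, because it holds even though \(A \times B\) and \(B \times A\) generally differ as matrices (a sketch with example matrices of our choosing):

```python
def matmul(X, Y):
    """Matrix product: (X*Y)[i][j] = sum over k of X[i][k] * Y[k][j]."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def trace(M):
    """Sum of the diagonal elements of a square matrix."""
    return sum(M[k][k] for k in range(len(M)))

A = [[1, 2], [3, 4]]
B = [[0, 1], [5, 2]]

# A*B and B*A differ as matrices, but their traces agree
assert matmul(A, B) != matmul(B, A)
assert trace(matmul(A, B)) == trace(matmul(B, A))
```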

Recap table of the properties of matrices
