MIT A 2020 Vision of Linear Algebra (1/2)

Published under a Creative Commons Licence

\[\begin{matrix} A=CR=\begin{bmatrix} ~ \\ ~ \end{bmatrix}\begin{bmatrix} ~ & ~ \end{bmatrix} & A=LU\\ A=QR=\begin{bmatrix} q_1 & \cdots & q_n \end{bmatrix}\begin{bmatrix} ~&~\\0&~ \end{bmatrix} & A=Q\Lambda Q^T~(Q^T=Q^{-1})\\ A=X\Lambda X^{-1} & A=U\Sigma V^T \end{matrix}\]

Part 1: The Column Space of a Matrix

Rotation matrix

\[\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}\]

is an orthogonal matrix that rotates the plane.
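
A quick numerical check (a numpy sketch of my own, not from the lecture) that $Q^TQ=I$, which is exactly what makes the matrix orthogonal:

```python
import numpy as np

# The rotation matrix for a sample angle (60 degrees here).
theta = np.pi / 3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(Q.T @ Q, np.eye(2)))  # True: the columns are orthonormal
print(Q @ np.array([1.0, 0.0]))         # [0.5, 0.866...]: (1,0) rotated by 60 degrees
```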

\[\begin{aligned} Ax&=\begin{bmatrix} 1&4&5\\3&2&5\\2&1&3 \end{bmatrix}\begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix}=\begin{bmatrix} 1\\3\\2 \end{bmatrix}x_1+\begin{bmatrix} 4\\2\\1 \end{bmatrix}x_2+\begin{bmatrix} 5\\5\\3 \end{bmatrix}x_3\\ &=\textit{linear combination of columns of }A \end{aligned}\] \[\begin{aligned} \textbf{Column space of A}&= \textbf{C}(A) = \textit{all vectors } Ax\\&=\textit{all linear combinations of the columns}\\&=\textit{a plane} \end{aligned}\]

We keep the first two columns, but we DO NOT KEEP COLUMN 3 because it is just the sum of the first two: it lies on the same plane and adds nothing new. So the real meat of the matrix $A$ is in the column matrix $C$, which has just two columns.

The vector $(5,5,3)^T$ is called dependent because it depends on the first two columns; those two were independent.

\[\textit{COLUMN 3}=\textit{COLUMN 1}+\textit{COLUMN 2}\\ \textit{hence } A=CR=\begin{bmatrix} 1&4\\3&2\\2&1 \end{bmatrix}\begin{bmatrix} 1&0&1\\0&1&1 \end{bmatrix}\\ \textbf{Row rank}=\textbf{column rank}=r=2\]
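
A minimal numpy sketch (my addition) that verifies the factorization above:

```python
import numpy as np

# A = CR for the 3x3 example: C keeps the two independent columns,
# R records how every column of A combines them.
A = np.array([[1, 4, 5],
              [3, 2, 5],
              [2, 1, 3]])
C = A[:, :2]                       # columns 1 and 2 of A
R = np.array([[1, 0, 1],
              [0, 1, 1]])          # column 3 = column 1 + column 2

print(np.array_equal(C @ R, A))    # True
print(np.linalg.matrix_rank(A))    # 2: row rank = column rank = r
```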

The rows of $R$ are a basis for the row space.

These are the key ideas!!!

$A=CR$ shows that the column rank of $A$ equals the row rank of $A$.

  1. The $r$ columns of $C$ are a basis for the column space of $A$: dimension $r$;
  2. The $r$ rows of $R$ are a basis for the row space of $A$: dimension $r$.

Counting Theorem. If $A$ has rank $r$, there are $n-r$ independent solutions to $Ax=0$.
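
A sketch of the Counting Theorem on the same example ($n=3$, $r=2$, so one independent solution), using scipy's `null_space` (my choice of tool, not the lecture's):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1, 4, 5],
              [3, 2, 5],
              [2, 1, 3]])           # rank r = 2, n = 3 columns

N = null_space(A)                   # orthonormal basis for the nullspace
print(N.shape[1])                   # 1 = n - r independent solutions
print(np.allclose(A @ N, 0))        # True: every basis vector solves Ax = 0
```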

Matrix $A$ with rank 1. If all columns of $A$ are multiples of column 1, show that all rows of $A$ are multiples of one row. (Hint: write $A=CR$ with a single column $C$ and a single row $R$; then every row of $A$ is a multiple of that one row.)

A matrix is called ill-conditioned if it is difficult to deal with numerically: small changes in the data can cause large changes in the solution.
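
A concrete illustration (my example, not from the lecture): the Hilbert matrix is famously ill-conditioned, and `np.linalg.cond` makes the trouble visible.

```python
import numpy as np
from scipy.linalg import hilbert

H = hilbert(8)                      # entries 1/(i+j+1): nearly dependent columns
print(np.linalg.cond(H))            # ~1.5e10: a huge condition number

# Solving Hx = b amplifies tiny errors in b enormously.
b = np.ones(8)
x = np.linalg.solve(H, b)
x_perturbed = np.linalg.solve(H, b + 1e-10)
print(np.max(np.abs(x - x_perturbed)))  # far larger than the 1e-10 change
```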

If $A$ is invertible, then $C=A$ and $R=I$: no progress, $A=AI$.

Part 2: The Big Picture of Linear Algebra

If $Ax=0$, then

\[\begin{bmatrix} \textit{row }1\\ \vdots\\ \textit{row }m \end{bmatrix}\begin{bmatrix} ~\\ x\\~ \end{bmatrix}=\begin{bmatrix} 0\\ \vdots\\0 \end{bmatrix}\]

and we see that $x$ is orthogonal to every row of $A$.

\[\begin{bmatrix} ~\\ x\\~ \end{bmatrix}\]

means that $x$ is a column of numbers.

If the dot product of a row with the column $x$ gives zero, then in $n$-dimensional space that row is perpendicular (at a 90° angle) to the column $x$.

  • Every $x$ in the nullspace of $A$ is orthogonal to the row space of $A$.
  • Every $y$ in the nullspace of $A^T$ is orthogonal to the column space of $A$.

These are the two pairs of orthogonal subspaces. The dimensions add to $n$ and to $m$.

  • $N(A)\perp C(A^T)$: dimensions $n-r$ and $r$
  • $N(A^T)\perp C(A)$: dimensions $m-r$ and $r$
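
A short numpy check of both orthogonality statements (my sketch, reusing the same example matrix):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1, 4, 5],
              [3, 2, 5],
              [2, 1, 3]])

x = null_space(A)        # basis for N(A):   n - r = 1 vector
y = null_space(A.T)      # basis for N(A^T): m - r = 1 vector

print(np.allclose(A @ x, 0))    # every row of A is orthogonal to x
print(np.allclose(A.T @ y, 0))  # every column of A is orthogonal to y
```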

(Figure: the big picture of linear algebra, showing the four fundamental subspaces.)

This is the Big Picture: two subspaces in $R^n$ and two subspaces in $R^m$.

From the row space to the column space, $A$ is invertible.

Multiplying Columns times Rows / Six Factorizations

$A=BC=\text{sum of rank-1 matrices}~(\textbf{column times row: outer product})$

\[BC=\begin{bmatrix} \vert&\vert&&\vert\\ b_1&b_2&\cdots&b_n\\ \vert&\vert&&\vert \end{bmatrix}\begin{bmatrix} -&c_1^*&-\\ -&c_2^*&-\\ &\vdots&\\ -&c_n^*&-\\ \end{bmatrix}=b_1c_1^*+b_2c_2^*+\cdots+b_nc_n^*\]

A new way to multiply matrices! High level! Row-column is low level!
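
A sketch of column-times-row multiplication in numpy (my own check, with random matrices): summing the $n$ outer products $b_kc_k^*$ reproduces the ordinary product $BC$.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 4))    # columns b_1 .. b_4
C = rng.standard_normal((4, 5))    # rows    c_1* .. c_4*

# Sum of rank-1 matrices: each column of B times the matching row of C.
rank1_sum = sum(np.outer(B[:, k], C[k, :]) for k in range(B.shape[1]))
print(np.allclose(rank1_sum, B @ C))  # True
```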


If you want to see the video, click MIT A 2020 Vision of Linear Algebra, Spring 2020.