MIT 18.06 by Gilbert Strang

### §1 The Geometry of Linear Equations

$\left\{\begin{aligned} 2x - y &= 0 \\ -x + 2y &= 3 \end{aligned}\right.\tag{1.1}$

The matrix form is

$\left[\begin{matrix}2 & -1\\-1 & 2\end{matrix}\right]\left[\begin{matrix}x\\y\end{matrix}\right]=\left[\begin{matrix}0\\3\end{matrix}\right]\tag{1.2}$

Equation (1.3) rewrites the system as a linear combination of the columns:

$x\left[\begin{matrix}2\\-1\end{matrix}\right] + y\left[\begin{matrix}-1\\2\end{matrix}\right] =\left[\begin{matrix}0\\3\end{matrix}\right]\tag{1.3}$

Equation (1.3) can be written in a general form, where $\textbf{x}$ is a vector:

$A\textbf{x} = b\tag{1.4}$
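As a quick numerical check (not part of the lecture), the system (1.2) can be solved with NumPy:

```python
import numpy as np

# Coefficient matrix and right-hand side from (1.2)
A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])
b = np.array([0.0, 3.0])

x = np.linalg.solve(A, b)  # solve A x = b
print(x)  # [1. 2.]  i.e. x = 1, y = 2
```

Substituting back: $2(1) - 2 = 0$ and $-1 + 2(2) = 3$, so the solution checks out.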

We want to solve for $\textbf{x}$, but a solution does not always exist. Here's an example:

$\begin{aligned} &x \left[\begin{matrix}2\\-1\\0\end{matrix}\right] + y\left[\begin{matrix}-1\\2\\0 \end{matrix}\right] + z\left[\begin{matrix}0\\-1\\0 \end{matrix}\right]\\ =& \left[\begin{matrix}0\\-1\\-4 \end{matrix}\right] \end{aligned}\tag{1.5}$

In this case, columns 1, 2, and 3 all lie in the same plane (each has a zero third component), so every combination of them lies in that plane too. This is a singular case: the matrix is not invertible, and there is no solution for most $b$. A solution exists only when $b$ itself lies in that plane, which $b = (0, -1, -4)$ does not.
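NumPy can confirm the singularity of the matrix formed by the three columns in (1.5); this check is an addition to the notes, not part of the lecture:

```python
import numpy as np

# Columns of (1.5) as a matrix; every column has a zero third component
A = np.array([[2.0, -1.0,  0.0],
              [-1.0, 2.0, -1.0],
              [0.0,  0.0,  0.0]])
b = np.array([0.0, -1.0, -4.0])

print(np.linalg.det(A))          # 0 (up to rounding): singular
print(np.linalg.matrix_rank(A))  # 2: the columns only span a plane

try:
    np.linalg.solve(A, b)
except np.linalg.LinAlgError:
    print("no solution: singular matrix")
```

The rank is 2 rather than 3, so the columns span only a plane, and `solve` raises `LinAlgError` instead of returning an answer.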

From this we reach a conclusion about $A\textbf{x} = b$:
$A\textbf{x}$ is a combination of the columns of $A$.

$\begin{aligned} \left[\begin{matrix}2 & 5 \\ 1 & 3 \end{matrix}\right]\left[\begin{matrix}1 \\ 2 \end{matrix}\right] &= 1 \times \left[\begin{matrix}2 \\ 1 \end{matrix}\right] + 2 \times \left[\begin{matrix}5 \\ 3 \end{matrix}\right]\\ &= \left[\begin{matrix}2 \\ 1 \end{matrix}\right] + \left[\begin{matrix}10 \\ 6 \end{matrix}\right]\\ &= \left[\begin{matrix}12 \\ 7 \end{matrix}\right] \end{aligned}\tag{1.6}$
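The computation in (1.6) can be verified in NumPy, comparing the matrix-vector product against the explicit column combination (a check added to these notes):

```python
import numpy as np

A = np.array([[2, 5],
              [1, 3]])
x = np.array([1, 2])

# A @ x equals the combination x1 * column1 + x2 * column2
combo = 1 * A[:, 0] + 2 * A[:, 1]
print(A @ x)   # [12  7]
print(combo)   # [12  7]
```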

### §2 Elimination with Matrices

The key idea of elimination is to express row operations as matrix operations.

$\left\{\begin{aligned} x + 2y + z &= 2\\ 3x + 8y + z &= 12\\ 4y + z &= 2 \end{aligned}\right.\tag{2.1}$

Write down the coefficient matrix of the system:

$\left[\begin{matrix}{\color{red}1} & 2 & 1 \\ 3 & 8 & 1 \\ 0 & 4 & 1 \end{matrix}\right] \tag{2.2}$

The red 1 is called the first pivot. Subtract 3 $\times$ row 1 from row 2 to get the following matrix:

$\left[\begin{matrix}1 & 2 & 1 \\ 0 & {\color{red}2} & -2 \\ 0 & 4 & 1 \end{matrix}\right] \tag{2.3}$

Then subtract 2 $\times$ row 2 from row 3:

$\left[\begin{matrix}1 & 2 & 1 \\ 0 & 2 & -2 \\ 0 & 0 & {\color{red}5} \end{matrix}\right] \tag{2.4}$

Appending the right-hand side of equation (2.1) to matrix (2.2) gives the augmented matrix:

$\left[\begin{matrix}1 & 2 & 1 & 2\\ 3 & 8 & 1 & 12\\ 0 & 4 & 1 & 2\end{matrix}\right] \tag{2.5}$

Doing the same elimination on matrix (2.5) gives

$\left[\begin{matrix}1 & 2 & 1 & 2\\ 0 & 2 & -2 & 6\\ 0 & 0 & 5 & -10\end{matrix}\right] \tag{2.6}$
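The two row operations above, followed by back substitution on the upper-triangular system, can be sketched in NumPy (this code and the variable names are additions to the notes, not from the lecture):

```python
import numpy as np

# Augmented matrix (2.5)
M = np.array([[1.0, 2.0, 1.0,  2.0],
              [3.0, 8.0, 1.0, 12.0],
              [0.0, 4.0, 1.0,  2.0]])

M[1] -= 3 * M[0]   # row 2 minus 3 * row 1
M[2] -= 2 * M[1]   # row 3 minus 2 * row 2
print(M)           # matches (2.6)

# Back substitution, from the last row upward
z = M[2, 3] / M[2, 2]                    # 5z = -10  ->  z = -2
y = (M[1, 3] - M[1, 2] * z) / M[1, 1]    # 2y - 2z = 6  ->  y = 1
x = M[0, 3] - M[0, 1] * y - M[0, 2] * z  # x + 2y + z = 2  ->  x = 2
print(x, y, z)     # 2.0 1.0 -2.0
```

Plugging $(x, y, z) = (2, 1, -2)$ back into (2.1) confirms all three equations.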

#### The Rule of Matrix Multiplication

$\begin{aligned}\text{matrix} \times \text{column} &= \text{column}\\ \text{row} \times \text{matrix} &= \text{row}\end{aligned}$

The result of multiplying a matrix by a vector is a linear combination of the columns of the matrix:

$\begin{aligned} & \left[\begin{matrix}{\color{red}A} & B & {\color{blue}C} \\ {\color{red}D} & E & {\color{blue}F} \\ {\color{red}G} & H & {\color{blue}I} \end{matrix}\right] \left[\begin{matrix}{\color{red}x} \\ y \\ {\color{blue}z}\end{matrix}\right]\\ =& {\color{red}x}\left[\begin{matrix}{\color{red}A} \\ {\color{red}D} \\ {\color{red}G}\end{matrix}\right] + {y}\left[\begin{matrix}{B} \\ {E} \\ {H}\end{matrix}\right] +{\color{blue}z} \left[\begin{matrix}{\color{blue}C} \\ {\color{blue}F} \\ {\color{blue}I}\end{matrix}\right] \end{aligned} \tag{2.7}$
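Both multiplication rules can be illustrated with a small numeric example in NumPy (the particular matrix here is an arbitrary choice, not from the lecture):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])

# matrix x column = column: a combination of A's columns
col = A @ np.array([1, 1])
# row x matrix = row: a combination of A's rows
row = np.array([1, 0, 0]) @ A    # picks out the first row of A

print(col)  # [ 3  7 11]
print(row)  # [1 2]
```

Multiplying on the right by a column vector combines columns; multiplying on the left by a row vector combines rows.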