Hey everyone! This is part 7, and the final part of my linear algebra series! To catch you up, here are the last 6 posts:
The Invertible Matrix Theorem ties together all the ideas we’ve seen so far in this series: systems of equations, bases, linear independence, determinants, kernels, and eigenvectors. By the end of this post, you should have a pretty good understanding of the main ideas and goals of linear algebra.
The Invertible Matrix Theorem
Let A be an n×n matrix. The following are equivalent:
A is invertible
The system of equations Ax=b has a unique solution x for every vector b
The columns of A are linearly independent
The rows of A are linearly independent
The columns of A form a basis for Rn
The rows of A form a basis for Rn
The determinant of A is non-zero
The image of A is Rn
The kernel of A is 0
All eigenvalues of A are non-zero
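Since the theorem says all of these conditions hold together, we can spot-check several of them numerically on the same matrix. Here's a small NumPy sketch (the specific matrix is just an illustrative example, not from the original post):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])  # an invertible 2x2 matrix

# The determinant is non-zero
assert abs(np.linalg.det(A)) > 1e-12

# The columns (and rows) are linearly independent: full rank
assert np.linalg.matrix_rank(A) == A.shape[0]

# All eigenvalues are non-zero
assert all(abs(lam) > 1e-12 for lam in np.linalg.eigvals(A))

# Ax = b has a unique solution x for this (and any) b
b = np.array([3.0, 2.0])
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)
```

If you swap in a non-invertible matrix (say, one with a repeated row), every one of these checks fails at once, which is exactly what the theorem predicts.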
The Invertible Matrix Theorem specifically refers to square matrices (remember, matrices can be rectangular too, with any number of rows and columns).
There are two types of square matrices: invertible and non-invertible. For an invertible matrix, all of the 10 facts above will be true. If a matrix is non-invertible, all 10 will be false.
I won’t go into a detailed proof of the theorem, but I’ll cover the basic intuition behind how each piece fits into the puzzle!
1-2 Invertible Matrices and Systems of Equations
First we have to define what it means for a matrix to be invertible. A matrix is invertible if it has an inverse matrix. For example, consider a matrix A which represents a horizontal stretch by 2, and a matrix B which represents a horizontal stretch by 0.5, which is like squishing the x-axis. A and B are inverses of each other: they undo each other!
More precisely, they’re inverses of each other because multiplying them together yields the identity matrix.
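In matrix form, a horizontal stretch by 2 scales the first coordinate by 2 and leaves the second alone, and likewise for the stretch by 0.5. A quick NumPy check (these explicit matrices are my reconstruction of the stretches described above):

```python
import numpy as np

# Horizontal stretch by 2 (scales the x-axis by 2)
A = np.array([[2.0, 0.0],
              [0.0, 1.0]])

# Horizontal stretch by 0.5 (squishes the x-axis)
B = np.array([[0.5, 0.0],
              [0.0, 1.0]])

# Multiplying them in either order yields the identity matrix,
# which is what it means for A and B to be inverses.
I = np.eye(2)
print(np.allclose(A @ B, I) and np.allclose(B @ A, I))  # True
```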
Recall that matrices represent systems of linear equations (we covered this more in part 4: What Is A Matrix?), which can be expressed compactly as Ax = b, where A is the coefficient matrix, x is the vector of unknowns, and b is the vector of constants.
A huge part of linear algebra is concerned with finding out whether the system Ax=b has a unique solution. If A is invertible, then that means it’s possible to “undo” A and solve for x.
And vice versa, if it’s possible to solve Ax=b with a unique solution for x given every possible vector b, then A must be invertible!
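Here's what "undoing A" looks like computationally, with an invertible matrix I've chosen for illustration. Multiplying b by the inverse of A recovers the unique solution x:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

# Since A is invertible, x = A^{-1} b is the unique solution to Ax = b
x = np.linalg.inv(A) @ b
assert np.allclose(A @ x, b)

# In practice, np.linalg.solve is preferred over explicitly
# forming the inverse (it's faster and more numerically stable)
x2 = np.linalg.solve(A, b)
assert np.allclose(x, x2)
```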
3-4 Systems of Equations and Linear Independence
We’ve seen that systems of linear equations can be represented geometrically as lines on a plane, or planes in space, or hyperplanes in hyperspace depending on the dimension you’re in! To make it easier to visualize, let’s stay in 2-dimensional space.
When the rows of A are not linearly independent (i.e. they are scalar multiples of each other), this is analogous to lines on a plane which have the same slope. This means there will be no unique intersection point: either the lines are parallel and never intersect, or they’re the same exact line and have infinitely many intersection points.
So the existence of a unique solution to Ax=b is equivalent to the rows and columns of A being linearly independent, since we have to insist on each equation having a different slope!
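We can see this failure numerically. In the example below (my own), the second row is twice the first, so the two equations describe parallel lines, and NumPy refuses to solve the system:

```python
import numpy as np

# The rows are scalar multiples: x + 2y = 1 and 2x + 4y = 5
# are parallel lines with the same slope, so no solution exists.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b = np.array([1.0, 5.0])

try:
    np.linalg.solve(A, b)
except np.linalg.LinAlgError:
    print("singular matrix: no unique solution")
```

Had we picked b = (1, 2) instead, the two lines would coincide and there would be infinitely many solutions; either way, the solution is not unique.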
5-6 Bases and Linear Independence
In my post on basis vectors, we saw that a basis was a linearly independent spanning set for a vector space. If A has n columns which are linearly independent, then they must span Rn, which means they fit the definition of a basis for Rn! And the same goes for the rows.
7 The Determinant
In my post on determinants (Determinants and The Rank-Nullity Theorem), we saw how a matrix has a determinant of 0 when it flattens space down by a dimension (or more!), and this happens when the columns and rows of the matrix are not linearly independent. That means that a square matrix with linearly independent columns and rows must have a determinant which is not zero!
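A quick numerical illustration (example matrices of my own choosing): dependent columns give determinant 0, independent columns don't.

```python
import numpy as np

# Columns are linearly dependent (second column = 2 * the first),
# so this matrix flattens the plane onto a line: determinant 0.
singular = np.array([[1.0, 2.0],
                     [3.0, 6.0]])
print(np.linalg.det(singular))    # 0.0 (up to floating point)

# Linearly independent columns give a non-zero determinant
invertible = np.array([[1.0, 2.0],
                       [3.0, 5.0]])
print(np.linalg.det(invertible))  # approximately -1.0
```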
8-9 The Image and Kernel
If Ax=b has a solution for every vector b, then matrix multiplication by A is capable of spanning the entire vector space, Rn. This is equivalent to the columns of A forming a basis for Rn. All of this is the same as saying the image of A is Rn. The dimension of the image is thus the same as the dimension of Rn, which is n. Another word for the dimension of the image of A is the rank!
By the rank-nullity theorem, it follows that the kernel of A must have dimension 0. In particular, the kernel of A is a single point, which is the zero vector.
Remember, the kernel is the set of vectors which A maps to 0. If A is invertible, then it must only map 0 to 0.
(I’m using 0 here interchangeably with the zero-vector, which is an n-dimensional vector where all the entries are 0)
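One standard way to compute a kernel numerically is via the singular value decomposition: the directions with (numerically) zero singular values span the kernel. A sketch, using example matrices of my own:

```python
import numpy as np

# A singular matrix has a non-trivial kernel: find it with the SVD.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

U, s, Vh = np.linalg.svd(A)
# Rows of Vh whose singular value is (numerically) zero span the kernel
kernel = Vh[s < 1e-10]
v = kernel[0]
assert np.allclose(A @ v, 0)  # A maps this non-zero vector to 0

# An invertible matrix maps only 0 to 0: all singular values are non-zero
B = np.array([[2.0, 1.0],
              [1.0, 1.0]])
assert np.all(np.linalg.svd(B, compute_uv=False) > 1e-10)
```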
10 The Eigenvalues
Our last post was a brief introduction to eigenvectors and eigenvalues. We saw that an eigenvector is, by definition, a non-zero vector v which obeys Av = λv, meaning that A maps v to a scalar multiple of itself. That scalar multiple, λ (lambda), is called an eigenvalue of A.
The invertible matrix theorem says that an invertible matrix will have non-zero eigenvalues. Why?
Remember that the kernel of A must be 0, so A only maps the zero vector to the zero vector, and nothing else. If A had an eigenvalue of 0, that means there’s some corresponding eigenvector (which by definition is non-zero) which A also maps to 0, which contradicts our assumption that the kernel is trivial! Thus the eigenvalues of A must all be non-zero.
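We can confirm this numerically on a singular example matrix (my own): a non-trivial kernel shows up as a zero eigenvalue, and since the determinant equals the product of the eigenvalues, a zero eigenvalue forces the determinant to zero too.

```python
import numpy as np

# Singular matrix: any kernel vector is an eigenvector with eigenvalue 0
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
eigenvalues = np.linalg.eigvals(A)
assert np.any(np.isclose(eigenvalues, 0))

# The determinant is the product of the eigenvalues, so a zero
# eigenvalue means det(A) = 0, i.e. A is not invertible.
assert np.isclose(np.prod(eigenvalues), np.linalg.det(A))
```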
Closing Thoughts
Thanks for following along on this little series of posts about linear algebra! I know there are LOTS more linear algebra topics I didn’t cover here, and by no means is this meant to be a comprehensive or rigorous guide to the subject.
To be honest, I started this series as a way to streamline my own studying process since I’ve been reviewing my past university math classes, and I thought that converting my scribbled notes into a blog/video series would help me summarize my own thoughts (and as an added bonus, be helpful to others).
Let me know if you have any questions or comments, or ideas for future series! I plan to also write about calculus, real & complex analysis, differential geometry, number theory, set theory, and more, as I continue to self-study math!