@gentlemandude1

I wish someone would explain why linear algebra instructors never motivate the math techniques (i.e., the algorithms) that they teach. Linear algebra is always presented as a set of "recipes" to follow, but students never know whether they're baking a pie, a batch of cookies, or a cake. This video has provided me with more insight than the semester-long course on Vectors and Matrices that I took in university. It's a shame that linear algebra is taught so poorly. It's such an important topic.

@TheCosmicafroninja

Adding this visual element is a great idea for helping students grasp the more abstract concepts of mathematics. I felt like I had an okay understanding of linear algebra after taking a class in it, but this really helps to solidify my understanding.

@MinhTran-wn1ri

@0:26 Think of an m×n matrix as a set of m row vectors (each with n elements) and a set of n column vectors (each with m elements).
@0:59 Multiplying a matrix by a vector can be thought of as adding scaled column vectors together. The elements of the input vector tell you how much to scale each column vector -- the first element tells you how much to scale the first column vector, and so on. The result of the multiplication is the vector you get after adding the scaled column vectors together (see the first sketch after this comment).
@1:17-1:33 A system of linear equations (which can be rewritten as a matrix multiplication) can also be thought of as an intersection of planes. The output vector (the result of the matrix multiplication) determines where the planes of the equations lie. The point of intersection of those planes represents the input vector. The intersection need not be a point; it can be a line, a plane, etc.
@2:06 Recap: A system of equations, which can be represented as a matrix multiplication, can be thought of as intersecting planes or as a sum of scaled column vectors. Intersecting planes help you solve for the input vectors (i.e., the set of all input vectors that satisfies the system of equations). The sum of scaled column vectors helps you visualize the image of the linear transformation (i.e., a mapping from the set of input vectors, the domain, to the image in the codomain).
@2:22 The set of all input vectors in the domain that map to the zero vector is called the nullspace (aka the kernel) of the linear transformation. It always includes the origin (0,0,0), but the kernel could also be a line, a plane, etc.
@3:12 The Gaussian elimination algorithm simplifies the system of equations to give you the kernel.
@3:23 In Gaussian elimination, when you multiply an equation by a constant, the plane changes, but the part of the plane that is also part of the kernel of the system does not change.
@4:13 If any two equations can be 'rotated onto' one another (forming a single indistinguishable plane), there is a 'free variable', which means the kernel has moved up a dimension (i.e., a point to a line, a line to a plane, etc.).
@4:55 The dependent variables are called pivots; the number of free variables gives the dimension of the kernel (e.g., 1 free variable means the kernel is 1-dimensional, in other words, a line).
@5:26 A system of equations can also be thought of as taking the inner product (i.e., dot product) of the input vector with each row vector.
@5:53 When looking for the kernel (i.e., where the output vector is the zero vector), each equation in the system is a constraint saying that the kernel vector is perpendicular to that row vector.
@6:10-6:44 The row vectors of the transformation matrix span a subspace (the set of all linear combinations of the row vectors) that is perpendicular to the kernel (see the second sketch after this comment).
@6:45-7:10 Each vector in the kernel contains the scalars for the column vectors (of the transformation matrix) such that the scaled column vectors sum to the zero vector (i.e., the scaled vectors, placed end to end, lead back to the origin).
@7:10-7:25 Linearly dependent vectors. @7:25-7:51 Column space. 
@8:11-8:22 The column space and the row space always have the same dimension.
@8:55-end Applications of matrix multiplication.
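
A quick way to check the column picture from @0:59 numerically: compare NumPy's matrix-vector product against the sum of scaled columns directly. This is a minimal sketch; the matrix and vector below are made-up example values.

```python
import numpy as np

# Made-up example: a 3x2 matrix (two column vectors in R^3) and an input vector.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([10.0, -1.0])   # one scalar per column

# Column picture: scale each column by the matching entry of x, then add.
column_sum = x[0] * A[:, 0] + x[1] * A[:, 1]

# Standard matrix-vector product gives the same thing.
print(column_sum)                        # [ 8. 26. 44.]
print(np.allclose(A @ x, column_sum))    # True
```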
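And a sketch for the kernel and row-space picture from @2:22-@6:44, again with a made-up singular matrix, using the SVD to pull out a kernel vector and then checking both the row picture (perpendicularity) and the column picture (scaled columns summing to zero):

```python
import numpy as np

# Made-up singular matrix: row 2 = row 0 + row 1, so one equation is redundant.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])

# Rows of Vt whose singular values are (near) zero span the kernel.
_, s, Vt = np.linalg.svd(A)
kernel = Vt[s < 1e-10]

print(len(kernel))             # 1 free variable -> the kernel is a line
k = kernel[0]                  # a direction vector along that line

# Row picture: every row vector is perpendicular to the kernel vector.
print(np.allclose(A @ k, 0))   # True: each row . k = 0

# Column picture: the kernel entries scale the columns so they sum to zero.
print(np.allclose(k[0]*A[:, 0] + k[1]*A[:, 1] + k[2]*A[:, 2], 0))  # True
```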

@trumanhw

My god, thank you. It always seemed SO DAMNED ODD when we learned matrices, because without explaining their contextual utility, it's like teaching about nouns without telling someone ... YOU'RE NOT WORKING ON SENTENCES right now -- just nouns. So don't be surprised when nothing sensible comes out of the concept -- because we're not thinking a complete thought ... but rather accepting that things which look like this can be manipulated in basic ways, with the more relevant rules to come LATER.

@andrewharley6791

This just made my entire semester of Linear Algebra make a whole lot more sense.

@i_g6676

There is a problem with the visualizations from 2:45 onward. Linear equations that are homogeneous (i.e., have 0 on the right-hand side) always correspond to planes THAT PASS THROUGH THE ORIGIN (because the all-zeros vector satisfies such equations), and that is not the case in the video.
Similarly, at 6:36 the row space and null space must always contain the origin, since they are LINEAR SPACES, not AFFINE SPACES.

@detonation79

Watching this on the evening before my linear algebra midterm has replenished my motivation!

@soy-dave

Great video! I wish this had been introduced in my linear algebra class. It would have solidified "why" we were even doing Gaussian elimination in the first place, as well as what row reduced echelon form actually tells you. Keep it going!

@spacecase4062

It’s because of you that my interest in math continues to grow daily

@natidadon

Amazing!!
I just took linear algebra at university, and yet I still learned a few things from this video.

@douglasstrother6584

As one gets further into mathematics and its applications, most problems boil down to "find the inverse of the matrix A", "compute the eigenvalues of the matrix A", etc.
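
For anyone who wants to try those two tasks directly, here is a toy NumPy sketch (the matrix is made up; note that in practice np.linalg.solve(A, b) is usually preferred over forming an explicit inverse):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # made-up symmetric example matrix

A_inv = np.linalg.inv(A)            # "find the inverse of the matrix A"
eigenvalues = np.linalg.eigvals(A)  # "compute the eigenvalues of the matrix A"

print(np.allclose(A @ A_inv, np.eye(2)))  # True
print(np.sort(eigenvalues))               # [1. 3.]
```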

@smrtfasizmu6161

Because I watched 3blue1brown, I think I know why, at 2:50, the last plane intersects the other planes along a line and not at a single point. One thing that Zach didn't mention is that the determinant of this matrix is zero, which just means one of the row vectors or column vectors is linearly dependent on the others. The determinant is 0 when you lose a degree of freedom.

If you look at a 3x3 matrix as 3 column vectors, those 3 vectors can usually cover the entire 3D space: you can reach any point by adding scaled copies of them. However, if one of the vectors is a linear combination of the other two, that vector "brings nothing to the table". Every point you can reach with it, you can also reach without it, so 2 of the vectors cover as much space as all 3, and instead of covering 3D space the matrix only covers a 2D space.

Anyway, here is the other angle I wanted to give on why the intersection at 2:50 is a line and not a point. If one of the rows is linearly dependent on the other two, you can eliminate it, which leaves you with 2 equations and 3 unknown variables. The 2 equations let you get rid of one variable, so you are left with 1 equation in 2 unknowns, and that's the graph of a line. You have 1 degree of freedom: you can set one variable to whatever you want, but then the value of the other variable is fixed.
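
A quick numerical check of this comment's chain (zero determinant, dependent row, a whole line of solutions), with a made-up matrix whose third row is the sum of the first two:

```python
import numpy as np

# Made-up matrix with a dependent row: row 2 = row 0 + row 1.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])   # consistent right-hand side (b2 = b0 + b1)

print(np.isclose(np.linalg.det(A), 0.0))  # True: determinant is zero
print(np.linalg.matrix_rank(A))           # 2, not 3: one degree of freedom lost

# One particular solution via least squares (np.linalg.solve would fail here).
x_p, *_ = np.linalg.lstsq(A, b, rcond=None)

# Adding any multiple of a kernel vector still satisfies Ax = b,
# so the solution set is a whole line, not a single point.
k = np.array([1.0, 1.0, -1.0])              # A @ k = 0 by inspection
print(np.allclose(A @ (x_p + 5.0 * k), b))  # True
```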

@matattz

Math can be so simple yet complicated at the same time. Once you visualize it, it all makes perfect sense and you wonder why you didn't grasp it sooner. Looking at your textbooks without these visual insights can be a really terrifying experience!

@kevinbyrne4538

I was aware of the application of graph theory to electrical circuits -- Ernst Guillemin (1953) "Introductory Circuit Theory" and Wikipedia: Topology (electrical circuits) -- but this video clarifies the relations between graphs, loops, and trees in just one minute.
Beautiful job, sir.  Thank you for posting this video.

@maxgibbard8536

I am amazed at how quickly this got complicated, yet it stayed digestible. The visual graphics complement the numbers and the vocals exquisitely. Great video!

@boluwarin

You need to start a school. Everyone would sign up

@mehrosenasir3966

The one thing that amazed me is that when we scale a linear equation in Gauss-Jordan elimination, the point of intersection still remains the same. Just wow!!
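
That invariance is easy to verify numerically: scale a whole equation (the matrix row and the matching right-hand-side entry together) and the solution doesn't move. A toy NumPy sketch with made-up numbers:

```python
import numpy as np

# Made-up 2x2 system: two lines meeting at a single point.
A = np.array([[1.0,  1.0],
              [1.0, -1.0]])
b = np.array([3.0, 1.0])

# Scale the first equation by 7 on both sides, as in Gauss-Jordan elimination.
A2, b2 = A.copy(), b.copy()
A2[0] *= 7.0
b2[0] *= 7.0

print(np.linalg.solve(A, b))                # [2. 1.]
print(np.allclose(np.linalg.solve(A, b),
                  np.linalg.solve(A2, b2))) # True: same intersection point
```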

@ZyTelevan

The geometric interpretation can be useful in the context of 2D and 3D graphics, but I find it even more confusing when talking about higher dimensions. There is another interpretation that is used in signal processing, and that I find much more intuitive for a large number of dimensions. You can find the full explanation in the first few lectures of Stanford's course on Linear Dynamical Systems by Prof. Stephen Boyd. This is particularly useful in the context of neural networks. It goes something like this:

Suppose you have a linear system with an n-dimensional input and an m-dimensional output. Then this system can be fully described by an m×n matrix A. The entry A_i,j (i-th row, j-th column) just tells you how much the i-th output is affected by the j-th input. The corresponding equation is Ax = b, where x is an n-dimensional column vector and b is an m-dimensional column vector. But what if you have an equation AX = B, where X and B are also matrices? Well, X is just a series of n-dimensional inputs (each represented by a column vector) and B is the corresponding series of m-dimensional outputs (also each represented by a column vector). Here it's also immediately obvious that X and B have to have the same number of columns.
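
A minimal NumPy sketch of that column-by-column reading of AX = B (the sizes and values here are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

m, n, k = 2, 3, 4                  # 2 outputs, 3 inputs, 4 input vectors
A = rng.standard_normal((m, n))    # the system: maps an input to an output
X = rng.standard_normal((n, k))    # k inputs, one per column
B = A @ X                          # k outputs, one per column

# Column j of B is just the system applied to column j of X.
print(all(np.allclose(B[:, j], A @ X[:, j]) for j in range(k)))  # True
```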

@somecomposingfudsa

I'm taking Linear Algebra right now, and this has really helped me visualize everything I've learned so far (up to Eigenvalues, Eigenvectors, and Matrix Diagonalization), so thank you so much!

@MrJaksld

Thank you so much. I am literally taking linear algebra right now and was very confused by the null space. This video really helped, especially the visualization.