Identifying Important Subspaces of a Matrix
Imagine a matrix as a machine that takes in vectors and outputs new vectors ⚙️. One of the biggest ideas in linear algebra is that this machine has a few special subspaces attached to it. These subspaces tell us what the matrix can do, what it cannot do, and how information moves through the transformation. In this lesson, you will learn how to identify the most important subspaces connected to a matrix and why they matter in abstract vector spaces and subspaces.
Learning Goals
By the end of this lesson, you should be able to:
- Explain the key ideas and vocabulary for important subspaces of a matrix.
- Identify the four main subspaces associated with a matrix.
- Use matrix reasoning to find or describe these subspaces.
- Connect these subspaces to the bigger picture of vector spaces and subspaces.
- Recognize how examples of these subspaces appear in real linear algebra problems.
Why Subspaces Matter
A matrix does more than store numbers. It represents a linear transformation, which means it preserves addition and scalar multiplication. The most important subspaces describe where the transformation starts, where it ends up, and what information is lost along the way.
The key idea is that a matrix $A$ naturally creates four major subspaces:
- the column space of $A$
- the null space of $A$
- the row space of $A$
- the left null space of $A$
Each one reveals a different part of the matrix’s behavior. Think of a school cafeteria machine that sorts trays 🍽️. One part shows what gets accepted, another shows what gets rejected, and another shows the patterns in the input and output. These subspaces play similar roles for a matrix.
A subspace is a set of vectors that stays closed under vector addition and scalar multiplication, and contains the zero vector. This is why these matrix-related sets are so important: they are not random collections of vectors, but structured spaces.
The Column Space: Where the Matrix Sends Vectors
The column space of a matrix $A$ is the set of all linear combinations of the columns of $A$. In symbols, if the columns of $A$ are $\mathbf{a}_1, \mathbf{a}_2, \dots, \mathbf{a}_n$, then
$$\text{Col}(A)=\text{span}\{\mathbf{a}_1,\mathbf{a}_2,\dots,\mathbf{a}_n\}$$
This subspace tells you all possible outputs of the transformation $\mathbf{x}\mapsto A\mathbf{x}$. If a vector $\mathbf{b}$ is in the column space, then the equation
$$A\mathbf{x}=\mathbf{b}$$
has at least one solution.
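This membership test can be checked computationally. The following is a minimal sketch (not part of the lesson) using the SymPy library: appending $\mathbf{b}$ as an extra column raises the rank exactly when $\mathbf{b}$ lies outside the column space, so comparing ranks decides solvability.

```python
# Check whether b lies in Col(A): A x = b is solvable exactly when
# appending b as a column does not increase the rank.
from sympy import Matrix

A = Matrix([[1, 2], [3, 6]])

def in_column_space(A, b):
    """Return True if b is a linear combination of the columns of A."""
    augmented = A.row_join(b)        # form the augmented matrix [A | b]
    return augmented.rank() == A.rank()

# (1, 3) is the first column of A, so it is reachable; (1, 0) is not.
print(in_column_space(A, Matrix([1, 3])))   # True
print(in_column_space(A, Matrix([1, 0])))   # False
```

The helper name `in_column_space` is an illustrative choice, not a SymPy built-in.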
Example
Suppose
$$A=\begin{bmatrix}1&2\\3&6\end{bmatrix}$$
The second column is $2$ times the first column, so the column space is just the span of the first column:
$$\text{Col}(A)=\text{span}\left\{\begin{bmatrix}1\\3\end{bmatrix}\right\}$$
That means every output lies on the same line through the origin in $\mathbb{R}^2$. The matrix cannot produce every vector in $\mathbb{R}^2$ because its columns do not point in enough independent directions.
How to identify it
To find a basis for the column space, use the pivot columns of the original matrix. After row reducing, the pivot positions show which columns are independent. Important detail: the basis vectors come from the original matrix, not the row-reduced matrix.
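The pivot-column procedure above can be sketched in SymPy (an assumption of this lesson's tooling, not prescribed by it). `rref()` returns both the reduced matrix and the pivot column indices; the basis vectors are then taken from the original matrix, as the lesson stresses.

```python
# Find a basis for Col(A) using the pivot columns of the ORIGINAL matrix.
from sympy import Matrix

A = Matrix([[1, 2], [3, 6]])
rref_form, pivot_cols = A.rref()      # rref() reports pivot column indices

basis = [A.col(j) for j in pivot_cols]  # take columns from A, not rref_form
print(pivot_cols)                        # (0,) -- only column 0 has a pivot
print(basis)
```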
The Null Space: What Gets Sent to Zero
The null space of a matrix $A$ is the set of all vectors $\mathbf{x}$ such that
$$A\mathbf{x}=\mathbf{0}$$
This subspace tells us which inputs are completely lost by the transformation. If two different inputs differ by a vector in the null space, the matrix sends them to the same output.
Why it matters
The null space measures failure of uniqueness. If the null space contains only the zero vector, then $A\mathbf{x}=\mathbf{0}$ has only the trivial solution, and the columns of $A$ are linearly independent.
Example
Let
$$A=\begin{bmatrix}1&2\\3&6\end{bmatrix}$$
Solve
$$\begin{bmatrix}1&2\\3&6\end{bmatrix}\begin{bmatrix}x_1\\x_2\end{bmatrix}=\begin{bmatrix}0\\0\end{bmatrix}$$
This gives $x_1+2x_2=0$, so $x_1=-2x_2$. Let $x_2=t$. Then
$$\mathbf{x}=\begin{bmatrix}-2t\\t\end{bmatrix}=t\begin{bmatrix}-2\\1\end{bmatrix}$$
So
$$\text{Nul}(A)=\text{span}\left\{\begin{bmatrix}-2\\1\end{bmatrix}\right\}$$
This means the matrix crushes every vector on that line down to zero.
How to identify it
To find the null space, solve the homogeneous system $A\mathbf{x}=\mathbf{0}$ using row reduction. The free variables produce a parametric description of the solution space, and the resulting direction vectors form a basis for the null space.
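As a sketch of this procedure, SymPy's `nullspace()` method (assumed tooling, not part of the lesson) performs exactly this row reduction and returns one basis vector per free variable.

```python
# Compute Nul(A) by solving A x = 0; nullspace() returns one basis
# vector for each free variable of the homogeneous system.
from sympy import Matrix

A = Matrix([[1, 2], [3, 6]])
null_basis = A.nullspace()

print(null_basis[0])                          # the direction vector (-2, 1)
assert A * null_basis[0] == Matrix([0, 0])    # it really maps to zero
```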
The Row Space: Information in the Equations
The row space of a matrix is the span of its rows. If the rows of $A$ are $\mathbf{r}_1, \mathbf{r}_2, \dots, \mathbf{r}_m$, then
$$\text{Row}(A)=\text{span}\{\mathbf{r}_1,\mathbf{r}_2,\dots,\mathbf{r}_m\}$$
The row space is important because it captures the independent equations in the matrix system. Its dimension equals the rank of $A$, the same as the dimension of the column space, even though the row space lives in a different vector space.
Example
For
$$A=\begin{bmatrix}1&2\\3&6\end{bmatrix}$$
the second row is $3$ times the first row, so
$$\text{Row}(A)=\text{span}\left\{\begin{bmatrix}1&2\end{bmatrix}\right\}$$
Row operations do not change the row space. In practice, the nonzero rows of the reduced row echelon form give a basis for the row space.
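This can be sketched in SymPy (an assumed tool, not one the lesson prescribes): after `rref()`, keep only the rows that are not entirely zero.

```python
# Read off a basis for Row(A) from the nonzero rows of rref(A).
from sympy import Matrix

A = Matrix([[1, 2], [3, 6]])
rref_form, pivots = A.rref()

# Keep only the nonzero rows of the reduced form.
row_basis = [rref_form.row(i) for i in range(rref_form.rows)
             if not rref_form.row(i).is_zero_matrix]
print(row_basis)   # a single row vector spans Row(A)
```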
Why the row space is useful
The row space tells us which equations are independent. If a row is a combination of other rows, it does not add new information. This helps explain why some systems have redundant equations.
The Left Null Space: What Is Orthogonal to the Rows
The left null space of a matrix $A$ is the null space of the transpose $A^T$. It consists of vectors $\mathbf{y}$ such that
$$A^T\mathbf{y}=\mathbf{0}$$
Equivalently, if $A$ is an $m\times n$ matrix, then the left null space is a subspace of $\mathbb{R}^m$.
This space contains vectors orthogonal to every row of $A$. It measures dependencies among the rows, just as the null space measures dependencies among the columns.
Example
For the same matrix
$$A=\begin{bmatrix}1&2\\3&6\end{bmatrix}$$
we have
$$A^T=\begin{bmatrix}1&3\\2&6\end{bmatrix}$$
Solve
$$A^T\mathbf{y}=\mathbf{0}$$
The first equation gives $y_1+3y_2=0$, so $y_1=-3y_2$, and the second equation is redundant. Therefore
$$\text{Nul}(A^T)=\text{span}\left\{\begin{bmatrix}-3\\1\end{bmatrix}\right\}$$
The left null space is one-dimensional: because the rows of $A$ are dependent, there is a nonzero vector orthogonal to both rows.
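A minimal SymPy sketch (assumed tooling) computes the left null space as the null space of the transpose and confirms the orthogonality claim:

```python
# Compute the left null space as Nul(A^T), and check that its basis
# vector is orthogonal to every row of A, i.e. y^T A = 0.
from sympy import Matrix

A = Matrix([[1, 2], [3, 6]])
left_null = A.T.nullspace()

y = left_null[0]
print(y)                                 # direction vector (-3, 1)
assert y.T * A == Matrix([[0, 0]])       # orthogonal to both rows of A
```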
Why it matters
The left null space supplies consistency conditions for linear systems: $A\mathbf{x}=\mathbf{b}$ has a solution exactly when $\mathbf{b}$ is orthogonal to every vector in $\text{Nul}(A^T)$. In this way the left null space connects directly to orthogonality and solvability.
The Big Picture: How the Four Subspaces Fit Together
The four subspaces are not isolated ideas. They form a powerful framework for understanding a matrix.
If $A$ is an $m\times n$ matrix:
- $\text{Col}(A)$ is a subspace of $\mathbb{R}^m$
- $\text{Nul}(A)$ is a subspace of $\mathbb{R}^n$
- $\text{Row}(A)$ is a subspace of $\mathbb{R}^n$
- $\text{Nul}(A^T)$ is a subspace of $\mathbb{R}^m$
Notice something important: the column space and left null space live in $\mathbb{R}^m$, while the row space and null space live in $\mathbb{R}^n$. These pairs are connected by orthogonality.
A major fact called the Fundamental Theorem of Linear Algebra ties these spaces together. It says, in part, that:
- $\text{Col}(A)$ is orthogonal to $\text{Nul}(A^T)$
- $\text{Row}(A)$ is orthogonal to $\text{Nul}(A)$
This means each space has a partner that points in a perpendicular direction. That orthogonal structure helps explain rank, dimension, and solvability.
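The orthogonality pairings can be checked numerically for the running example. This sketch (not from the lesson) uses the basis vectors found earlier for the $2\times 2$ matrix:

```python
# Verify the two orthogonality pairings for A = [[1, 2], [3, 6]].
from sympy import Matrix

row = Matrix([[1, 2]])        # spans Row(A)
null_vec = Matrix([-2, 1])    # spans Nul(A)
col = Matrix([1, 3])          # spans Col(A)
left_null = Matrix([-3, 1])   # spans Nul(A^T)

print(row * null_vec)         # dot product is 0: Row(A) is orthogonal to Nul(A)
print(col.T * left_null)      # dot product is 0: Col(A) is orthogonal to Nul(A^T)
```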
The rank of $A$ equals the dimension of the column space, the dimension of the row space, and the number of pivot columns:
$$\text{rank}(A)=\dim\big(\text{Col}(A)\big)=\dim\big(\text{Row}(A)\big)$$
The rank and nullity are linked by the rank-nullity theorem:
$$\text{rank}(A)+\text{nullity}(A)=n$$
where $n$ is the number of columns of $A$.
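The rank-nullity theorem can be verified directly for the lesson's example, again with SymPy as an assumed tool:

```python
# Verify rank(A) + nullity(A) = n for A = [[1, 2], [3, 6]].
from sympy import Matrix

A = Matrix([[1, 2], [3, 6]])

rank = A.rank()
nullity = len(A.nullspace())   # one basis vector per free variable
n = A.cols                     # number of columns of A

print(rank, nullity, n)        # 1 1 2
assert rank + nullity == n
```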
How to Identify These Subspaces in Practice
When given a matrix, here is a useful strategy:
- Row reduce the matrix.
- Find the pivot columns.
- Use the pivot columns of the original matrix to build a basis for $\text{Col}(A)$.
- Use the nonzero rows of the reduced matrix to build a basis for $\text{Row}(A)$.
- Solve $A\mathbf{x}=\mathbf{0}$ to find $\text{Nul}(A)$.
- Solve $A^T\mathbf{y}=\mathbf{0}$ to find $\text{Nul}(A^T)$.
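The strategy above can be carried out end to end in a few lines. This sketch (SymPy is an assumed tool, not one the lesson mandates) produces a basis for each of the four subspaces of the running example:

```python
# Follow the six-step strategy for A = [[1, 2], [3, 6]]:
# row reduce, then build a basis for each of the four subspaces.
from sympy import Matrix

A = Matrix([[1, 2], [3, 6]])
rref_form, pivots = A.rref()

col_basis = [A.col(j) for j in pivots]                 # pivot columns of the ORIGINAL A
row_basis = [rref_form.row(i) for i in range(rref_form.rows)
             if not rref_form.row(i).is_zero_matrix]   # nonzero rows of rref
null_basis = A.nullspace()                             # solve A x = 0
left_null_basis = A.T.nullspace()                      # solve A^T y = 0

for name, basis in [("Col(A)", col_basis), ("Row(A)", row_basis),
                    ("Nul(A)", null_basis), ("Nul(A^T)", left_null_basis)]:
    print(name, [list(v) for v in basis])
```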
Real-world connection
In data science, a matrix may represent relationships between features. The column space shows which data outputs can actually be generated. The null space shows which combinations of inputs produce no change in the output. In engineering, this can describe systems where certain forces cancel out, or where some measurements are redundant 🔍.
Conclusion
Important subspaces of a matrix give a complete picture of how a linear transformation behaves. The column space describes reachable outputs, the null space describes invisible inputs, the row space describes independent equations, and the left null space describes dependencies among rows. Together, these subspaces connect concrete matrix calculations to abstract vector space ideas.
Understanding these spaces helps you see beyond arithmetic and into the structure of linear algebra. Once you can identify them, you can better solve systems, interpret transformations, and understand why matrices work the way they do.
Study Notes
- A subspace is closed under addition and scalar multiplication and contains the zero vector.
- The four important subspaces of a matrix $A$ are $\text{Col}(A)$, $\text{Nul}(A)$, $\text{Row}(A)$, and $\text{Nul}(A^T)$.
- $\text{Col}(A)$ is the span of the columns of $A$.
- $\text{Nul}(A)$ is the set of all vectors satisfying $A\mathbf{x}=\mathbf{0}$.
- $\text{Row}(A)$ is the span of the rows of $A$.
- $\text{Nul}(A^T)$ is the left null space of $A$.
- Pivot columns of the original matrix give a basis for the column space.
- Nonzero rows of the reduced row echelon form give a basis for the row space.
- Solve $A\mathbf{x}=\mathbf{0}$ to find the null space.
- Solve $A^T\mathbf{y}=\mathbf{0}$ to find the left null space.
- The rank of a matrix is the dimension of its column space and row space.
- The rank-nullity theorem is $\text{rank}(A)+\text{nullity}(A)=n$.
- The column space and left null space are orthogonal.
- The row space and null space are orthogonal.
- These subspaces explain whether a system has solutions, unique solutions, or redundant information.
