Row Space, Column Space, and Null Space
Students, when you work with matrices in linear algebra, you are really studying how a matrix acts like a machine 📦. It takes input vectors, transforms them, and produces output vectors. Three of the most important subspaces that describe this action are the row space, the column space, and the null space. These ideas help answer questions like: Which outputs are possible? Which inputs disappear? What information does a matrix keep?
In this lesson, you will learn how to identify these spaces, why they matter, and how they fit into the bigger picture of abstract vector spaces and subspaces.
What these spaces mean
Suppose $A$ is an $m \times n$ matrix. Then $A$ has rows, columns, and a special set of input vectors that get sent to zero.
- The row space of $A$ is the set of all linear combinations of the rows of $A$.
- The column space of $A$ is the set of all linear combinations of the columns of $A$.
- The null space of $A$ is the set of all vectors $x$ such that $Ax=0$.
These are all examples of subspaces. That means they are vector spaces sitting inside a bigger vector space. For instance, the row space is a subspace of $\mathbb{R}^n$, the column space is a subspace of $\mathbb{R}^m$, and the null space is a subspace of $\mathbb{R}^n$.
A good way to think about them is this:
- the row space tells you which directions in the input are important,
- the column space tells you which outputs are possible,
- the null space tells you which inputs get erased.
The row space: information from the rows
The rows of a matrix can be used to build a subspace. If the rows of $A$ are $r_1, r_2, \dots, r_m$, then the row space is
$$\text{Row}(A)=\text{span}\{r_1,r_2,\dots,r_m\}.$$
This means every vector in the row space is a linear combination of the rows.
Why is this useful? The row space contains the same information as the row equations in a linear system. If you solve $Ax=b$, the rows of $A$ give the equations that the variables must satisfy. After row reduction, the nonzero rows of the reduced matrix form a basis for the row space. That means row operations do not change the row space, even though they may change the individual rows.
Example
Let
$$A=\begin{bmatrix}1 & 2 & 3\\ 2 & 4 & 6\\ 1 & 1 & 1\end{bmatrix}.$$
The second row is $2$ times the first row, so it adds no new direction, while the third row is not a multiple of the first. The row space is therefore spanned by the independent rows, such as
$$\{(1,2,3),(1,1,1)\}.$$
So the row space is a subspace of $\mathbb{R}^3$ because each row has three entries.
The dimension of the row space is called the row rank. It equals the number of pivot rows after row reduction, and it matches the rank of the matrix.
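One way to carry out this computation is with SymPy's `rref()`, which returns both the reduced matrix and the pivot positions; a short sketch using the example matrix above:

```python
from sympy import Matrix

# The example matrix from this section
A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])

# rref() returns the reduced row-echelon form and the pivot column indices
R, pivots = A.rref()
print(R)       # the nonzero rows of R form a basis for Row(A)
print(pivots)  # (0, 1): two pivots, so the row rank is 2
```

The rref basis $\{(1,0,-1),(0,1,2)\}$ spans the same row space as $\{(1,2,3),(1,1,1)\}$: row operations change the individual rows but not their span.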
The column space: all possible outputs
The column space is built from the columns of $A$. If the columns are $c_1, c_2, \dots, c_n$, then
$$\text{Col}(A)=\text{span}\{c_1,c_2,\dots,c_n\}.$$
This space matters because if you compute $Ax$, the result is always a linear combination of the columns of $A$. In fact, if $x=(x_1,x_2,\dots,x_n)^T$, then
$$Ax=x_1c_1+x_2c_2+\cdots+x_nc_n.$$
So the column space is the set of every output vector that can be produced by the matrix.
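You can check this identity numerically; the input vector `x` below is an arbitrary choice for illustration:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [2, 4, 6],
              [1, 1, 1]])
x = np.array([2, -1, 3])  # an arbitrary input vector

# A @ x equals the linear combination x1*c1 + x2*c2 + x3*c3 of the columns
combo = x[0] * A[:, 0] + x[1] * A[:, 1] + x[2] * A[:, 2]
print(A @ x)   # [ 9 18  4]
print(combo)   # the same vector
```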
Example
Using the same matrix
$$A=\begin{bmatrix}1 & 2 & 3\\ 2 & 4 & 6\\ 1 & 1 & 1\end{bmatrix},$$
its columns are
$$c_1=\begin{bmatrix}1\\2\\1\end{bmatrix},\quad c_2=\begin{bmatrix}2\\4\\1\end{bmatrix},\quad c_3=\begin{bmatrix}3\\6\\1\end{bmatrix}.$$
Notice that $c_2$ is not a multiple of $c_1$: scaling $c_1$ by $2$ gives $(2,4,2)^T$, whose third entry does not match $c_2$. To find a basis for the column space, we row reduce $A$ and take the corresponding pivot columns of the original matrix. Here the pivots fall in columns $1$ and $2$, so a basis for the column space is
$$\left\{\begin{bmatrix}1\\2\\1\end{bmatrix},\begin{bmatrix}2\\4\\1\end{bmatrix}\right\}.$$
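SymPy can identify the pivot columns for us: `columnspace()` returns a basis drawn from the original columns of the matrix.

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])

# columnspace() returns basis vectors taken from the original columns of A
basis = A.columnspace()
for v in basis:
    print(v.T)  # the pivot columns of A
```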
The dimension of the column space is called the column rank. It tells you how many independent output directions the matrix has.
A very important fact is that a system $Ax=b$ has a solution exactly when $b$ is in the column space of $A$. If $b$ is not in the column space, then the system has no solution.
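One way to test this condition in code is to compare the rank of $A$ with the rank of the augmented matrix $[A\mid b]$: adjoining $b$ raises the rank exactly when $b$ lies outside the column space. The vectors `b_in` and `b_out` below are choices made for illustration:

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])

def solvable(A, b):
    # Ax = b has a solution exactly when b lies in Col(A),
    # i.e. when adjoining b does not increase the rank
    return A.rank() == A.row_join(b).rank()

b_in = Matrix([3, 6, 2])   # c1 + c2, so it is in the column space
b_out = Matrix([1, 0, 0])  # every column has second entry twice the first;
                           # this vector does not, so it is outside Col(A)
print(solvable(A, b_in))   # True
print(solvable(A, b_out))  # False
```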
The null space: inputs that disappear
The null space of $A$ is the set of all vectors $x$ such that
$$Ax=0.$$
This means the matrix sends those vectors to the zero vector. The null space is also called the kernel of the matrix.
If $x$ is in the null space, then $x$ is an input direction that gets completely erased. That makes the null space important for understanding whether a matrix loses information.
Example
Again let
$$A=\begin{bmatrix}1 & 2 & 3\\ 2 & 4 & 6\\ 1 & 1 & 1\end{bmatrix}.$$
To find the null space, solve
$$\begin{bmatrix}1 & 2 & 3\\ 2 & 4 & 6\\ 1 & 1 & 1\end{bmatrix}\begin{bmatrix}x_1\\x_2\\x_3\end{bmatrix}=\begin{bmatrix}0\\0\\0\end{bmatrix}.$$
Row reducing gives the equations
$$x_1+2x_2+3x_3=0,$$
$$-x_2-2x_3=0.$$
From the second equation, $x_2=-2x_3$. Substitute into the first:
$$x_1+2(-2x_3)+3x_3=0,$$
$$x_1-x_3=0,$$
so $x_1=x_3$.
Let $x_3=t$. Then
$$x=\begin{bmatrix}t\\-2t\\t\end{bmatrix}=t\begin{bmatrix}1\\-2\\1\end{bmatrix}.$$
So the null space is
$$\text{Null}(A)=\text{span}\left\{\begin{bmatrix}1\\-2\\1\end{bmatrix}\right\}.$$
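SymPy's `nullspace()` carries out the same solve-and-parameterize computation; a quick check on the example matrix:

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])

# nullspace() returns a basis for the set of solutions of Ax = 0
basis = A.nullspace()
print(basis[0].T)    # the basis vector (1, -2, 1)
print(A * basis[0])  # the zero vector, confirming Ax = 0
```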
The dimension of the null space is called the nullity.
How row space, column space, and null space fit together
These three subspaces are connected by one of the most important relationships in linear algebra:
$$\text{rank}(A)+\text{nullity}(A)=n,$$
where $n$ is the number of columns of $A$.
This is called the Rank-Nullity Theorem. It says that the number of independent directions that survive in the output, plus the number of independent directions that get sent to zero, equals the number of input dimensions.
Another key fact is that the row space and column space have the same dimension:
$$\dim(\text{Row}(A))=\dim(\text{Col}(A)).$$
That common number is the rank of the matrix.
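Both facts can be verified on the running example; a short sketch:

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])

rank = A.rank()               # dimension of Row(A) and Col(A)
nullity = len(A.nullspace())  # dimension of Null(A)

print(rank, nullity, A.cols)                      # 2 1 3
assert rank + nullity == A.cols                   # Rank-Nullity Theorem
assert len(A.rowspace()) == len(A.columnspace())  # row rank = column rank
```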
These relationships are not just algebra tricks. They tell you how much information a matrix keeps and how much it loses. For example, in image compression or data analysis, a matrix with low rank may represent data with repeated patterns. In network models, the null space can reveal hidden dependencies among variables. In geometry, the column space shows the directions that a transformation can reach.
Subspaces in the bigger picture
Students, this lesson fits into the larger topic of abstract vector spaces and subspaces because row space, column space, and null space are all examples of vector spaces that satisfy the subspace rules:
- the zero vector is included,
- adding two vectors in the space keeps you inside the space,
- multiplying by a scalar keeps you inside the space.
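For the null space, these three rules follow from the linearity of matrix multiplication; a small symbolic check on the running example, using arbitrary scalars `s` and `t`:

```python
from sympy import Matrix, symbols

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])

s, t = symbols('s t')
u = s * Matrix([1, -2, 1])  # a scalar multiple of the null-space basis vector
v = t * Matrix([1, -2, 1])

# Zero vector, sums, and scalar multiples all stay in the null space
print(A * Matrix([0, 0, 0]))  # zero maps to zero
print(A * (u + v))            # still the zero vector
print(A * (7 * u))            # still the zero vector
```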
These spaces may look different, but they follow the same structure. That is one of the powerful ideas in linear algebra: many different-looking problems share the same algebraic pattern.
For example:
- the row space lives in $\mathbb{R}^n$ because rows have $n$ entries,
- the column space lives in $\mathbb{R}^m$ because columns have $m$ entries,
- the null space lives in $\mathbb{R}^n$ because its vectors are inputs to the matrix.
When you study these spaces, you are not just solving equations. You are learning how to describe a matrix in terms of the subspaces it creates.
Conclusion
Row space, column space, and null space are three core ideas that reveal what a matrix does. The row space describes the independent row directions, the column space describes all possible outputs, and the null space describes all inputs that disappear to zero. Together, they connect matrix computation with the broader language of vector spaces and subspaces.
If you can identify these spaces, find bases for them, and understand their dimensions, you have a strong foundation for the rest of linear algebra. These ideas will keep showing up in systems of equations, transformations, rank, and many applications in science and technology 🚀.
Study Notes
- The row space of $A$ is the span of its rows and is a subspace of $\mathbb{R}^n$.
- The column space of $A$ is the span of its columns and is a subspace of $\mathbb{R}^m$.
- The null space of $A$ is the set of all vectors $x$ such that $Ax=0$.
- A basis for the row space comes from the nonzero rows of the row-reduced form of $A$.
- A basis for the column space comes from the pivot columns of the original matrix $A$.
- A basis for the null space comes from solving $Ax=0$.
- The dimension of the row space and column space is the rank of $A$.
- The dimension of the null space is the nullity of $A$.
- The Rank-Nullity Theorem is $\text{rank}(A)+\text{nullity}(A)=n$.
- A system $Ax=b$ is solvable exactly when $b$ is in the column space of $A$.
- Row space, column space, and null space are all examples of subspaces, so they connect matrix algebra to abstract vector spaces.
