7. Span, Linear Independence, Basis, and Dimension

Linear Dependence and Independence

Welcome, students! 👋 In this lesson, you will learn one of the most important ideas in linear algebra: how to tell when vectors are dependent or independent. These ideas help us understand whether a set of vectors has duplicate information, how they build a space, and why some vectors can act as a good foundation for all the others.

What You Will Learn

By the end of this lesson, you should be able to:

  • Explain the meaning of linear dependence and linear independence.
  • Test whether vectors are dependent or independent.
  • Understand how these ideas connect to span, basis, and dimension.
  • Use examples and reasoning to decide whether a set of vectors contains extra or unnecessary information.

Think of vectors like tools in a toolbox 🧰. If one tool can do everything another tool can do, then having both may be redundant. Linear dependence is the mathematical version of that idea.

What Does Linear Independence Mean?

A set of vectors is linearly independent if none of the vectors can be written as a linear combination of the others. In simpler words, every vector in the set adds something new.

For vectors $v_1, v_2, \dots, v_n$, the set is linearly independent if the only solution to

$$c_1v_1 + c_2v_2 + \cdots + c_nv_n = \mathbf{0}$$

is

$$c_1 = c_2 = \cdots = c_n = 0.$$

This means you cannot make the zero vector using the vectors unless every coefficient is zero. If the only way to balance the equation is to use no “real” combination at all, the vectors are independent.

Example: Independent Vectors in $\mathbb{R}^2$

Consider

$$v_1 = \begin{bmatrix}1\\0\end{bmatrix}, \quad v_2 = \begin{bmatrix}0\\1\end{bmatrix}.$$

These are the standard basis vectors in the plane. Neither one is a multiple of the other, and neither one can be built from the other alone. So they are linearly independent.

If you combine them as

$$c_1\begin{bmatrix}1\\0\end{bmatrix} + c_2\begin{bmatrix}0\\1\end{bmatrix} = \begin{bmatrix}0\\0\end{bmatrix},$$

you get

$$\begin{bmatrix}c_1\\c_2\end{bmatrix} = \begin{bmatrix}0\\0\end{bmatrix},$$

which forces $c_1 = 0$ and $c_2 = 0$.
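As a quick computational check (a sketch assuming NumPy is available), we can place the vectors as the columns of a matrix and compare its rank to the number of columns; full column rank means the only solution to the homogeneous equation is the trivial one:

```python
import numpy as np

# Columns are the standard basis vectors e1 and e2 of R^2.
A = np.array([[1, 0],
              [0, 1]])

# rank(A) == number of columns  <=>  the columns are linearly independent,
# i.e. A x = 0 forces x = 0.
rank = np.linalg.matrix_rank(A)
print(rank == A.shape[1])  # True
```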

What Does Linear Dependence Mean?

A set of vectors is linearly dependent if at least one vector in the set can be written as a linear combination of the others.

This is the same as saying there is a nontrivial solution to

$$c_1v_1 + c_2v_2 + \cdots + c_nv_n = \mathbf{0},$$

where at least one coefficient is not zero.

That nonzero coefficient shows that one vector is not adding anything completely new. It can be “built” from the others.

Example: Dependent Vectors

Let

$$v_1 = \begin{bmatrix}1\\2\end{bmatrix}, \quad v_2 = \begin{bmatrix}2\\4\end{bmatrix}.$$

Notice that

$$v_2 = 2v_1.$$

So $v_2$ is just a scaled version of $v_1$. These vectors point in the same direction, which means they do not provide two independent directions in the plane. The set is linearly dependent.

We can also show this by finding a nontrivial combination that gives the zero vector:

$$2v_1 - v_2 = \mathbf{0}.$$

Since the coefficients $2$ and $-1$ are not both zero, the set is dependent.
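The same rank check (again a sketch using NumPy) exposes the dependence numerically: since $v_2 = 2v_1$, the matrix with these vectors as columns has rank $1$, fewer than its two columns:

```python
import numpy as np

# v2 = 2 * v1, so the columns of A are linearly dependent.
A = np.array([[1, 2],
              [2, 4]])

rank = np.linalg.matrix_rank(A)
print(rank)               # 1: only one independent direction
print(rank < A.shape[1])  # True: a nontrivial solution to A x = 0 exists
```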

How to Test for Dependence or Independence

There are a few common ways to decide whether vectors are dependent or independent. The best method depends on the problem.

1. Look for a Simple Relationship

If one vector is a multiple of another, the set is dependent.

For example,

$$\begin{bmatrix}3\\6\end{bmatrix} = 3\begin{bmatrix}1\\2\end{bmatrix}.$$

So these two vectors are dependent.

2. Solve a Vector Equation

Set up the equation

$$c_1v_1 + c_2v_2 + \cdots + c_nv_n = \mathbf{0}$$

and see whether the only solution is the trivial one.

For vectors in matrix form, place them as columns in a matrix and solve the homogeneous system

$$A\mathbf{x} = \mathbf{0}.$$

If the only solution is $\mathbf{x} = \mathbf{0}$, the columns are independent. If there is a free variable or a nontrivial solution, they are dependent.
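This test can be wrapped in a small helper. The sketch below (the function name `is_independent` is our own choice, not a library routine) stacks the vectors as columns and checks whether the rank equals the number of vectors:

```python
import numpy as np

def is_independent(*vectors):
    """Return True if the given vectors are linearly independent.

    The vectors become the columns of a matrix A; they are independent
    exactly when rank(A) equals the number of columns, i.e. when
    A x = 0 has only the trivial solution x = 0.
    """
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

print(is_independent([1, 0], [0, 1]))  # True
print(is_independent([1, 2], [2, 4]))  # False: second is twice the first
```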

3. Use Geometry in Low Dimensions

In $\mathbb{R}^2$, two vectors are independent if they are not on the same line through the origin.

In $\mathbb{R}^3$, three vectors are independent if they do not all lie in the same plane through the origin.

This geometric view is helpful because it gives a visual picture of “new directions.” 🌍

Why Dependence Matters

Linear dependence tells us that a vector is not needed as an extra building block. This matters because linear algebra is often about finding the smallest set of vectors that still captures everything we need.

Real-World Example

Imagine you are giving directions in a city. If one street route can be described using a combination of other routes, then it does not give new navigation information. In the same way, a dependent vector does not add a new direction to the space.

Another Example: Repeated Information

Suppose data from one sensor is always exactly twice the data from another sensor. Then one sensor is redundant for the purpose of describing the system. In vector language, one measurement vector depends on the other.

Connection to Span

The span of a set of vectors is the set of all linear combinations you can make from them.

For vectors $v_1, v_2, \dots, v_n$, the span is

$$\text{Span}\{v_1, v_2, \dots, v_n\} = \{c_1v_1 + c_2v_2 + \cdots + c_nv_n : c_1, c_2, \dots, c_n \in \mathbb{R}\}.$$

Linear dependence and span are closely connected.

  • If vectors are dependent, then at least one vector is unnecessary for the span.
  • If vectors are independent, each vector contributes a new direction to the span.

For example, in $\mathbb{R}^2$, the vectors

$$\begin{bmatrix}1\\0\end{bmatrix} \quad \text{and} \quad \begin{bmatrix}0\\1\end{bmatrix}$$

span all of $\mathbb{R}^2$ and are independent. But if you add

$$\begin{bmatrix}1\\1\end{bmatrix},$$

then the new vector is already in the span of the first two, so the three-vector set becomes dependent.
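We can confirm this with the rank check from before (a NumPy sketch): three vectors in $\mathbb{R}^2$ can have rank at most $2$, so the three-column matrix cannot have full column rank:

```python
import numpy as np

# Three vectors in R^2: the first two already span the plane,
# so the third must be a combination of them.
A = np.array([[1, 0, 1],
              [0, 1, 1]])

rank = np.linalg.matrix_rank(A)
print(rank)               # 2: the columns span all of R^2
print(rank < A.shape[1])  # True: the three-vector set is dependent
```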

Connection to Basis and Dimension

A basis is a set of vectors that is both:

  • linearly independent, and
  • spans the whole space.

So a basis is a “just right” set of vectors: no duplicates, no missing directions.

For example, in $\mathbb{R}^2$, the set

$$\left\{\begin{bmatrix}1\\0\end{bmatrix}, \begin{bmatrix}0\\1\end{bmatrix}\right\}$$

is a basis because it is independent and spans the entire plane.

The dimension of a vector space is the number of vectors in any basis for that space. In $\mathbb{R}^2$, the dimension is $2$. In $\mathbb{R}^3$, the dimension is $3$.

This tells us something powerful: if a set has more vectors than the dimension of the space, it must be dependent. For example, any set of $4$ vectors in $\mathbb{R}^3$ is automatically dependent because $\mathbb{R}^3$ only has dimension $3$.
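A quick numerical illustration of this fact (a sketch assuming NumPy): no matter which four vectors in $\mathbb{R}^3$ we pick, the rank of the $3 \times 4$ matrix they form can never reach $4$:

```python
import numpy as np

# Four random vectors in R^3, placed as the columns of a 3x4 matrix.
# The rank is at most 3 (the number of rows), so the four columns
# can never be independent.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))

print(np.linalg.matrix_rank(A) < A.shape[1])  # True, for any choice
```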

Common Mistakes to Avoid

A few ideas often cause confusion:

  • A vector being nonzero does not automatically make a whole set independent.
  • Vectors can be dependent even if none of them is the zero vector.
  • A set can span a space and still be dependent.
  • A set can be independent but not span the whole space.

For example, in $\mathbb{R}^3$, the vectors

$$\begin{bmatrix}1\\0\\0\end{bmatrix}, \quad \begin{bmatrix}0\\1\\0\end{bmatrix}$$

are independent, but they do not span all of $\mathbb{R}^3$ because they cannot produce vectors with a nonzero third component.
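The rank check (a NumPy sketch) distinguishes the two properties cleanly: independence compares the rank to the number of vectors, while spanning compares it to the dimension of the space:

```python
import numpy as np

# e1 and e2 inside R^3, as columns of a 3x2 matrix.
A = np.array([[1, 0],
              [0, 1],
              [0, 0]])

rank = np.linalg.matrix_rank(A)
print(rank == A.shape[1])  # True: the two vectors are independent
print(rank == A.shape[0])  # False: rank 2 < 3, so they do not span R^3
```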

Quick Summary Example

Let

$$v_1 = \begin{bmatrix}1\\0\\1\end{bmatrix}, \quad v_2 = \begin{bmatrix}2\\1\\3\end{bmatrix}, \quad v_3 = \begin{bmatrix}3\\1\\4\end{bmatrix}.$$

Check whether $v_3$ can be made from $v_1$ and $v_2$:

$$v_1 + v_2 = \begin{bmatrix}1\\0\\1\end{bmatrix} + \begin{bmatrix}2\\1\\3\end{bmatrix} = \begin{bmatrix}3\\1\\4\end{bmatrix} = v_3.$$

So

$$v_3 = v_1 + v_2,$$

which means the set is linearly dependent. One vector is redundant.
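We can recover the coefficients numerically as well. The sketch below (assuming NumPy) solves $[\,v_1\ v_2\,]\,c = v_3$ by least squares; an exact fit with zero residual means $v_3$ lies in the span of $v_1$ and $v_2$:

```python
import numpy as np

v1 = np.array([1, 0, 1])
v2 = np.array([2, 1, 3])
v3 = np.array([3, 1, 4])

# Solve [v1 v2] c = v3 in the least-squares sense; an exact solution
# means v3 is a linear combination of v1 and v2.
A = np.column_stack([v1, v2])
c, *_ = np.linalg.lstsq(A, v3, rcond=None)

print(np.round(c, 6))          # [1. 1.]  =>  v3 = 1*v1 + 1*v2
print(np.allclose(A @ c, v3))  # True: the set is dependent
```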

Conclusion

Students, linear independence means every vector in a set contributes something new, while linear dependence means at least one vector can be built from the others. These ideas help us understand when vectors are useful building blocks and when some are unnecessary.

They also connect directly to the big ideas of linear algebra: the span of a set, the structure of a basis, and the dimension of a space. If you can tell whether a set is dependent or independent, you are already thinking like a linear algebra expert 🧠✨

Study Notes

  • A set of vectors is linearly independent if the equation $c_1v_1 + c_2v_2 + \cdots + c_nv_n = \mathbf{0}$ has only the trivial solution $c_1 = c_2 = \cdots = c_n = 0$.
  • A set of vectors is linearly dependent if there is a nontrivial solution to $c_1v_1 + c_2v_2 + \cdots + c_nv_n = \mathbf{0}$.
  • If one vector is a multiple of another, the set is dependent.
  • To test independence, solve the homogeneous system $A\mathbf{x} = \mathbf{0}$ using the vectors as columns of $A$.
  • Independent vectors add new directions; dependent vectors do not.
  • The span of vectors is all linear combinations of them.
  • A basis is a set that is both independent and spans the space.
  • The dimension of a space is the number of vectors in a basis.
  • In $\mathbb{R}^n$, any set with more than $n$ vectors must be dependent.
  • A set can span a space and still be dependent, or be independent and still fail to span the whole space.

Practice Quiz

5 questions to test your understanding