3. Linear Algebra

Linear Transformations

Study linear maps between vector spaces, kernel and image, matrix representation, change of basis, and rank-nullity theorem.

Hey students! 👋 Today we're diving into one of the most fundamental concepts in linear algebra: linear transformations. Think of these as mathematical "machines" that take vectors from one space and transform them into vectors in another space, all while preserving the essential structure of addition and scalar multiplication. By the end of this lesson, you'll understand how these transformations work, how to represent them with matrices, and discover the powerful rank-nullity theorem that connects everything together. This knowledge forms the backbone of computer graphics, data science, and engineering applications! 🚀

What Are Linear Transformations?

A linear transformation (also called a linear map) is a special type of function between vector spaces that preserves the operations of vector addition and scalar multiplication. If we have vector spaces V and W, a function T: V → W is linear if it satisfies two key properties:

  1. Additivity: T(u + v) = T(u) + T(v) for all vectors u, v in V
  2. Homogeneity: T(cu) = cT(u) for all vectors u in V and scalars c

These might seem like abstract conditions, but they're incredibly powerful! They mean that linear transformations preserve the "linear structure" of vector spaces.
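To make the two conditions concrete, here is a minimal sketch in Python with NumPy. The map T(x, y) = (2x, x + y) and the sample vectors are illustrative choices, not taken from the lesson; the point is simply to check additivity and homogeneity numerically.

```python
import numpy as np

# Illustrative linear map T(x, y) = (2x, x + y)
def T(v):
    x, y = v
    return np.array([2 * x, x + y])

u = np.array([1.0, 3.0])
v = np.array([-2.0, 5.0])
c = 4.0

print(np.allclose(T(u + v), T(u) + T(v)))  # additivity  -> True
print(np.allclose(T(c * u), c * T(u)))     # homogeneity -> True
```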

Let's look at a real-world example 🌟. Imagine you're working with computer graphics and want to rotate all points in a 2D image by 30 degrees counterclockwise around the origin. This rotation is a linear transformation because:

  • If you rotate two points and then add their coordinates, you get the same result as first adding the coordinates and then rotating
  • If you scale a point by some factor and then rotate, it's the same as rotating first and then scaling

The rotation transformation can be represented by the matrix:

$$T = \begin{pmatrix} \cos(30°) & -\sin(30°) \\ \sin(30°) & \cos(30°) \end{pmatrix} = \begin{pmatrix} \frac{\sqrt{3}}{2} & -\frac{1}{2} \\ \frac{1}{2} & \frac{\sqrt{3}}{2} \end{pmatrix}$$
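As a quick sketch, the matrix above can be built in NumPy and the two bullet points verified numerically; the sample points are arbitrary choices for illustration.

```python
import numpy as np

# 30° counterclockwise rotation matrix, as in the formula above
theta = np.radians(30)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

p = np.array([2.0, 1.0])   # arbitrary sample points
q = np.array([-1.0, 3.0])

# Rotating the sum equals summing the rotations (additivity),
# and scaling commutes with rotating (homogeneity).
print(np.allclose(R @ (p + q), R @ p + R @ q))    # True
print(np.allclose(R @ (2.5 * p), 2.5 * (R @ p)))  # True
```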

Matrix Representation of Linear Transformations

Here's where things get really exciting! Every linear transformation between finite-dimensional vector spaces can be represented by a matrix. This is like having a universal translator that converts abstract transformations into concrete numerical operations.

If T: ℝⁿ → ℝᵐ is a linear transformation, then there exists an m × n matrix A such that T(x) = Ax for every vector x in ℝⁿ. The columns of this matrix A are simply the images of the standard basis vectors under the transformation T.

For example, consider a linear transformation T: ℝ² → ℝ³ where:

  • T(1, 0) = (2, -1, 3)
  • T(0, 1) = (0, 4, -2)

The matrix representation is:

$$A = \begin{pmatrix} 2 & 0 \\ -1 & 4 \\ 3 & -2 \end{pmatrix}$$

This matrix representation is incredibly useful because it allows us to compute T(x) for any vector x by simple matrix multiplication! 💡
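A minimal NumPy sketch of this example: the columns of A are exactly the images of the standard basis vectors given above, and the input vector x is an arbitrary illustrative choice.

```python
import numpy as np

# Columns are T(1,0) = (2, -1, 3) and T(0,1) = (0, 4, -2)
A = np.column_stack([[2, -1, 3], [0, 4, -2]])   # shape (3, 2): maps R^2 -> R^3

x = np.array([3, -1])   # arbitrary input vector
print(A @ x)            # T(x) = 3*T(1,0) - 1*T(0,1) = [ 6 -7 11]
```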

Kernel and Image: The Heart of Linear Transformations

Two fundamental concepts help us understand what a linear transformation does: the kernel and the image.

The kernel (also called null space) of a linear transformation T: V → W is the set of all vectors in V that get mapped to the zero vector in W:

$$\ker(T) = \{v \in V : T(v) = 0_W\}$$

The image (also called range) of T is the set of all possible outputs:

$$\text{im}(T) = \{T(v) : v \in V\} = \{w \in W : w = T(v) \text{ for some } v \in V\}$$

Think of the kernel as the "blind spot" of the transformation - vectors that completely disappear. The image represents all the places the transformation can reach. In our graphics example, if we had a transformation that projected 3D objects onto a 2D screen, the kernel would contain vectors pointing directly toward or away from the screen (they collapse to the origin of the screen), while the image would be the entire 2D screen space.

Both the kernel and image are subspaces of their respective vector spaces, which means they're closed under addition and scalar multiplication. This structural property makes them incredibly important for understanding the behavior of linear transformations.
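As a sketch of how you might compute these subspaces numerically, the singular value decomposition gives orthonormal bases for both the image and the kernel of a matrix; the matrix below is an illustrative rank-1 example, not one from the lesson.

```python
import numpy as np

# Illustrative map R^3 -> R^2; the second row is twice the first, so rank 1
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

U, s, Vt = np.linalg.svd(A)
tol = max(A.shape) * np.finfo(float).eps * s[0]
rank = int(np.sum(s > tol))

image_basis = U[:, :rank]     # orthonormal basis for im(A), a subspace of R^2
kernel_basis = Vt[rank:].T    # orthonormal basis for ker(A), a subspace of R^3

print(image_basis.shape[1], kernel_basis.shape[1])  # dim(im) = 1, dim(ker) = 2
```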

Change of Basis and Coordinate Systems

Sometimes we want to represent the same linear transformation using different coordinate systems. This is like describing the same physical transformation from different perspectives - the transformation itself doesn't change, but its matrix representation does.

If we have a linear transformation T: V → W and we choose different bases for V and W, we get different matrix representations. The process of converting between these representations is called a change of basis.

Suppose we have bases B₁ = {v₁, v₂, ..., vₙ} for V and B₂ = {w₁, w₂, ..., wₘ} for W. If [T]_{B₂,B₁} represents the matrix of T with respect to these bases, and we want to change to new bases B₁' and B₂', then:

$$[T]_{B_2',B_1'} = P_2^{-1}\,[T]_{B_2,B_1}\,P_1$$

where P₁ and P₂ are the change of basis matrices.
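Here is a small sketch of the formula for a map T: ℝ² → ℝ²; the matrix A and the change of basis matrices P₁, P₂ (whose columns are the new basis vectors written in the old coordinates) are illustrative choices, not from the lesson.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])            # [T] in the standard bases

P1 = np.array([[1.0, 1.0],            # new domain basis, columns in old coordinates
               [0.0, 1.0]])
P2 = np.array([[1.0, 0.0],            # new codomain basis, columns in old coordinates
               [1.0, 1.0]])

A_new = np.linalg.inv(P2) @ A @ P1    # [T] in the new bases

# Sanity check: applying T in new coordinates agrees with applying it in old ones.
x_new = np.array([1.0, 2.0])          # coordinates of a vector in the new domain basis
lhs = P2 @ (A_new @ x_new)            # new-basis computation, converted back to old coords
rhs = A @ (P1 @ x_new)                # convert input to old coords, then apply T
print(np.allclose(lhs, rhs))          # True
```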

This concept is crucial in applications like principal component analysis (PCA) in data science, where we transform data to new coordinate systems that reveal hidden patterns! 📊

The Rank-Nullity Theorem: A Beautiful Connection

Now we arrive at one of the most elegant results in linear algebra: the rank-nullity theorem. This theorem creates a beautiful relationship between the dimensions of the kernel and image.

For a linear transformation T: V → W where V is finite-dimensional:

$$\dim(V) = \dim(\ker(T)) + \dim(\text{im}(T))$$

The rank of T is defined as dim(im(T)), and the nullity of T is dim(ker(T)). So we can write:

$$\dim(V) = \text{nullity}(T) + \text{rank}(T)$$

This theorem tells us that the dimension of the domain is always split between the kernel and the image - no dimension is ever "lost," just redistributed!

For example, if we have a linear transformation from ℝ⁵ to ℝ³ with a 2-dimensional kernel, then the image must be 3-dimensional (since 5 = 2 + 3). This means the transformation is surjective (onto) because the image fills the entire codomain ℝ³.
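A quick sketch of this count for a map from ℝ⁵ to ℝ³: a random matrix (illustrative only) almost surely has full rank 3, so its nullity is 2 and the two dimensions add up to the dimension of the domain.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))        # a linear map R^5 -> R^3

rank = np.linalg.matrix_rank(A)
nullity = A.shape[1] - rank            # dimension of the null space

print(rank, nullity, rank + nullity)   # typically 3 2 5, matching dim(R^5) = 5
```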

The rank-nullity theorem has profound implications:

  • A transformation is injective (one-to-one) if and only if its kernel contains only the zero vector
  • A transformation between finite-dimensional spaces of the same dimension is injective if and only if it is surjective, so either property alone guarantees bijectivity
  • The rank of a matrix equals the dimension of its column space and also equals the dimension of its row space
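As a brief numerical sketch of the last two implications (the matrices below are illustrative): the rank of a matrix equals the rank of its transpose, and a square matrix with nullity 0 is invertible.

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 4.0]])

# Column rank equals row rank.
print(np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T))    # True

# A square map with trivial kernel (nullity 0) is bijective, i.e. invertible.
B = np.array([[2.0, 1.0],
              [1.0, 1.0]])
nullity_B = B.shape[1] - np.linalg.matrix_rank(B)
print(nullity_B == 0, abs(np.linalg.det(B)) > 1e-12)             # True True
```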

Conclusion

Linear transformations are the fundamental building blocks of linear algebra, providing a framework for understanding how vector spaces relate to each other. We've seen how every linear transformation can be represented by a matrix, making abstract concepts concrete and computable. The kernel and image give us insight into what the transformation does to the structure of vector spaces, while the rank-nullity theorem provides a beautiful relationship that connects dimensions across the transformation. Whether you're rotating graphics, analyzing data, or solving systems of equations, linear transformations provide the mathematical foundation that makes it all possible! 🎯

Study Notes

• Linear Transformation Definition: A function T: V → W that preserves addition and scalar multiplication: T(u + v) = T(u) + T(v) and T(cu) = cT(u)

• Matrix Representation: Every linear transformation T: ℝⁿ → ℝᵐ can be represented as T(x) = Ax where A is an m × n matrix

• Kernel Formula: $\ker(T) = \{v \in V : T(v) = 0_W\}$ - the set of vectors that map to zero

• Image Formula: $\text{im}(T) = \{T(v) : v \in V\}$ - the set of all possible outputs

• Rank-Nullity Theorem: $\dim(V) = \dim(\ker(T)) + \dim(\text{im}(T))$ or equivalently $\dim(V) = \text{nullity}(T) + \text{rank}(T)$

• Rank: The dimension of the image, rank(T) = dim(im(T))

• Nullity: The dimension of the kernel, nullity(T) = dim(ker(T))

• Injectivity Condition: T is injective (one-to-one) ⟺ ker(T) = {0} ⟺ nullity(T) = 0

• Change of Basis Formula: $[T]_{B_2',B_1'} = P_2^{-1}[T]_{B_2,B_1}P_1$ where P₁, P₂ are change of basis matrices

• Matrix Columns: The columns of the matrix representation are T applied to the standard basis vectors

