15. Symmetric Matrices, Spectral Ideas, and Applications

Intro to Singular Values (Optional Extension)

Intro to Singular Values: A Powerful Look at Matrices 📊

Students, in this lesson you will meet a tool from linear algebra that helps us understand matrices more deeply than just looking at rows and columns. Singular values tell us how a matrix stretches space, and they show up in image compression, data science, engineering, and more. By the end, you should be able to explain what singular values are, how they connect to symmetric matrices and eigenvalues, and why they matter in real life.

What are singular values? 🌟

A matrix can act like a machine that transforms vectors. Some directions get stretched a lot, some get shrunk, and some may even flip. Singular values measure the amount of stretching a matrix does.

For any real matrix $A$, the singular values are the square roots of the eigenvalues of $A^T A$. These values are always nonnegative. If the singular values of $A$ are $\sigma_1, \sigma_2, \dots, \sigma_r$, then we usually list them in descending order:

$$
\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_r \ge 0.
$$

Why use $A^T A$? Because $A^T A$ is symmetric, and symmetric matrices have especially nice eigenvalue properties. This links singular values directly to the bigger topic of symmetric matrices and spectral ideas.

A simple interpretation

Imagine standing on a rubber sheet with arrows drawn on it. If a matrix transforms the sheet, some arrows become longer or shorter. Singular values tell you the sizes of the main stretching effects. The largest singular value gives the maximum stretching factor of the matrix.

For example, if the largest singular value of $A$ is $5$, then there is some direction that gets stretched by a factor of $5$. Another direction might be left unchanged (a factor of $1$), and another shrunk to $0.2$ of its length. This gives a clear picture of how strong or weak the transformation is.
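We can check this interpretation numerically. The sketch below uses a hypothetical diagonal matrix with stretching factors $5$ and $0.2$, and confirms that no vector is stretched by more than the largest singular value:

```python
import numpy as np

# Hypothetical matrix for illustration: stretches x by 5 and y by 0.2.
A = np.array([[5.0, 0.0],
              [0.0, 0.2]])

# The largest singular value is the maximum stretching factor.
sigma_max = np.linalg.svd(A, compute_uv=False)[0]

# Measure the stretch ||A v|| / ||v|| for many random directions v.
rng = np.random.default_rng(0)
vs = rng.normal(size=(1000, 2))
stretches = np.linalg.norm(vs @ A.T, axis=1) / np.linalg.norm(vs, axis=1)

print(sigma_max)                             # ~5.0
print(bool(stretches.max() <= sigma_max + 1e-9))  # True: nothing exceeds it
```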

Why symmetric matrices matter here 🔍

This lesson fits naturally into the study of symmetric matrices because $A^T A$ is always symmetric. That means we can use the spectral theorem, which says a real symmetric matrix has real eigenvalues and orthonormal eigenvectors.

That is important because the eigenvalues of $A^T A$ are guaranteed to be real and nonnegative. In fact, if $\lambda$ is an eigenvalue of $A^T A$ with eigenvector $v$, then $\lambda \|v\|^2 = v^T A^T A v = \|Av\|^2 \ge 0$, so

$$
\lambda \ge 0.
$$

So the singular values of $A$ are well-defined by

$$
\sigma_i = \sqrt{\lambda_i},
$$

where $\lambda_i$ are the eigenvalues of $A^T A$.

This is one of the major bridges between a general matrix and symmetric matrices. Even if $A$ itself is not symmetric, the matrix $A^T A$ is symmetric and reveals important information about $A$.

Key connection to eigenvalues

If $A$ is itself symmetric and positive semidefinite, then its singular values match its eigenvalues. More generally, for a symmetric matrix $A$, the singular values are the absolute values of its eigenvalues. That means singular values focus on size, while eigenvalues can also carry sign.

For example, if a symmetric matrix has eigenvalues $3$ and $-2$, then its singular values are $3$ and $2$.

This helps explain why singular values are useful: they measure magnitude of action even when signs change.
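A quick numerical check of this fact, using a hypothetical symmetric matrix with eigenvalues $3$ and $-2$ (diagonal, to keep the example simple):

```python
import numpy as np

# Symmetric matrix with eigenvalues 3 and -2.
S = np.diag([3.0, -2.0])

eigvals = np.linalg.eigvalsh(S)                 # real eigenvalues, ascending
singvals = np.linalg.svd(S, compute_uv=False)   # nonnegative, descending

# The singular values are the absolute values of the eigenvalues.
print(sorted(np.abs(eigvals), reverse=True))    # [3.0, 2.0]
print(singvals)                                 # [3. 2.]
```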

How singular values are found 🧮

To find the singular values of a matrix $A$:

  1. Compute $A^T A$.
  2. Find the eigenvalues of $A^T A$.
  3. Take the square roots of those eigenvalues.
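The three steps above can be sketched directly in code. This is a minimal implementation, assuming NumPy; the library routine `np.linalg.svd` is used only as a cross-check:

```python
import numpy as np

def singular_values(A):
    """Singular values of A via the eigenvalues of A^T A."""
    lam = np.linalg.eigvalsh(A.T @ A)   # step 1 and 2: eigenvalues of A^T A
    lam = np.clip(lam, 0.0, None)       # clamp tiny negative rounding noise
    return np.sqrt(lam)[::-1]           # step 3: square roots, descending

A = np.array([[3.0, 0.0],
              [0.0, 1.0]])
print(singular_values(A))                   # [3. 1.]
print(np.linalg.svd(A, compute_uv=False))   # [3. 1.] (library agrees)
```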

Let’s look at a small example.

Suppose

$$
A = \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix}.
$$

Then

$$
A^T A = \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix}^T \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 9 & 0 \\ 0 & 1 \end{pmatrix}.
$$

The eigenvalues of $A^T A$ are $9$ and $1$, so the singular values are

$$
\sigma_1 = 3, \quad \sigma_2 = 1.
$$

That makes sense because the matrix stretches the $x$-direction by $3$ and the $y$-direction by $1$.

A slightly less obvious example

Suppose

$$
A = \begin{pmatrix} 1 & 2 \\ 0 & 1 \end{pmatrix}.
$$

Then

$$
A^T A = \begin{pmatrix} 1 & 0 \\ 2 & 1 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 2 \\ 2 & 5 \end{pmatrix}.
$$

The eigenvalues of this symmetric matrix are the solutions of

$$
\det\left(\begin{pmatrix} 1 & 2 \\ 2 & 5 \end{pmatrix} - \lambda I\right) = 0.
$$

That gives

$$
(1-\lambda)(5-\lambda)-4=0,
$$

so

$$
\lambda^2 - 6\lambda + 1 = 0.
$$

The eigenvalues are

$$
\lambda = 3 \pm 2\sqrt{2}.
$$

Therefore the singular values are

$$
\sigma_1 = \sqrt{3+2\sqrt{2}} = \sqrt{2}+1, \quad \sigma_2 = \sqrt{3-2\sqrt{2}} = \sqrt{2}-1.
$$

Even though this matrix is not symmetric, singular values still give a clean picture of how it stretches space.
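A short numerical check confirms the hand computation for this shear matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0]])

sing = np.linalg.svd(A, compute_uv=False)
expected = np.sqrt([3 + 2*np.sqrt(2), 3 - 2*np.sqrt(2)])

print(np.allclose(sing, expected))  # True
```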

What singular values tell us in real life 📷

Singular values are not just a theory topic. They are used in many applications.

1. Image compression

Digital images can be stored as matrices of pixel values. Such a matrix can often be approximated well using only its largest few singular values. This reduces storage while keeping the image looking similar.

If the smaller singular values are close to $0$, they contribute less to the overall picture. Keeping only the largest singular values can make compression efficient.
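A minimal sketch of this idea, using a synthetic "image" built from smooth patterns (a stand-in, not a real photo) so that most of its energy sits in the largest singular values:

```python
import numpy as np

# Stand-in "image": a 64x64 matrix built from two smooth patterns,
# so nearly all of its energy lies in the top singular values.
x = np.linspace(0, 1, 64)
img = np.outer(np.sin(2*np.pi*x), np.cos(2*np.pi*x)) + 0.5*np.outer(x, x)

U, s, Vt = np.linalg.svd(img, full_matrices=False)

k = 2  # keep only the k largest singular values
approx = (U[:, :k] * s[:k]) @ Vt[:k, :]

# Relative error of the compressed version.
err = np.linalg.norm(img - approx) / np.linalg.norm(img)
print(err)  # tiny: 2 of 64 singular values capture the picture
```

Storing $U[:, :k]$, $s[:k]$, and $Vt[:k, :]$ takes far fewer numbers than the full matrix, which is the essence of SVD-based compression.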

2. Data analysis

In data science, singular values help reveal which directions in the data matter most. Large singular values often correspond to strong patterns in the data. This is useful when looking for trends in survey results, test scores, or measurements from experiments.

3. Stability and error checking

The size of the smallest singular value can show whether a system of equations is sensitive to small changes. If the smallest singular value is near $0$, the matrix is close to being non-invertible, and small errors in data can cause big changes in the answer.

This is important in engineering and computing, where measurements are never perfect.
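The sketch below builds a hypothetical matrix with two nearly parallel columns and inspects its smallest singular value and the ratio $\sigma_1/\sigma_{\min}$ (the condition number):

```python
import numpy as np

# Two nearly parallel columns make this matrix close to singular.
A = np.array([[1.0, 1.0],
              [1.0, 1.0 + 1e-8]])

s = np.linalg.svd(A, compute_uv=False)

print(s[-1])         # smallest singular value: very close to 0
print(s[0] / s[-1])  # condition number: huge, so solutions of Ax = b
                     # are very sensitive to small errors in b
```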

Singular values and matrix rank 📐

The number of nonzero singular values equals the rank of the matrix. This is a very useful fact.

If a matrix has rank $r$, then it has exactly $r$ nonzero singular values.

For example, if a $3\times 3$ matrix has only two nonzero singular values, then its rank is $2$.

This helps connect singular values to another major idea in linear algebra: how much independent information a matrix contains.

If all singular values are positive, the matrix has full rank. If one or more singular values are $0$, the matrix loses dimension in some direction.
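Counting nonzero singular values (up to a small tolerance for rounding) recovers the rank. A sketch with a hypothetical $3\times 3$ matrix whose third row is the sum of the first two, so its rank is $2$:

```python
import numpy as np

# Third row = first row + second row, so the rank is 2.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 3.0]])

s = np.linalg.svd(A, compute_uv=False)
tol = 1e-10
rank_from_sv = int(np.sum(s > tol))  # count singular values above tolerance

print(rank_from_sv)              # 2
print(np.linalg.matrix_rank(A))  # 2 (library answer agrees)
```

In floating-point arithmetic a singular value that should be exactly $0$ usually comes out as a tiny nonzero number, which is why the tolerance is needed.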

The big picture: why this belongs with spectral ideas 🧠

Spectral ideas in linear algebra focus on eigenvalues, eigenvectors, and how matrices can be understood through special directions. Singular values extend that idea to all matrices, not just symmetric ones.

The reason this works is that the matrix $A^T A$ is symmetric, so the spectral theorem applies. From there, we get nonnegative eigenvalues, and their square roots become the singular values of $A$.

This is a strong example of how one idea in linear algebra supports another. Symmetric matrices give us a reliable path to understanding more general matrices.

A later topic often built on this is the singular value decomposition, or SVD, which writes a matrix in a form involving orthogonal matrices and a diagonal matrix of singular values. Even without going deep into SVD, you can already see the value of singular values as a tool for measuring matrix action.

Conclusion ✅

Students, singular values help us measure how a matrix stretches space. They are found from the eigenvalues of $A^T A$, which is symmetric, so this topic connects directly to the study of symmetric matrices and spectral ideas. Singular values are always nonnegative, they reveal the main strength of a matrix, and they are used in areas like image compression, data analysis, and numerical stability.

If you remember one idea, remember this: singular values turn a matrix into a story about stretching, shrinking, and importance of directions. That makes them one of the most practical and powerful tools in linear algebra.

Study Notes

  • Singular values of a matrix $A$ are the square roots of the eigenvalues of $A^T A$.
  • $A^T A$ is always symmetric, so the spectral theorem applies.
  • Singular values are always nonnegative and are usually ordered as $\sigma_1 \ge \sigma_2 \ge \cdots \ge 0$.
  • The largest singular value gives the maximum stretching factor of a matrix.
  • The number of nonzero singular values equals the rank of the matrix.
  • For a symmetric matrix, singular values are the absolute values of its eigenvalues.
  • Singular values help with image compression, data analysis, and studying stability of systems.
  • They connect general matrices to symmetric matrices through the matrix $A^T A$.
  • If the smallest singular value is near $0$, the matrix may be close to losing invertibility.
  • Singular values are a major stepping stone toward the singular value decomposition, or SVD.

Practice Quiz

5 questions to test your understanding