4. Matrix Algebra

Matrix Equations and Properties

In this lesson, you will learn how matrix equations work and why the properties of matrices matter in linear algebra. Matrix algebra is the language that helps us organize and solve systems of equations, represent transformations, and model real-world situations like computer graphics, network flows, and data analysis 📊.

What is a matrix equation?

A matrix equation is an equation where matrices or vectors are related by matrix operations. One of the most important forms is

$$A\mathbf{x}=\mathbf{b}$$

where $A$ is a matrix, $\mathbf{x}$ is an unknown vector, and $\mathbf{b}$ is a known vector. This equation is the matrix version of a system of linear equations.

For example, the system

$$\begin{aligned}
2x+y&=5\\
3x-y&=4
\end{aligned}$$

can be written as

$$\begin{bmatrix}2&1\\3&-1\end{bmatrix}\begin{bmatrix}x\\y\end{bmatrix}=\begin{bmatrix}5\\4\end{bmatrix}.$$

This compact form is useful because it lets us use matrix methods to study many equations at once. Instead of looking at each equation separately, we can treat the whole system as one structured object.

A matrix equation can have:

  • one solution,
  • no solution,
  • or infinitely many solutions.

Which case happens depends on the matrix $A$ and the vector $\mathbf{b}$. If $A$ is square and invertible, then the equation $A\mathbf{x}=\mathbf{b}$ has exactly one solution for every $\mathbf{b}$, namely

$$\mathbf{x}=A^{-1}\mathbf{b}.$$

This is a key idea in matrix algebra because it connects matrix equations to matrix inverses.
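To make the translation from a system to a matrix equation concrete, here is a small sketch in plain Python (no libraries). The helper `solve_2x2` is written just for this lesson, using the 2×2 inverse formula; it is not a standard library function.

```python
# Solve the system  2x + y = 5,  3x - y = 4  in its matrix form A x = b,
# where A = [[2, 1], [3, -1]] and b = [5, 4].

def solve_2x2(A, b):
    """Solve A x = b for a 2x2 matrix A using the inverse formula."""
    (a11, a12), (a21, a22) = A
    det = a11 * a22 - a12 * a21
    if det == 0:
        raise ValueError("matrix is singular; no unique solution")
    # x = A^{-1} b, with A^{-1} = (1/det) [[a22, -a12], [-a21, a11]]
    return [(a22 * b[0] - a12 * b[1]) / det,
            (-a21 * b[0] + a11 * b[1]) / det]

x, y = solve_2x2([[2, 1], [3, -1]], [5, 4])
print(x, y)  # -> 1.8 1.4, and indeed 2(1.8) + 1.4 = 5 and 3(1.8) - 1.4 = 4
```

Notice that the code works with the whole system as one object, exactly as the matrix form suggests.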

Solving matrix equations using inverse matrices

If a matrix $A$ has an inverse, solving $A\mathbf{x}=\mathbf{b}$ is straightforward. Multiply both sides on the left by $A^{-1}$:

$$A^{-1}A\mathbf{x}=A^{-1}\mathbf{b}.$$

Since $A^{-1}A=I$, where $I$ is the identity matrix, this becomes

$$I\mathbf{x}=A^{-1}\mathbf{b},$$

so

$$\mathbf{x}=A^{-1}\mathbf{b}.$$

This works because the identity matrix acts like the number $1$ in multiplication: it leaves vectors unchanged.

Example 🌟

Let

$$A=\begin{bmatrix}1&2\\3&5\end{bmatrix}, \quad \mathbf{b}=\begin{bmatrix}4\\11\end{bmatrix}.$$

We want to solve

$$\begin{bmatrix}1&2\\3&5\end{bmatrix}\mathbf{x}=\begin{bmatrix}4\\11\end{bmatrix}.$$

The inverse of $A$ is

$$A^{-1}=\begin{bmatrix}-5&2\\3&-1\end{bmatrix},$$

so

$$\mathbf{x}=A^{-1}\mathbf{b}=\begin{bmatrix}-5&2\\3&-1\end{bmatrix}\begin{bmatrix}4\\11\end{bmatrix}=\begin{bmatrix}-20+22\\12-11\end{bmatrix}=\begin{bmatrix}2\\1\end{bmatrix}.$$

So the solution is $x=2$ and $y=1$.

This is a real example of how matrix equations simplify solving systems. However, not every matrix has an inverse. If $A$ is singular, then $A^{-1}$ does not exist, and other methods such as row reduction are needed.
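The worked example above can be checked by hand or in a few lines of plain Python. The helper `matvec` below is written for this lesson (it is not a built-in); it multiplies a 2×2 matrix by a 2-vector.

```python
# Verify the worked example: with A = [[1, 2], [3, 5]] and b = [4, 11],
# multiplying A^{-1} = [[-5, 2], [3, -1]] by b should give [2, 1].

def matvec(M, v):
    """Multiply a 2x2 matrix M by a 2-vector v."""
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

A = [[1, 2], [3, 5]]
A_inv = [[-5, 2], [3, -1]]
b = [4, 11]

x = matvec(A_inv, b)
print(x)            # -> [2, 1]
print(matvec(A, x)) # -> [4, 11], so A x really equals b
```

The second print is the important habit: after computing a solution, substitute it back into the original equation.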

Key properties of matrix operations

Matrix algebra follows several important properties, but not all the rules from regular number arithmetic carry over exactly. Knowing these properties helps you simplify expressions correctly and avoid mistakes.

Addition and scalar multiplication

If $A$ and $B$ are matrices of the same size, then their sum $A+B$ is formed by adding matching entries. Scalar multiplication means multiplying every entry by a number $c$, giving $cA$.

These operations satisfy familiar rules:

  • $A+B=B+A$
  • $(A+B)+C=A+(B+C)$
  • $c(A+B)=cA+cB$
  • $(c+d)A=cA+dA$
  • $c(dA)=(cd)A$

These are similar to the properties of ordinary numbers, which makes matrix expressions easier to manage.

Example ✏️

If

$$A=\begin{bmatrix}1&0\\2&-1\end{bmatrix}, \quad B=\begin{bmatrix}3&4\\-2&5\end{bmatrix},$$

then

$$A+B=\begin{bmatrix}4&4\\0&4\end{bmatrix}.$$

Also,

$$2A=\begin{bmatrix}2&0\\4&-2\end{bmatrix}.$$

You can check that $2(A+B)=2A+2B$, which shows distributive behavior.
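That distributive check is easy to automate. The helpers `add` and `scale` below are small lesson-only functions for entrywise addition and scalar multiplication of 2×2 matrices.

```python
# Check that 2(A + B) = 2A + 2B for the example matrices above.

def add(A, B):
    """Entrywise sum of two 2x2 matrices."""
    return [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

def scale(c, A):
    """Multiply every entry of a 2x2 matrix by the scalar c."""
    return [[c * A[i][j] for j in range(2)] for i in range(2)]

A = [[1, 0], [2, -1]]
B = [[3, 4], [-2, 5]]

left = scale(2, add(A, B))
right = add(scale(2, A), scale(2, B))
print(left)   # -> [[8, 8], [0, 8]]
print(left == right)  # -> True
```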

Multiplication is not commutative

One of the most important differences in matrix algebra is that matrix multiplication is usually not commutative. In other words, in general,

$$AB\neq BA.$$

This matters in practice because the order of multiplication can change the result.

Example 🚦

Let

$$A=\begin{bmatrix}1&2\\0&1\end{bmatrix}, \quad B=\begin{bmatrix}1&0\\3&1\end{bmatrix}.$$

Then

$$AB=\begin{bmatrix}7&2\\3&1\end{bmatrix},$$

but

$$BA=\begin{bmatrix}1&2\\3&7\end{bmatrix}.$$

These are different, so $AB\neq BA$.

This property matters in applications. For example, if one matrix represents a rotation and another represents a stretch, doing the stretch first and the rotation second may produce a different result than reversing the order.
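The noncommutativity in the example above can be demonstrated directly. The helper `matmul` is a minimal 2×2 matrix multiplication written for this lesson.

```python
# Show that AB != BA for A = [[1, 2], [0, 1]] and B = [[1, 0], [3, 1]].

def matmul(X, Y):
    """Multiply two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [0, 1]]
B = [[1, 0], [3, 1]]

print(matmul(A, B))  # -> [[7, 2], [3, 1]]
print(matmul(B, A))  # -> [[1, 2], [3, 7]]
print(matmul(A, B) == matmul(B, A))  # -> False: the order matters
```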

Properties of matrix multiplication

Even though multiplication is not commutative, it does have several useful properties.

Associative property

For matrices with compatible sizes,

$$A(BC)=(AB)C.$$

This means that when multiplying three matrices, the grouping does not change the final result, as long as the order stays the same.

Distributive properties

Matrix multiplication distributes over addition:

$$A(B+C)=AB+AC,$$

and

$$(A+B)C=AC+BC.$$

These rules let you expand and simplify matrix expressions.

Identity matrix property

For a square matrix $A$ of size $n\times n$,

$$AI=IA=A,$$

where $I$ is the $n\times n$ identity matrix.

This property is similar to multiplying a number by $1$.

Inverse property

If $A$ is invertible, then

$$AA^{-1}=A^{-1}A=I.$$

This is what makes inverse matrices so powerful in solving matrix equations.

Zero matrix property

If $0$ is the zero matrix, then

$$A0=0A=0,$$

where the sizes are compatible. This shows that multiplying by the zero matrix gives the zero matrix.
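All of the multiplication properties above can be spot-checked on small concrete matrices. This sketch reuses lesson-only 2×2 helpers; passing these checks on examples does not prove the properties in general, but it is a good way to build confidence in them.

```python
# Spot-check associativity, distributivity, identity, and zero on 2x2 examples.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def add(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

A = [[1, 2], [0, 1]]
B = [[1, 0], [3, 1]]
C = [[2, -1], [1, 4]]
I = [[1, 0], [0, 1]]
Z = [[0, 0], [0, 0]]

assert matmul(A, matmul(B, C)) == matmul(matmul(A, B), C)       # A(BC) = (AB)C
assert matmul(A, add(B, C)) == add(matmul(A, B), matmul(A, C))  # A(B+C) = AB+AC
assert matmul(A, I) == A and matmul(I, A) == A                  # AI = IA = A
assert matmul(A, Z) == Z and matmul(Z, A) == Z                  # A0 = 0A = 0
print("all properties hold on these examples")
```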

Matrix equations in broader linear algebra

Matrix equations are not just a chapter topic; they connect to the heart of linear algebra. A matrix equation $A\mathbf{x}=\mathbf{b}$ can be viewed in several ways.

First, it represents a system of linear equations. Each row of $A$ gives one equation, and each entry of $\mathbf{x}$ is an unknown variable.

Second, it represents a linear transformation. The matrix $A$ acts on the vector $\mathbf{x}$ to produce the vector $\mathbf{b}$. In this view, solving the equation means finding a vector that gets transformed into a target vector.

Third, it links to column space. The equation $A\mathbf{x}=\mathbf{b}$ has a solution exactly when $\mathbf{b}$ is in the column space of $A$. That means $\mathbf{b}$ can be written as a linear combination of the columns of $A$.

For example, if

$$A=\begin{bmatrix}1&2\\3&6\end{bmatrix},$$

then the second column is $2$ times the first column. The columns are dependent, so $A$ is not invertible. For some vectors $\mathbf{b}$, the equation $A\mathbf{x}=\mathbf{b}$ has no solution, and for others it has infinitely many solutions.
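For this particular singular matrix, membership in the column space is easy to test. Since both columns of $A$ are multiples of $\begin{bmatrix}1\\3\end{bmatrix}$, a vector $\mathbf{b}$ is in the column space exactly when its second entry is three times its first. The helper below is specific to this example, not a general algorithm.

```python
# For A = [[1, 2], [3, 6]] the columns are dependent (both are multiples
# of [1, 3]), so A x = b is solvable only when b lies on that line.

def in_column_space(b):
    """True when b = t * [1, 3] for some scalar t, i.e. b is in col(A)."""
    return 3 * b[0] == b[1]

print(in_column_space([2, 6]))  # -> True: infinitely many solutions
print(in_column_space([1, 0]))  # -> False: no solution
```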

This connection helps explain why matrix properties matter. The structure of the matrix tells us a lot about the behavior of the equation.

How to reason about matrix equations

When working with matrix equations, a strong strategy is to check size, structure, and properties.

Ask these questions:

  • Are the matrices compatible for the operation?
  • Is the matrix square?
  • Is it invertible?
  • Can row reduction help?
  • Does the equation represent a system or a transformation?

Example 🧠

Suppose you have

$$A\mathbf{x}=\mathbf{b}$$

with $A$ a $3\times 3$ matrix. If the determinant of $A$ is nonzero, then $A$ is invertible, and the equation has a unique solution for every $\mathbf{b}$. If the determinant is zero, then $A$ is singular, and uniqueness is not guaranteed.
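For a $3\times 3$ matrix, the determinant test can be sketched in a few lines. The helper `det3` below uses the rule of Sarrus, which works only for $3\times 3$ matrices; it is written for this lesson, not a general-purpose routine.

```python
# Use the determinant to classify a 3x3 matrix equation A x = b.

def det3(M):
    """Determinant of a 3x3 matrix via the rule of Sarrus."""
    return (M[0][0] * M[1][1] * M[2][2] + M[0][1] * M[1][2] * M[2][0]
          + M[0][2] * M[1][0] * M[2][1] - M[0][2] * M[1][1] * M[2][0]
          - M[0][0] * M[1][2] * M[2][1] - M[0][1] * M[1][0] * M[2][2])

invertible = [[2, 0, 1], [1, 1, 0], [0, 3, 1]]
singular = [[1, 2, 3], [2, 4, 6], [0, 1, 1]]  # second row = 2 * first row

print(det3(invertible))  # nonzero, so A x = b has a unique solution for every b
print(det3(singular))    # -> 0, so A is singular and uniqueness is not guaranteed
```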

Another useful idea is to rewrite complex expressions carefully. If

$$A\mathbf{x}+\mathbf{c}=\mathbf{b},$$

then subtract $\mathbf{c}$ from both sides to get

$$A\mathbf{x}=\mathbf{b}-\mathbf{c}.$$

If $A^{-1}$ exists, then

$$\mathbf{x}=A^{-1}(\mathbf{b}-\mathbf{c}).$$

This shows how matrix properties help you move from a complicated equation to a solvable one.
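The rearrangement above can be followed step by step in code. This sketch reuses the example matrix $A=\begin{bmatrix}1&2\\3&5\end{bmatrix}$ and a lesson-only 2×2 inverse solver; the vectors $\mathbf{b}$ and $\mathbf{c}$ are chosen here purely for illustration.

```python
# Solve A x + c = b by first forming b - c, then applying x = A^{-1}(b - c).

def solve_2x2(A, b):
    """Solve A x = b for a 2x2 matrix A using the inverse formula."""
    (a11, a12), (a21, a22) = A
    det = a11 * a22 - a12 * a21
    return [(a22 * b[0] - a12 * b[1]) / det,
            (-a21 * b[0] + a11 * b[1]) / det]

A = [[1, 2], [3, 5]]
b = [5, 13]
c = [1, 2]

rhs = [b[0] - c[0], b[1] - c[1]]  # b - c = [4, 11]
x = solve_2x2(A, rhs)
print(x)  # -> [2.0, 1.0]
```

With these choices, $\mathbf{b}-\mathbf{c}$ equals the right-hand side of the earlier worked example, so the same solution appears.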

Conclusion

Matrix equations and properties are central ideas in matrix algebra because they connect abstract rules with practical problem solving. Matrix equations let us write systems of linear equations in a compact form, and matrix properties tell us how to manipulate those equations correctly. Understanding which properties hold, especially the noncommutative nature of matrix multiplication, helps you work accurately with expressions and solve problems efficiently.

As you continue studying linear algebra, these ideas will appear again in determinants, inverses, vector spaces, and transformations. Mastering matrix equations now gives you a strong foundation for everything that comes next 🚀.

Study Notes

  • A matrix equation often has the form $A\mathbf{x}=\mathbf{b}$.
  • If $A$ is invertible, then the unique solution is $\mathbf{x}=A^{-1}\mathbf{b}$.
  • Matrix addition is commutative: $A+B=B+A$.
  • Matrix multiplication is usually not commutative: $AB\neq BA$ in general.
  • Matrix multiplication is associative: $A(BC)=(AB)C$.
  • Matrix multiplication distributes over addition: $A(B+C)=AB+AC$ and $(A+B)C=AC+BC$.
  • The identity matrix satisfies $AI=IA=A$.
  • The zero matrix satisfies $A0=0A=0$.
  • A matrix equation $A\mathbf{x}=\mathbf{b}$ has a solution exactly when $\mathbf{b}$ is in the column space of $A$.
  • If $A$ is singular, then $A^{-1}$ does not exist and other methods such as row reduction may be needed.
  • Matrix equations connect systems of equations, linear transformations, and column space all at once.
  • Checking matrix size and invertibility is an important first step in solving matrix equations.
