7. Recurrence Relations

Applications of Recurrence Relations in Algorithmic Thinking

Imagine you are building an app, designing a search feature, or writing code to sort a huge list of names 📱💻. In computer science, we often want to know how long an algorithm takes as the input gets larger. That is where recurrence relations become powerful. Students, this lesson shows how recurrence relations help us describe and analyze algorithms step by step.

Introduction: Why algorithms lead to recurrences

An algorithm is a step-by-step procedure for solving a problem. Many algorithms solve a big problem by breaking it into smaller problems, solving those smaller problems, and then combining the answers. This pattern naturally creates a recurrence relation.

A recurrence relation is an equation that defines a sequence using earlier terms. In algorithm analysis, the sequence usually represents the running time, number of operations, or amount of memory used for input size $n$.

By the end of this lesson, students, you should be able to:

  • explain how recurrence relations describe algorithm behavior,
  • use recurrence reasoning to analyze simple algorithms,
  • connect algorithmic thinking to the broader study of recurrence relations,
  • interpret examples from real programs and computer science,
  • summarize why recurrences are useful for measuring efficiency.

How algorithms create recurrence relations

Many algorithms are recursive, meaning they call themselves on smaller inputs. Even some non-recursive algorithms can still be described with recurrences if they repeatedly process smaller parts of a problem.

A recurrence relation for running time often looks like this:

$$T(n)=T(n-1)+c$$

or

$$T(n)=2T\left(\frac{n}{2}\right)+cn$$

Here, $T(n)$ is the running time for input size $n$, and $c$ is a constant representing extra work done at each step.

The first form says: solve a problem of size $n$ by solving a problem of size $n-1$ and doing a constant amount of extra work. The second form says: solve a problem of size $n$ by solving two smaller subproblems of size $\frac{n}{2}$ each, then combining the results with $cn$ extra work.

These formulas matter because they let us predict how an algorithm grows as $n$ increases. That growth is often more important than the exact number of operations for a single small input.
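
To make these two shapes concrete, here is a minimal Python sketch (illustrative only, with hypothetical function names and $c=1$) in which each function returns the operation count its recurrence predicts:

```python
def shrink_by_one(n):
    """Predicted operation count for T(n) = T(n-1) + c with c = 1:
    one recursive call on a problem one unit smaller, plus constant work."""
    if n <= 1:
        return 1                      # base case: T(1) = 1
    return shrink_by_one(n - 1) + 1   # T(n-1) plus the "+c" work


def split_in_half(n):
    """Predicted operation count for T(n) = 2T(n/2) + cn with c = 1:
    two recursive calls on half-size problems, plus linear combine work."""
    if n <= 1:
        return 1                                              # base case: T(1) = 1
    return split_in_half(n // 2) + split_in_half(n // 2) + n  # 2T(n/2) + cn
```

Running `shrink_by_one(100)` gives $100$, and `split_in_half(8)` gives $8\log_2 8 + 8 = 32$; the examples below show where such recurrences actually come from.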

Example 1: Linear search

Suppose you search for a name in a list by checking one name at a time from top to bottom 🔎. In the worst case, the name is at the end or not present at all.

Let $T(n)$ be the number of checks needed for a list of size $n$. Then:

$$T(n)=T(n-1)+1$$

with base case

$$T(1)=1$$

This means that searching a list of size $n$ takes one check plus the time to search a list of size $n-1$.
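
A recursive linear search makes this structure visible. Here is a minimal Python sketch (the function name and list-based setup are illustrative): it checks the first item, then recurses on the rest, which is exactly one check plus a subproblem of size $n-1$:

```python
def linear_search(items, target):
    """Return True if target occurs in items.
    One check per call, then a subproblem of size n-1:
    T(n) = T(n-1) + 1."""
    if not items:                 # base case: empty list, nothing to check
        return False
    if items[0] == target:        # the one check (the "+1" in the recurrence)
        return True
    # Recurse on the remaining n-1 items. (Slicing copies the list;
    # real code would pass an index instead, but the count is the same.)
    return linear_search(items[1:], target)
```

Calling `linear_search(names, "Zoe")` on a list of $n$ names performs at most $n$ checks, which matches the solution we derive next.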

If we expand the recurrence, we get:

$$T(n)=T(n-1)+1$$

$$=T(n-2)+1+1$$

$$=T(n-3)+1+1+1$$

Continuing until the base case:

$$T(n)=T(1)+(n-1)$$

Since $T(1)=1$, we have:

$$T(n)=n$$

So linear search has linear running time. This is written as $O(n)$, which means the running time grows proportionally with the input size.

This is a simple example of how recurrence relations help us describe algorithmic thinking: we break the process into repeated smaller steps and count the cost of each step.

Example 2: Binary search

Binary search is used on a sorted list. Instead of checking every item, it compares the target with the middle item and then continues in only one half of the list 🎯.

Let $T(n)$ be the number of comparisons needed. Each step reduces the problem to about half its size, so, ignoring rounding, the recurrence is:

$$T(n)=T\left(\frac{n}{2}\right)+1$$

with base case

$$T(1)=1$$

Why does this happen? Because one comparison is made at each step, and after that only one half of the list remains.
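
Here is a minimal recursive Python sketch of binary search (illustrative; it assumes `items` is sorted in increasing order). Each call inspects the middle of the current range and recurses on one half:

```python
def binary_search(items, target, lo=0, hi=None):
    """Return True if target occurs in the sorted list items.
    Each call does one comparison step and halves the range:
    T(n) = T(n/2) + 1."""
    if hi is None:
        hi = len(items)
    if lo >= hi:                     # base case: empty range, not found
        return False
    mid = (lo + hi) // 2             # middle of the current range
    if items[mid] == target:
        return True
    if items[mid] < target:          # target can only be in the right half
        return binary_search(items, target, mid + 1, hi)
    return binary_search(items, target, lo, mid)   # otherwise: left half
```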

To understand the growth, think about how many times you can divide $n$ by $2$ before reaching $1$:

$$n,\ \frac{n}{2},\ \frac{n}{4},\ \frac{n}{8},\dots,1$$

If after $k$ steps the size becomes $1$, then:

$$\frac{n}{2^k}=1$$

So:

$$n=2^k$$

Taking logarithms gives:

$$k=\log_2 n$$

Therefore, binary search takes $O(\log n)$ time. This is much faster than linear search for large lists: a sorted list of one million items needs only about $20$ comparisons, since $2^{20}\approx 10^6$, while linear search may need up to a million checks.

This example shows an important idea in algorithmic thinking: a recurrence relation can reveal whether an algorithm is efficient, not just how it works.

Example 3: Merge sort

Merge sort is a sorting algorithm that splits a list into two halves, sorts each half, and then merges the two sorted halves into one sorted list 🧩.

Let $T(n)$ be the running time for a list of size $n$. Then the recurrence is:

$$T(n)=2T\left(\frac{n}{2}\right)+cn$$

The term $2T\left(\frac{n}{2}\right)$ appears because the algorithm solves two subproblems, each of size $\frac{n}{2}$. The term $cn$ represents the work needed to merge the two sorted halves.

This recurrence is important because it models a divide-and-conquer algorithm. Divide-and-conquer means:

  1. divide the problem into smaller pieces,
  2. solve each piece,
  3. combine the results.
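
A minimal Python sketch of merge sort (a standard textbook version, simplified for readability) shows where each term of the recurrence comes from:

```python
def merge_sort(items):
    """Return a sorted copy of items.
    Two half-size recursive calls plus a linear merge:
    T(n) = 2T(n/2) + cn."""
    if len(items) <= 1:               # base case: already sorted
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])    # first T(n/2)
    right = merge_sort(items[mid:])   # second T(n/2)

    # Merge step: the "+cn" term. Walk both sorted halves once,
    # always taking the smaller front element.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])           # one half may have leftovers
    merged.extend(right[j:])
    return merged
```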

To estimate the running time, we can expand the recurrence:

$$T(n)=2T\left(\frac{n}{2}\right)+cn$$

$$=2\left(2T\left(\frac{n}{4}\right)+c\frac{n}{2}\right)+cn$$

$$=4T\left(\frac{n}{4}\right)+cn+cn$$

$$=4T\left(\frac{n}{4}\right)+2cn$$

Notice that each level of the expansion contributes another $cn$ of work. After $k$ steps, the subproblem size becomes $\frac{n}{2^k}$ and the accumulated extra work is $kcn$. When the size reaches $1$, we get $k=\log_2 n$, so the total cost across levels adds up to about $cn\log_2 n$, and merge sort runs in $O(n\log n)$ time.
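
We can sanity-check this numerically. The sketch below evaluates the recurrence directly, assuming $c=1$ and that $n$ is a power of $2$ (in this setting the exact solution is $n\log_2 n + n$), and compares it with $n\log_2 n$:

```python
import math

def T(n):
    """Evaluate T(n) = 2T(n/2) + n with T(1) = 1,
    assuming n is a power of 2 (and c = 1)."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

for n in [2, 16, 256, 4096]:
    print(n, T(n), round(n * math.log2(n)))
# Each T(n) equals n*log2(n) + n, so the n*log2(n) term dominates.
```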

This is a classic example of recurrence relations in algorithmic thinking because it links the structure of the algorithm to its efficiency.

Why recurrence relations matter in computing

Recurrence relations are useful because they help programmers and computer scientists answer questions such as:

  • How will an algorithm behave as the input grows?
  • Is this method fast enough for large data sets?
  • Can we improve the algorithm by changing how the problem is divided?

For example, if one algorithm runs in $O(n^2)$ time and another runs in $O(n\log n)$ time, the second one is usually much faster for large $n$. This matters in tasks like sorting contacts, analyzing social media data, or searching through a database.
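
To get a feel for the gap, this small sketch (illustrative; constant factors are set to $1$) prints both operation counts side by side:

```python
import math

for n in [1_000, 1_000_000, 1_000_000_000]:
    quadratic = n * n                  # O(n^2) operation count
    linearithmic = n * math.log2(n)    # O(n log n) operation count
    print(f"n = {n:>13,}:  n^2 = {quadratic:.1e},  n log n = {linearithmic:.1e}")
```

At $n = 10^9$, the $O(n^2)$ method needs about $10^{18}$ operations while the $O(n\log n)$ method needs about $3\times 10^{10}$; on hardware doing roughly a billion operations per second, that is the difference between decades and seconds.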

Recurrence relations also help in designing algorithms. If you know a method creates a recurrence that grows too quickly, you may look for a better strategy. In this way, recurrence relations are not only tools for analysis but also tools for decision-making in algorithm design.

Solving simple recurrences in algorithm contexts

A few common recurrence patterns appear often in algorithm analysis.

If

$$T(n)=T(n-1)+c$$

then the solution is linear:

$$T(n)=O(n)$$

If

$$T(n)=T\left(\frac{n}{2}\right)+c$$

then the solution is logarithmic:

$$T(n)=O(\log n)$$

If

$$T(n)=2T\left(\frac{n}{2}\right)+cn$$

then the solution is:

$$T(n)=O(n\log n)$$

These patterns are valuable because many exam questions and real algorithm problems use them directly. Students, learning these forms helps you recognize the structure quickly and avoid counting work from scratch every time.
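
To see the three patterns side by side, the sketch below (hypothetical helper names; $c=1$; $n$ a power of $2$ so the halving is exact) evaluates each recurrence directly:

```python
import math

def T_linear(n):    # T(n) = T(n-1) + 1,  T(1) = 1  ->  exactly n
    return 1 if n == 1 else T_linear(n - 1) + 1

def T_log(n):       # T(n) = T(n/2) + 1,  T(1) = 1  ->  about log2(n)
    return 1 if n == 1 else T_log(n // 2) + 1

def T_nlogn(n):     # T(n) = 2T(n/2) + n, T(1) = 1  ->  about n*log2(n)
    return 1 if n == 1 else 2 * T_nlogn(n // 2) + n

for n in [4, 64, 512]:
    print(n, T_linear(n), T_log(n), T_nlogn(n), round(n * math.log2(n)))
```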

Connecting recurrence relations to broader discrete mathematics

Recurrence relations are part of discrete mathematics because they describe sequences defined by rules. In algorithmic thinking, the sequence may represent time, number of comparisons, memory usage, or number of recursive calls.

This topic connects to several bigger ideas:

  • sequences and patterns,
  • induction, which is often used to prove recurrence formulas,
  • growth rates such as linear, logarithmic, and quadratic growth,
  • divide-and-conquer strategies in computer science.

So when you study recurrence relations, you are not just learning formulas. You are learning a way to model repeated processes and predict how they scale.
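
As a small example of the induction connection, here is a short proof sketch that the linear search recurrence from Example 1, $T(n)=T(n-1)+1$ with $T(1)=1$, really has the closed form $T(n)=n$.

Base case: $T(1)=1$, which matches the formula at $n=1$.

Inductive step: assume $T(n-1)=n-1$ for some $n\ge 2$. Then

$$T(n)=T(n-1)+1=(n-1)+1=n$$

By induction, $T(n)=n$ for every $n\ge 1$, exactly the value we found earlier by expanding the recurrence.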

Conclusion

Recurrence relations are a key tool for understanding algorithms. They describe how the work on a problem depends on smaller versions of the same problem. Students, by using recurrences, you can analyze running time, compare algorithms, and understand why some methods are efficient while others are not. Linear search, binary search, and merge sort show how different recurrences lead to very different performance results. In discrete mathematics, this topic connects sequences and algebraic reasoning to real computing problems, making recurrence relations especially useful for algorithmic thinking.

Study Notes

  • A recurrence relation defines a sequence using earlier terms.
  • In algorithms, $T(n)$ often represents running time for input size $n$.
  • Linear search can be modeled by $T(n)=T(n-1)+1$, which gives $O(n)$.
  • Binary search can be modeled by $T(n)=T\left(\frac{n}{2}\right)+1$, which gives $O(\log n)$.
  • Merge sort can be modeled by $T(n)=2T\left(\frac{n}{2}\right)+cn$, which gives $O(n\log n)$.
  • Divide-and-conquer algorithms often create recurrences because they split a problem into smaller subproblems.
  • Recurrence relations help compare algorithms by showing how their running time grows as $n$ increases.
  • Solving recurrences supports both algorithm analysis and algorithm design.
  • Recurrence relations connect discrete mathematics with real-world computer science tasks like searching and sorting.
  • Recognizing common recurrence forms makes it easier to reason about efficiency and scalability.

