Expected Value in Discrete Probability 🎲

Introduction: Why Expected Value Matters

Imagine you play a game where you can win some money, lose some money, or break even. Before playing, you want to know: is the game usually in your favor or not? That is exactly the question expected value answers. It gives a way to predict the long-run average result of a random situation.

In discrete probability, outcomes are countable, like the result of rolling a die, drawing a card, or flipping a coin a certain number of times. Expected value turns these outcomes into a single number that summarizes the “average” outcome over many repeated trials 📊.

By the end of this lesson, you should be able to:

  • explain what expected value means,
  • calculate expected value for discrete random variables,
  • connect expected value to sample spaces, probabilities, and independence,
  • use expected value to solve real-world style problems.

What Is Expected Value?

Expected value is the weighted average of all possible values of a discrete random variable. The weights are the probabilities of those values.

If a discrete random variable $X$ can take values $x_1, x_2, x_3, \dots, x_n$ with probabilities $P(X=x_1), P(X=x_2), \dots, P(X=x_n)$, then the expected value of $X$ is

$$E(X)=\sum_{i=1}^{n} x_i P(X=x_i).$$

This formula means:

  • multiply each possible value by its probability,
  • add all those products.
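The two steps above can be sketched directly in Python. This is a minimal illustration, not a library routine; the names `expected_value` and `die` are just for this example, and `Fraction` keeps the arithmetic exact:

```python
from fractions import Fraction

def expected_value(pmf):
    """Multiply each value by its probability, then add the products."""
    return sum(x * p for x, p in pmf.items())

# Example: a fair six-sided die, each face with probability 1/6.
die = {face: Fraction(1, 6) for face in range(1, 7)}
print(expected_value(die))  # 7/2, i.e. 3.5
```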

Expected value is not always a value that can actually happen in one trial. For example, if you roll one fair die, the expected value is $3.5$, even though you cannot roll a $3.5$. That is because expected value describes the long-run average, not one single outcome.

A helpful way to think about it is this: if you repeat a random experiment many times, the average result will get closer and closer to the expected value. This is why expected value is so important in probability, statistics, economics, and decision-making.

Example 1: A Fair Die 🎲

Let $X$ be the number shown when you roll a fair six-sided die. The possible values are $1,2,3,4,5,6$, and each has probability $\frac{1}{6}$.

$$E(X)=1\left(\frac{1}{6}\right)+2\left(\frac{1}{6}\right)+3\left(\frac{1}{6}\right)+4\left(\frac{1}{6}\right)+5\left(\frac{1}{6}\right)+6\left(\frac{1}{6}\right).$$

$$E(X)=\frac{1+2+3+4+5+6}{6}=\frac{21}{6}=3.5.$$

So the expected value of one die roll is $3.5$.

That does not mean the die will land on $3.5$. It means that over many rolls, the average result approaches $3.5$.
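A short simulation illustrates this long-run interpretation: the sample mean of many rolls settles near $3.5$ even though no single roll is $3.5$. The seed and roll count below are arbitrary choices for the sketch:

```python
import random

random.seed(0)  # fixed seed, only so the sketch is reproducible

# Roll a fair die many times; the running average approaches E(X) = 3.5.
rolls = [random.randint(1, 6) for _ in range(100_000)]
average = sum(rolls) / len(rolls)
print(average)  # close to 3.5, but almost never exactly 3.5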

How to Build an Expected Value Problem

To find expected value in discrete probability, it helps to follow a clear process:

  1. Define the random variable $X$. Decide what quantity you are measuring.
  2. List the possible values of $X$.
  3. Find the probability of each value.
  4. Use the formula $E(X)=\sum xP(X=x)$.
  5. Interpret the result in context.

This process connects expected value to the broader idea of a sample space. A sample space is the set of all possible outcomes. From the sample space, you can define events and random variables, then compute probabilities and expected value.

Example 2: A Simple Game 💰

Suppose you pay $1$ dollar to play a game. You flip a fair coin once:

  • if it lands heads, you win $3$ dollars,
  • if it lands tails, you win nothing.

Let $X$ be the net gain after paying to play. Then:

  • if you win, your net gain is $3-1=2$ dollars,
  • if you lose, your net gain is $0-1=-1$ dollar.

So the possible values of $X$ are $2$ and $-1$, each with probability $\frac{1}{2}$.

$$E(X)=2\left(\frac{1}{2}\right)+(-1)\left(\frac{1}{2}\right).$$

$$E(X)=1-\frac{1}{2}=\frac{1}{2}.$$

The expected net gain is $\frac{1}{2}$ dollar per play.

This does not mean every player wins $0.50$ each time. It means that if the game were repeated many times, the average profit would be about $0.50$ per play.
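The same calculation can be written as a tiny probability mass function in Python (a sketch with illustrative names, using exact fractions):

```python
from fractions import Fraction

# Net gain X: +2 dollars on heads, -1 dollar on tails, each with probability 1/2.
pmf = {2: Fraction(1, 2), -1: Fraction(1, 2)}
expected_gain = sum(x * p for x, p in pmf.items())
print(expected_gain)  # 1/2 dollar per play
```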

Expected Value and Real-World Meaning

Expected value is useful because it helps people compare choices with uncertain results. It is widely used in insurance, business, gambling, and scientific experiments. For example, a company may calculate the expected cost of claims, while a game designer may calculate the expected payout of a prize system.

In real life, expected value often combines chance with money or other measurable quantities. That is why it is sometimes called the “average payoff” of a random situation.

Example 3: Prize Drawing 🎁

A school raffle sells tickets for $2$ each. One ticket wins a $20$ prize, and there are $100$ tickets total. Assume one ticket is selected at random and you own one ticket.

Let $X$ be your net gain.

  • If your ticket wins, your net gain is $20-2=18$.
  • If your ticket loses, your net gain is $0-2=-2$.

The probability of winning is $\frac{1}{100}$, and the probability of losing is $\frac{99}{100}$.

$$E(X)=18\left(\frac{1}{100}\right)+(-2)\left(\frac{99}{100}\right).$$

$$E(X)=\frac{18}{100}-\frac{198}{100}=-\frac{180}{100}=-1.8.$$

The expected net gain is $-1.8$ dollars. That means that, on average, a ticket buyer loses $1.80$ per ticket in the long run.

This example shows why expected value is important for understanding whether a game or investment is favorable.
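The raffle numbers can be checked the same way; this sketch mirrors the hand calculation above, with net gains computed from the $2$-dollar ticket price:

```python
from fractions import Fraction

# 100 tickets at $2 each; one ticket wins a $20 prize.
p_win = Fraction(1, 100)
net_if_win = 20 - 2   # 18
net_if_lose = 0 - 2   # -2
expected_gain = net_if_win * p_win + net_if_lose * (1 - p_win)
print(expected_gain)  # -9/5, i.e. a loss of $1.80 per ticket
```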

Expected Value, Conditional Probability, and Independence

Expected value often works together with other ideas in discrete probability, especially conditional probability and independence.

Conditional Probability

Sometimes the probability of an outcome changes because some information is known. In that case, we use conditional probability.

If $A$ and $B$ are events and $P(B)>0$, then

$$P(A\mid B)=\frac{P(A\cap B)}{P(B)}.$$

This can affect expected value because the probabilities used in the calculation may change once new information is given.

For example, suppose a card is drawn from a standard deck. If you know the card is a face card, the probability of drawing a king is no longer $\frac{4}{52}$ but instead $\frac{4}{12}$, because the sample space has changed to only face cards.

If the value of a random variable depends on that new information, then the expected value must be recalculated using the conditional probabilities.
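The card example can be verified by counting directly over the deck. In this sketch, ranks $11$ through $13$ stand in for jack, queen, and king, and the suit letters are arbitrary labels:

```python
from fractions import Fraction

# Ranks 1..13 in four suits; treat ranks 11, 12, 13 as jack, queen, king.
deck = [(rank, suit) for rank in range(1, 14) for suit in "SHDC"]
face_cards = [card for card in deck if card[0] >= 11]      # 12 cards
kings = [card for card in deck if card[0] == 13]           # 4 cards

p_king = Fraction(len(kings), len(deck))                   # 4/52 = 1/13
p_king_given_face = Fraction(len(kings), len(face_cards))  # 4/12 = 1/3
print(p_king, p_king_given_face)
```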

Independence

Two events are independent if knowing one occurs does not change the probability of the other. If $A$ and $B$ are independent, then

$$P(A\cap B)=P(A)P(B).$$

Independence is useful because it often simplifies calculations involving expected value, especially when repeated trials are involved.

For example, if you flip a fair coin twice, the result of the first flip does not affect the second flip. These trials are independent. If a random variable counts the number of heads, each flip contributes separately to the total expected value.

Example 4: Number of Heads in Two Flips 🪙

Let $X$ be the number of heads in two fair coin flips.

The possible values are $0,1,2$.

  • $P(X=0)=\frac{1}{4}$
  • $P(X=1)=\frac{1}{2}$
  • $P(X=2)=\frac{1}{4}$

Now compute the expected value:

$$E(X)=0\left(\frac{1}{4}\right)+1\left(\frac{1}{2}\right)+2\left(\frac{1}{4}\right).$$

$$E(X)=0+\frac{1}{2}+\frac{1}{2}=1.$$

So the expected number of heads is $1$.

This result also makes sense because each fair coin flip has expected heads $\frac{1}{2}$, and with two independent flips, the total expected value is $\frac{1}{2}+\frac{1}{2}=1$.
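Building the distribution directly from the four equally likely outcomes of the sample space confirms both the probabilities listed above and the final answer (a sketch; `"H"` and `"T"` are just outcome labels):

```python
from fractions import Fraction
from itertools import product

# Each of the four outcomes HH, HT, TH, TT has probability 1/4.
pmf = {}
for flips in product("HT", repeat=2):
    heads = flips.count("H")
    pmf[heads] = pmf.get(heads, Fraction(0)) + Fraction(1, 4)

expected_heads = sum(x * p for x, p in pmf.items())
print(expected_heads)  # 1
```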

A Powerful Shortcut: Linearity of Expectation

One of the most useful facts in discrete probability is linearity of expectation. It says that for random variables $X$ and $Y$,

$$E(X+Y)=E(X)+E(Y).$$

This works even if $X$ and $Y$ are not independent.

For many problems, this shortcut makes expected value much easier to compute. Instead of listing every outcome of a complicated sample space, you can break the problem into smaller parts.

Example 5: Counting Matches

Suppose a student guesses on three true-or-false questions. Let $X$ be the number of correct answers.

For each question, define an indicator variable $X_i$ where:

  • $X_i=1$ if the $i$th answer is correct,
  • $X_i=0$ if it is incorrect.

Then $X=X_1+X_2+X_3$.

Since each question has probability $\frac{1}{2}$ of being correct,

$$E(X_i)=1\left(\frac{1}{2}\right)+0\left(\frac{1}{2}\right)=\frac{1}{2}.$$

So,

$$E(X)=E(X_1)+E(X_2)+E(X_3)=\frac{1}{2}+\frac{1}{2}+\frac{1}{2}=\frac{3}{2}.$$

The expected number correct is $1.5$.

This is a great example of how expected value helps even when the full sample space is large.
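The indicator-variable shortcut can be checked against a brute-force enumeration of all $2^3$ equally likely right/wrong patterns; both routes give $\frac{3}{2}$:

```python
from fractions import Fraction
from itertools import product

# Linearity: E(X) = E(X_1) + E(X_2) + E(X_3) = 3 * 1/2.
by_linearity = 3 * Fraction(1, 2)

# Brute force: enumerate all 2^3 equally likely patterns (1 = correct)
# and average the number of correct answers over the 8 patterns.
patterns = list(product([0, 1], repeat=3))
brute_force = Fraction(sum(sum(p) for p in patterns), len(patterns))

print(by_linearity, brute_force)  # both 3/2
```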

Common Mistakes to Avoid

A frequent mistake is confusing expected value with the most likely outcome. The most likely value is the mode, not the expected value.

Another mistake is forgetting to use the net gain in games and real-life situations. If a game costs money to enter, the final expected value should include that cost.

Also, be careful to use the correct probabilities. If the problem gives conditional information, the probabilities may change.

Finally, remember that expected value is a long-run average. A single trial may be very different from the expected value.

Conclusion

Expected value is a central idea in discrete probability because it gives a single number that summarizes the average outcome of a random process. It is found by multiplying each possible value by its probability and adding the results. This idea connects directly to sample spaces, conditional probability, and independence, and it is especially useful in games, decision-making, and real-world modeling.

When you understand expected value, you can analyze whether a random situation is fair, favorable, or unfavorable. You can also use it as a tool to break complex probability problems into smaller, manageable pieces. That makes expected value one of the most practical ideas in discrete mathematics.

Study Notes

  • Expected value $E(X)$ is the weighted average of all values of a discrete random variable $X$.
  • The formula is $E(X)=\sum xP(X=x)$.
  • Expected value describes a long-run average, not necessarily an outcome that occurs in one trial.
  • A fair die has expected value $3.5$.
  • In games, use net gain when calculating expected value.
  • Conditional probability can change the probabilities used in expected value.
  • Independent events often make expected value calculations easier.
  • Linearity of expectation says $E(X+Y)=E(X)+E(Y)$, even when $X$ and $Y$ are not independent.
  • Expected value is a major tool for interpreting fairness and average payoff in discrete probability.
