4. Statistics and Probability

Expected Values 📊

Imagine playing a game where you win some money, but not every time. How do you decide whether the game is fair or likely to benefit you over time? The answer is the expected value. In statistics and probability, expected value is a way to predict the long-run average outcome of a random process. It does not tell you what will happen in one trial. Instead, it tells you what you would expect if the same situation were repeated many times.

In this lesson, you will learn the main ideas and terminology behind expected value, how to calculate it, and why it matters in IB Mathematics Analysis and Approaches SL. You will also see how expected value connects to probability distributions, decision-making, and real-world situations like games, insurance, and quality control 🎯

What Expected Value Means

The expected value of a random variable is the weighted average of all possible outcomes, where each outcome is weighted by its probability. If a random variable $X$ can take values $x_1, x_2, x_3, \dots$, with probabilities $P(X=x_1), P(X=x_2), P(X=x_3), \dots$, then the expected value is

$$E(X)=\sum x_iP(X=x_i)$$

This formula is one of the most important in the topic of probability distributions. It says that each outcome contributes according to how likely it is.

For example, suppose a spinner lands on $1$ with probability $0.2$, on $2$ with probability $0.5$, and on $5$ with probability $0.3$. Then

$$E(X)=(1)(0.2)+(2)(0.5)+(5)(0.3)$$

$$E(X)=0.2+1.0+1.5=2.7$$

So the expected value is $2.7$. This does not mean the spinner will land on $2.7$. It means that over many spins, the average result will be close to $2.7$.
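The spinner calculation above can be checked with a short Python sketch (the variable names are illustrative, not part of the lesson):

```python
# Spinner outcomes and their probabilities, taken from the example above.
values = [1, 2, 5]
probs = [0.2, 0.5, 0.3]

# Expected value as a weighted average: sum of value * probability.
expected = sum(x * p for x, p in zip(values, probs))
print(expected)  # close to 2.7 (floating-point rounding aside)
```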

Expected value is also sometimes called the mean of a discrete random variable. In IB work, you should know that expected value is a key summary measure, just like the mean is for a data set.

Why Expected Value Matters

Expected value helps us make sense of uncertainty. In real life, many situations involve choices where the result is not certain. For example, a company may decide whether to produce a product, an insurance company may estimate future payouts, or a school fundraiser may choose a game with prizes. Expected value gives a way to compare these situations objectively.

Think about a simple game in which you pay $\$2$ to play. If you roll a fair die, you win $\$5$ only if you roll a $6$, and otherwise you win nothing. The possible net gains are:

  • gain $=\$3$ if you roll a $6$
  • gain $=-\$2$ if you roll anything else

The probabilities are $\frac{1}{6}$ and $\frac{5}{6}$. The expected gain is

$$E(X)=(3)\left(\frac{1}{6}\right)+(-2)\left(\frac{5}{6}\right)$$

$$E(X)=\frac{3}{6}-\frac{10}{6}=-\frac{7}{6}$$

This means the player expects to lose about $\$1.17$ per game in the long run. The game is not fair to the player. A fair game would have expected gain $0$.
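A quick Python check of this game, using exact fractions so no rounding creeps in (a sketch, not an official IB method):

```python
from fractions import Fraction

# Net gains from the player's point of view: +3 for a six, -2 otherwise.
game = {3: Fraction(1, 6), -2: Fraction(5, 6)}

expected_gain = sum(gain * prob for gain, prob in game.items())
print(expected_gain)  # -7/6, a loss of about $1.17 per game
```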

This example shows why expected value is useful. Even if a game has a big prize, it may still be unfavorable if the chance of winning is too small.

Calculating Expected Value from a Table

In IB Mathematics Analysis and Approaches SL, you may be given a probability table. The method is always similar:

  1. List each possible value of the random variable.
  2. Multiply each value by its probability.
  3. Add all the products.

Suppose $X$ is the number of goals scored by a team in a match, with the distribution below:

  • $P(X=0)=0.1$
  • $P(X=1)=0.3$
  • $P(X=2)=0.4$
  • $P(X=3)=0.2$

Then

$$E(X)=(0)(0.1)+(1)(0.3)+(2)(0.4)+(3)(0.2)$$

$$E(X)=0+0.3+0.8+0.6=1.7$$

So the expected number of goals is $1.7$.

Notice that the probabilities add to $1$:

$$0.1+0.3+0.4+0.2=1$$

This must always happen in a valid probability distribution. If the total is not $1$, something is wrong with the table.
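The three-step table method, including the check that the probabilities sum to $1$, can be sketched in Python (the dictionary below just restates the goals table):

```python
# Probability table for the number of goals X, from the example above.
dist = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}

# In a valid distribution the probabilities must sum to 1.
assert abs(sum(dist.values()) - 1) < 1e-9

# Steps 2 and 3: multiply each value by its probability, then add.
expected_goals = sum(x * p for x, p in dist.items())
print(expected_goals)  # close to 1.7
```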

A useful idea is that expected value is a summary of the whole distribution. It uses every outcome and every probability, so it reflects the shape of the distribution better than just one value can.

Expected Value of a Discrete Random Variable

Expected value is especially important for discrete probability distributions, where the random variable takes countable values such as $0,1,2,3,\dots$.

A common IB example is a binomial random variable. If $X\sim B(n,p)$, then the expected value is

$$E(X)=np$$

Here $n$ is the number of trials and $p$ is the probability of success on each trial.

For example, if a student answers $10$ multiple-choice questions and each question has probability $0.7$ of being correct, then the expected number correct is

$$E(X)=10(0.7)=7$$

So over many such tests, the average number correct would be about $7$.

This formula is very helpful because it saves time compared with listing every possible value. However, you should still understand where the formula comes from: it is based on the same weighted-average idea as the table method.
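That connection can be verified numerically: summing $k\,P(X=k)$ over the full binomial distribution gives the same answer as the shortcut $np$ (a Python sketch):

```python
from math import comb

n, p = 10, 0.7

# Full weighted sum over the binomial distribution B(n, p):
# E(X) = sum of k * C(n, k) * p^k * (1 - p)^(n - k).
full_sum = sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))

print(full_sum, n * p)  # both are about 7
```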

Expected Value and Long-Run Average

One of the most important ideas in probability is that expected value is a long-run average, not a guaranteed result. A single result can be very different from the expectation.

For example, suppose a raffle ticket has a $1\%$ chance of winning $\$100$ and a $99\%$ chance of winning $\$0$. The expected winnings are

$$E(X)=(100)(0.01)+(0)(0.99)=1$$

So the expected value is $\$1$. But if a student buys one ticket, the most likely result is still $\$0$.

This is why expected value is useful for large numbers of repetitions. If many people buy tickets, the average payout per ticket will tend toward the expected value.
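This long-run behaviour can be illustrated with a simple simulation (Python, with a fixed seed so the run is reproducible; the trial count is an arbitrary choice):

```python
import random

random.seed(1)  # fixed seed so the result is reproducible

# Simulate many raffle tickets: win $100 with probability 0.01, else $0.
trials = 100_000
winnings = [100 if random.random() < 0.01 else 0 for _ in range(trials)]

average = sum(winnings) / trials
print(average)  # close to the expected value of $1
```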

In statistics, this idea is connected to the broader notion of variation and prediction. Even when results are random, patterns emerge over many trials.

Expected Value in Fair Games and Decision-Making

Expected value is often used to decide whether a game or situation is fair. A fair game has expected value $0$ for the player, assuming gain and loss are measured from the player's point of view.

Suppose a carnival game costs $\$4$ to play. You draw one card from a shuffled deck. If the card is a heart, you win $\$10$; otherwise, you win nothing. There are $13$ hearts in a $52$-card deck, so

$$P(\text{heart})=\frac{13}{52}=\frac{1}{4}$$

$$P(\text{not heart})=\frac{3}{4}$$

Your net gain is $\$6$ if you win and $-\$4$ if you lose. So

$$E(X)=(6)\left(\frac{1}{4}\right)+(-4)\left(\frac{3}{4}\right)$$

$$E(X)=1.5-3=-1.5$$

The expected gain is $-\$1.50$, so the game is not fair to the player.
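A fairness check for games like this can be wrapped in a small helper (an illustrative sketch; the function name is made up for this example):

```python
from fractions import Fraction

def expected_gain(outcomes):
    """Weighted average of net gains, given a {gain: probability} table."""
    return sum(gain * prob for gain, prob in outcomes.items())

# Carnival card game: net +6 on a heart (1/4), net -4 otherwise (3/4).
carnival = {6: Fraction(1, 4), -4: Fraction(3, 4)}

print(expected_gain(carnival))       # -3/2, i.e. a loss of $1.50 per game
print(expected_gain(carnival) == 0)  # False: the game is not fair
```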

In real life, businesses use expected value to estimate profit and risk. Insurance companies use it to predict average claims. Governments and scientists also use expected value when evaluating uncertain outcomes.

Connection to Continuous Random Variables

Expected value is not only for discrete variables. For a continuous random variable with probability density function $f(x)$, the expected value is

$$E(X)=\int_{-\infty}^{\infty} xf(x)\,dx$$

At IB SL level, you may mainly work with discrete distributions, but it is helpful to know that the same idea extends to continuous models. The discrete formula uses a sum, while the continuous version uses an integral.
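As a numeric illustration of the integral version, here is a midpoint Riemann sum for a uniform density on $[0,2]$ (an assumed example, not from the syllabus), whose expected value should come out near the true answer of $1$:

```python
# Density of a uniform distribution on [0, 2]: f(x) = 1/2 on the interval.
def f(x):
    return 0.5 if 0 <= x <= 2 else 0.0

# Midpoint Riemann sum approximating E(X) = integral of x * f(x) dx.
a, b, n = 0.0, 2.0, 10_000
dx = (b - a) / n
midpoints = (a + (i + 0.5) * dx for i in range(n))
expected = sum(x * f(x) * dx for x in midpoints)

print(expected)  # close to 1.0
```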

This shows the deeper unity of statistics and probability: whether values are countable or continuous, expected value still represents a weighted average.

Common Mistakes to Avoid

A very common mistake is to think expected value is the most likely outcome. That is not always true. The expected value can even be a number that never occurs in practice, such as $2.7$ on a spinner with outcomes $1$, $2$, and $5$.

Another mistake is forgetting to use net gain or net loss in game questions. If a game costs money to play, the cost must be included in the calculation.

Also be careful to check probabilities. They must sum to $1$, and each probability must lie between $0$ and $1$. If using a binomial model, make sure the conditions are appropriate: a fixed number of trials, only two outcomes per trial, constant probability of success, and independence.

Finally, remember that expected value describes long-term behavior. One result is not enough to test it.

Conclusion

Expected value is a central idea in Statistics and Probability because it summarizes a random variable using a weighted average of outcomes. It helps you understand chance in practical situations, compare games, and analyze distributions. In IB Mathematics Analysis and Approaches SL, you should be able to calculate expected values from tables, use formulas like $E(X)=np$ for binomial distributions, and explain what the result means in context.

For students, the key message is simple: expected value is not about predicting one exact outcome. It is about understanding what happens on average over many repetitions. That makes it a powerful tool for reasoning in probability, data analysis, and decision-making 🌟

Study Notes

  • Expected value is the weighted average of a random variable's outcomes.
  • For a discrete random variable, $E(X)=\sum x_iP(X=x_i)$.
  • For a binomial random variable, $E(X)=np$.
  • Expected value gives the long-run average, not the result of one trial.
  • A fair game has expected value $0$ for the player.
  • When finding expected gain or loss, include all costs and prizes.
  • Probabilities in a valid distribution must add to $1$.
  • Expected value connects directly to probability distributions and decision-making.
  • The continuous version uses $E(X)=\int_{-\infty}^{\infty} xf(x)\,dx$.
  • In IB Mathematics Analysis and Approaches SL, expected value is an important tool for interpreting random situations.
