Grade 11 → Probability and Statistics → Probability


Bayes' Theorem


Bayes' theorem is a fundamental concept in probability and statistics that helps us update our beliefs in the light of new evidence. It is named after the 18th-century English statistician and minister Thomas Bayes. This theorem is used in a variety of fields, including medicine, finance, machine learning, and even day-to-day decision making.

Understanding conditional probability

Before diving into Bayes' theorem, it is important to understand the concept of conditional probability. Conditional probability is the probability of an event occurring given that another event has already occurred. It is written P(A|B), which is read as "the probability of A given B."

Example of conditional probability

Imagine you draw cards from a standard deck, which contains 12 face cards, 4 of them kings. Suppose the first card you drew was a king. Among the 11 remaining face cards, 3 are kings.

If your next card turns out to be a face card, the probability that it is a king is calculated as follows:

P(king | face card) = number of remaining kings / number of remaining face cards = 3 / 11 ≈ 0.27
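This conditional probability is just a matter of counting, which a few lines of Python (the numbers mirror the example above) make explicit:

```python
# Conditional probability by counting.
# After one king is drawn, 3 kings remain among the 11 remaining face cards.
remaining_kings = 3
remaining_face_cards = 11

p_king_given_face = remaining_kings / remaining_face_cards
print(round(p_king_given_face, 2))  # 0.27
```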

Formula of Bayes' theorem

Bayes' theorem allows us to invert a conditional probability: knowing P(A|B), it helps us find P(B|A). The formula is:

P(B|A) = [P(A|B) * P(B)] / P(A)

Where:

  • P(B|A) is the probability of B given A (which is what we are trying to find).
  • P(A|B) is the probability of A given B (known).
  • P(B) is the prior probability of B.
  • P(A) is the total probability of A.
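The formula translates directly into a small helper function (the function and argument names here are illustrative, not part of any library):

```python
def bayes(p_a_given_b, p_b, p_a):
    """Return P(B|A) via Bayes' theorem: P(A|B) * P(B) / P(A)."""
    return p_a_given_b * p_b / p_a

# Sanity check with easy numbers: P(A|B) = 0.5, P(B) = 0.4, P(A) = 0.2
print(bayes(0.5, 0.4, 0.2))  # 1.0
```

The same function can be reused for every example that follows, once P(A) has been computed with the law of total probability.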

Example 1: Medical examination

You are a doctor testing for a disease. The test is 90% accurate, meaning that if someone has the disease, the test will be positive in 90% of cases. However, the disease is rare and affects 1 in 1000 people.

What is the probability that a person who tests positive actually has the disease?

Let us analyse this:

P(Disease | Positive Test) = ?
P(Positive Test | Disease) = 0.9 (90% accuracy)
P(Disease) = 0.001 (1 in 1000 people)
P(Positive Test) = P(Positive Test | Disease) * P(Disease) + P(Positive Test | No Disease) * P(No Disease)
                 = (0.9 * 0.001) + P(Positive Test | No Disease) * 0.999

To finish the calculation we also need the test's false-positive rate, P(Positive Test | No Disease), which the problem does not state. Even so, the setup shows how Bayes' theorem is structured: because the disease is rare (a small prior), a positive result can be far less conclusive than the test's accuracy suggests.
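To see the effect numerically, we can finish the example under an assumed false-positive rate. The 10% value below is an assumption added for illustration; only the 90% sensitivity and the 1-in-1000 prevalence come from the problem:

```python
# Medical-test example, completed under an ASSUMED false-positive rate.
p_pos_given_disease = 0.9    # sensitivity, from the text
p_disease = 0.001            # prevalence: 1 in 1000, from the text
p_pos_given_healthy = 0.1    # ASSUMPTION: 10% false positives
p_healthy = 1 - p_disease

# Law of total probability, then Bayes' theorem.
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * p_healthy
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 4))  # 0.0089
```

Under this assumption, a person who tests positive has under a 1% chance of actually having the disease, because healthy false positives vastly outnumber true positives.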

Example 2: Lottery game

Suppose you are playing a lottery with a probability of winning of 0.05, and if you win, the probability of getting a red ticket among the winners is 0.8. If you don't win, the probability of getting a red ticket is 0.2. You get a red ticket; what is your probability of winning?

P(win | red) = ?
Given:
P(red | win) = 0.8
P(win) = 0.05
P(red | no win) = 0.2
P(red) = P(red | win) * P(win) + P(red | no win) * P(no win)
       = (0.8 * 0.05) + (0.2 * 0.95) = 0.04 + 0.19 = 0.23

P(win | red) = (0.8 * 0.05) / 0.23 ≈ 0.174
[Tree diagram: Winner (0.05) branches to red ticket (0.8) and no red ticket (0.2); Non-winner (0.95) branches to red ticket (0.2) and no red ticket (0.8).]
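All the numbers in this example are given in the text, so the calculation can be checked end to end:

```python
# Lottery example: every value below comes from the problem statement.
p_red_given_win = 0.8
p_win = 0.05
p_red_given_lose = 0.2
p_lose = 0.95

# Total probability of holding a red ticket, then Bayes' theorem.
p_red = p_red_given_win * p_win + p_red_given_lose * p_lose   # 0.23
p_win_given_red = p_red_given_win * p_win / p_red
print(round(p_win_given_red, 3))  # 0.174
```

Notice that holding a red ticket raises the probability of winning from 5% to about 17%, but red tickets are still more likely to come from the much larger pool of non-winners.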

Deeper insights into Bayes' theorem

Bayes' theorem not only adjusts probabilities mathematically, but also captures a philosophy of updating belief in proportion to evidence. It is a key component of Bayesian inference, which is used to continually update the probability of a hypothesis as new data or evidence becomes available.

Example 3: Weather forecast

Suppose you are analyzing weather data and want to predict whether it will rain or not. Let's consider:

P(rain | clouds) = ?
Known:
P(clouds | rain) = 0.7 (70% chance of clouds when it rains)
P(rain) = 0.2 (baseline probability of rain is 20%)
P(clouds | no rain) = 0.4

Total probability:

P(clouds) = P(clouds | rain) * P(rain) + P(clouds | no rain) * P(no rain)
          = 0.7 * 0.2 + 0.4 * 0.8 = 0.14 + 0.32 = 0.46

So P(rain | clouds) = (0.7 * 0.2) / 0.46 ≈ 0.30.
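The weather example is also fully specified by the text, so the same two-step pattern (total probability, then Bayes' theorem) applies:

```python
# Weather example: all values come from the text.
p_clouds_given_rain = 0.7
p_rain = 0.2
p_clouds_given_dry = 0.4
p_dry = 0.8

p_clouds = p_clouds_given_rain * p_rain + p_clouds_given_dry * p_dry  # 0.46
p_rain_given_clouds = p_clouds_given_rain * p_rain / p_clouds
print(round(p_rain_given_clouds, 3))  # 0.304
```

Seeing clouds raises the probability of rain from the 20% baseline to about 30%.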

Bayes' theorem in daily life

Even though it may seem complicated, Bayes' theorem can be applied in everyday decisions: diagnosing car problems from certain symptoms, weighing evidence in legal scenarios, or predicting a person's interests from their past behavior.

Example 4: Email filtering

Spam filters use Bayes' theorem to determine whether an email is spam based on the words in it. If we know the probability of certain words (e.g., "win", "prize") appearing in spam, Bayes' theorem helps improve the accuracy of spam filters by estimating the probability of an email being spam based on its content.

Suppose:
P(spam | "win") = ?
P("win" | spam) is known (e.g., 0.8)
P(spam) is known (e.g., the baseline probability that an email is spam is 0.3)
P("win") is the total probability that any email contains the word "win", found with the law of total probability.
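To compute P("win") we also need the probability that a legitimate (non-spam) email contains the word; the 0.1 below is an assumption added for illustration, while the 0.8 and 0.3 come from the text:

```python
# Spam-filter example. P("win"|spam) and P(spam) are from the text;
# p_win_given_ham is an ASSUMPTION needed to complete the calculation.
p_win_given_spam = 0.8
p_spam = 0.3
p_win_given_ham = 0.1   # ASSUMPTION: "win" appears in 10% of legitimate mail
p_ham = 0.7

p_win = p_win_given_spam * p_spam + p_win_given_ham * p_ham   # 0.31
p_spam_given_win = p_win_given_spam * p_spam / p_win
print(round(p_spam_given_win, 3))  # 0.774
```

Under these numbers, an email containing "win" has roughly a 77% chance of being spam, up from the 30% baseline. Real filters combine many words in this way.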

Conclusion

Bayes' theorem is a cornerstone of modern probability theory and statistics, with applications ranging from predictive modeling to philosophical interpretations of probability. For learners and practitioners alike, it sharpens our understanding of how evidence should change what we believe.

Practicing with Bayes' theorem strengthens logical reasoning and decision-making, builds intuition for working under uncertainty, and refines data interpretation, as the examples above illustrate.

