Bayes’ theorem is a fundamental concept in probability theory that describes how to update our beliefs about an event based on new evidence. It's a powerful tool used in various fields such as medicine, statistics, machine learning, and modern AI.
In this blog, I'll provide a brief introduction to Bayes’ theorem, focusing on its key formulas.

1-The Basics of Probability

Before diving into the theorem, let's quickly review some probability basics:
  • Event: An outcome or a set of outcomes of an experiment (e.g., flipping a coin and getting heads).
  • Probability: A measure of the likelihood of an event occurring, ranging from 0 to 1.
  • Conditional Probability: The probability of an event happening given that another event has already occurred. We denote this as P(A|B), which reads "the probability of event A given event B".
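To make conditional probability concrete, here is a quick simulation sketch (the die-rolling events are my own choice, not from the post): we estimate P(A|B) by counting only the trials where B occurred.

```python
import random

random.seed(0)

# Roll a fair six-sided die many times and estimate
# P(roll >= 5 | roll is even), which should be (1/6) / (1/2) = 1/3,
# since the only even roll that is >= 5 is a 6.
rolls = [random.randint(1, 6) for _ in range(100_000)]

even = [r for r in rolls if r % 2 == 0]      # event B: roll is even
high_and_even = [r for r in even if r >= 5]  # event A and B: even and >= 5

p_a_given_b = len(high_and_even) / len(even)
print(f"Estimated P(A|B) = {p_a_given_b:.3f}  (exact: {1/3:.3f})")
```

Conditioning on B simply means restricting attention to the outcomes where B happened, which is exactly what the list filter does.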

2-Introducing Bayes’ Theorem

Bayes’ Theorem is stated as follows:

P(A|B) = P(B|A) · P(A) / P(B)

Several components in this formula need to be clarified:
  • P(A|B) is the posterior probability - the probability of event A occurring given that event B has occurred (what we want to find).
  • P(B|A) is the likelihood - the probability of event B occurring given that event A has occurred.
  • P(A) is the prior probability - the initial probability of event A occurring (our belief before considering new evidence).
  • P(B) is the marginal likelihood - the probability of event B occurring.
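The formula maps directly onto a small helper function; this is just a sketch, and the name `bayes_posterior` and the example numbers are my own.

```python
def bayes_posterior(prior: float, likelihood: float, marginal: float) -> float:
    """Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)."""
    return likelihood * prior / marginal

# Made-up numbers: prior P(A) = 1%, likelihood P(B|A) = 95%,
# marginal P(B) = 5.9%.
posterior = bayes_posterior(prior=0.01, likelihood=0.95, marginal=0.059)
print(f"P(A|B) = {posterior:.3f}")  # ≈ 0.161
```

Note that the posterior is just the prior rescaled by how much more likely the evidence B is under A than overall.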

3-An example to illustrate

Let’s say we want to determine the probability of a person having a disease (event A) given that they tested positive for it (event B).
  • Prior P(A): Let’s assume the disease is rare, with a prevalence of 1% in the population. So, P(A) = 0.01.
  • Likelihood P(B|A): Let’s say the test is fairly accurate, with a 95% chance of a positive result if the person has the disease. So, P(B|A) = 0.95.
  • Marginal Likelihood P(B): This is slightly trickier. We need to consider the probability of testing positive whether the person has the disease or not. We can calculate it using the law of total probability:

P(B) = P(B|A) · P(A) + P(B|¬A) · P(¬A)

Here, ¬A means not A. Since P(A) = 0.01, we get P(¬A) = 0.99; and assuming the test also has a 5% false positive rate, P(B|¬A) = 0.05. Now, we get:

P(B) = 0.95 × 0.01 + 0.05 × 0.99 = 0.059

  • Posterior P(A|B): Now, we can apply Bayes’ Theorem:

P(A|B) = P(B|A) · P(A) / P(B) = 0.0095 / 0.059 ≈ 0.161

So even after a positive test, the probability of actually having the disease is only about 16%, because the disease is so rare to begin with.
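The arithmetic of this disease-testing example is easy to verify in a few lines (the 5% false positive rate is an assumption of the worked example, not a property of any real test):

```python
p_disease = 0.01              # prior P(A): 1% prevalence
p_pos_given_disease = 0.95    # likelihood P(B|A): sensitivity
p_pos_given_healthy = 0.05    # assumed false positive rate P(B|not A)

# Law of total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(A|B) = P(B|A)P(A) / P(B)
posterior = p_pos_given_disease * p_disease / p_pos

print(f"P(B) = {p_pos:.4f}")                        # 0.0590
print(f"P(disease | positive) = {posterior:.3f}")   # 0.161
```

Changing `p_disease` in this sketch is a good way to see how strongly the prior drives the posterior: at 10% prevalence the same test would be far more convincing.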

4-Conditional Probability on Multiple Events

When dealing with more than two events, we enter the realm of multivariate Bayesian probability. The notation P(A|B, C) represents the probability of event A occurring given that both events B and C have occurred.
The formula for the conditional probability of A given B and C is:

P(A|B, C) = P(A, B, C) / P(B, C)

where P(A, B, C) is the joint probability of A, B, and C occurring together, and P(B, C) is the joint probability of B and C occurring.
We can also express this using the chain rule of probability:

P(A, B, C) = P(A|B, C) · P(B|C) · P(C)
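Both formulas can be checked against a small, hypothetical joint distribution over three binary events; the eight probabilities below are made up (they just need to sum to 1).

```python
# Hypothetical joint distribution P(a, b, c) over three binary events.
joint = {
    (0, 0, 0): 0.10, (0, 0, 1): 0.15, (0, 1, 0): 0.05, (0, 1, 1): 0.20,
    (1, 0, 0): 0.10, (1, 0, 1): 0.10, (1, 1, 0): 0.10, (1, 1, 1): 0.20,
}

def marginal(**fixed):
    """Sum the joint over all outcomes consistent with the fixed values."""
    names = ("a", "b", "c")
    return sum(p for outcome, p in joint.items()
               if all(dict(zip(names, outcome))[k] == v
                      for k, v in fixed.items()))

# P(A=1 | B=1, C=1) = P(A=1, B=1, C=1) / P(B=1, C=1)
p_a_given_bc = joint[(1, 1, 1)] / marginal(b=1, c=1)
print(p_a_given_bc)  # 0.20 / 0.40 = 0.5

# Chain rule check: P(A, B, C) = P(A|B, C) * P(B|C) * P(C)
p_b_given_c = marginal(b=1, c=1) / marginal(c=1)
assert abs(p_a_given_bc * p_b_given_c * marginal(c=1)
           - joint[(1, 1, 1)]) < 1e-12
```

The chain-rule identity holds for any joint distribution, which is why the final assertion passes regardless of the particular numbers chosen.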

5-Conclusion

In this brief blog post, I introduced the renowned Bayes’ Theorem. However, it's important to note that in advanced statistical research, the theorem's applications are far more complex. This post serves as a starting point, offering a glimpse into the fundamental concept while acknowledging that there is much more to explore and understand in the field of Bayesian statistics.