Alex John Quijano
10/29/2021
Independence vs Mutual Exclusivity
Examples of conditional probability using balls in an urn
Sampling with/without replacement
Geometric and Binomial PMFs
Independence vs Dependence - conditional probability notations and examples
Bayes’ theorem introduction
Consider rolling a fair six-sided die repeatedly until the first 6 appears.
\[ \begin{align} P(\text{1 roll until 1st 6}) & = \frac{1}{6} \\ P(\text{2 rolls until 1st 6}) & = \frac{5}{6}\frac{1}{6} = \frac{5}{6^2} \\ & \vdots \\ P(\text{n rolls until 1st 6}) & = \frac{5^{n-1}}{6^n} \end{align} \]
The probability mass function (PMF) for the geometric distribution is \[P(X = n) = \frac{5^{n-1}}{6^n} = \left(\frac{5}{6}\right)^{n-1} \left(\frac{1}{6}\right)\] where \(X\) is a random variable counting the number of trials in a sequence of Bernoulli trials up to and including the first “success”.
In general, the Geometric PMF is \[P(X = n; p) = \left(1-p\right)^{n-1} p\] where \(p\) is the probability of “success”.
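As a quick sanity check, here is a minimal Python sketch (an added illustration, not from the original notes; the helper name `rolls_until_first_six` is my own) that simulates rolling a fair die until the first 6 and compares the empirical frequencies against the Geometric PMF above.

```python
import random

# Simulate rolling a fair die until the first 6 appears and
# record how many rolls it took (the trial count n).
def rolls_until_first_six(rng):
    n = 1
    while rng.randint(1, 6) != 6:
        n += 1
    return n

rng = random.Random(42)
trials = 100_000
counts = {}
for _ in range(trials):
    n = rolls_until_first_six(rng)
    counts[n] = counts.get(n, 0) + 1

# Compare empirical frequencies to the Geometric PMF (5/6)^(n-1) * (1/6).
for n in range(1, 6):
    empirical = counts.get(n, 0) / trials
    exact = (5 / 6) ** (n - 1) * (1 / 6)
    print(f"P(X = {n}): empirical {empirical:.4f}, exact {exact:.4f}")
```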
Random variable \(X\) with Geometric PMF \[P(X = n; p) = \left(1-p\right)^{n-1} p\] where \(n\) is the number of trials up to and including the first “success”, each trial having probability of “success” \(p\).
Random variable \(X\) with Binomial PMF \[P(X = x; n,p) = {n \choose x} p^x (1-p)^{n-x}\] where \(x\) is the number of “successes” in \(n\) independent trials with probability of “success” \(p\).
A random variable \(X\) with Geometric PMF has expected value \[E[X] = \frac{1}{p}\]
A random variable \(X\) with Binomial PMF has expected value \[E[X] = np\]
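For a numerical check of these expected values, here is a short Python sketch (my own illustration; the parameter choices \(p = 1/6\), \(n = 10\), and \(p = 0.3\) are arbitrary) that truncates the geometric series and sums the binomial PMF directly.

```python
from fractions import Fraction
from math import comb

# Geometric with p = 1/6: approximate E[X] = sum_{n>=1} n (1-p)^(n-1) p
# by truncating the series at N = 500, and compare to 1/p = 6.
p = Fraction(1, 6)
N = 500
approx = sum(n * (1 - p) ** (n - 1) * p for n in range(1, N + 1))
print(float(approx), float(1 / p))   # both very close to 6.0

# Binomial with n = 10, p = 0.3: verify E[X] = n*p by summing x * P(X = x).
n, pb = 10, 0.3
mean = sum(x * comb(n, x) * pb**x * (1 - pb) ** (n - x) for x in range(n + 1))
print(mean, n * pb)                  # both 3.0 (up to float rounding)
```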
In your homework, you will work through a detailed calculation showing the expected value of the geometric distribution.
We will revisit these functions next week.
We start with an urn filled with 5 red and 5 black balls, randomly mixed. I draw three balls from the urn, sampling without replacement. What is the probability that at least one is black?
Let \(A\) = “Draw at least 1 black”, then \(A^C\) = “Draw 0 black”. We know \(P(A) + P(A^C) = 1\). We know \(A^C\) has only one outcome: red, red, red.
\[P(A^C) = P(\text{red},\text{red},\text{red}) = \left(\frac{5}{10}\right) \left(\frac{4}{9}\right) \left(\frac{3}{8}\right) = \frac{1}{12}\]
Notation: Let \(R_i\) be the red outcome on the \(i\)th draw, and let \(P(R_{i+1} | R_i)\) be the conditional probability of red on the (\(i+1\))th draw given that the \(i\)th draw was red.
\[ \begin{align} P(R_1,R_2,R_3) & = P(R_1)P(R_2|R_1)P(R_3|R_2 \cap R_1) \\ & = \left(\frac{5}{10}\right) \left(\frac{4}{9}\right) \left(\frac{3}{8}\right) \\ & = \frac{1}{12} \end{align} \]
\[P(A^C) = P(R_1,R_2,R_3) = \frac{1}{12}\]
\[P(A) = 1 - P(A^C) = 1 - \frac{1}{12} = \frac{11}{12}\]
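To see this answer empirically, here is a small Monte Carlo sketch (an added illustration, not from the original notes): `random.sample` draws without replacement, matching the problem setup, so the estimate should land near \(11/12 \approx 0.9167\).

```python
import random

# Urn with 5 red ('R') and 5 black ('B'); draw 3 without replacement.
urn = ['R'] * 5 + ['B'] * 5
rng = random.Random(0)
trials = 200_000
hits = sum('B' in rng.sample(urn, 3) for _ in range(trials))
print(hits / trials, 11 / 12)  # estimate vs exact 0.9166...
```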
We start with an urn filled with 5 red and 5 black balls, randomly mixed. I draw three balls from the urn, sampling without replacement. What is the probability that at least one is black?
Recall that we solved this problem using the complement trick \(P(A) + P(A^C) = 1\).
Now, solve it in a more detailed way: use the probability tree that I showed on the board to compute \(P(A)\) directly, without using \(P(A) + P(A^C) = 1\).
Let \(R_i\) be the red outcome at the \(i\)th draw and \(B_i\) be the black outcome at the \(i\)th draw. All possible outcomes - that is, the sample space - are listed below: \[ \begin{align} \color{red}{\{R_1,R_2,R_3\}} & \color{blue}{\{B_1,R_2,R_3\}} \\ \color{blue}{\{R_1,R_2,B_3\}} & \color{blue}{\{B_1,R_2,B_3\}} \\ \color{blue}{\{R_1,B_2,R_3\}} & \color{blue}{\{B_1,B_2,R_3\}} \\ \color{blue}{\{R_1,B_2,B_3\}} & \color{blue}{\{B_1,B_2,B_3\}} \\ \end{align} \]
From our definition of events: \[A = \color{blue}{\{\cdots\}} \longrightarrow \text{at least one black and } A^C = \color{red}{\{\cdots\}} \longrightarrow \text{all red}\]
\[ \begin{align} P(A^C) & = P(R_1)P(R_2|R_1)P(R_3|R_1 \cap R_2) & \longrightarrow \color{red}{\{R_1,R_2,R_3\}} \end{align} \]
\[ \begin{align} P(A) & = P(R_1)P(R_2|R_1)P(B_3|R_1 \cap R_2) & \longrightarrow \color{blue}{\{R_1,R_2,B_3\}} \\ & + P(R_1)P(B_2|R_1)P(R_3|R_1 \cap B_2) & \longrightarrow \color{blue}{\{R_1,B_2,R_3\}} \\ & \vdots & \\ & + P(B_1)P(B_2|B_1)P(B_3|B_1 \cap B_2) & \longrightarrow \color{blue}{\{B_1,B_2,B_3\}} \\ P(A) & = \frac{11}{12} \end{align} \]
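The tree computation above can be carried out exactly in code. The following sketch (an added illustration; the helper `sequence_probability` is my own naming) multiplies the conditional probabilities along each branch of the tree and sums the seven branches containing at least one black.

```python
from fractions import Fraction
from itertools import product

# Multiply conditional probabilities along one branch of the probability
# tree, updating the urn counts after each draw (without replacement).
def sequence_probability(seq, red=5, black=5):
    prob = Fraction(1)
    for color in seq:
        total = red + black
        if color == 'R':
            prob *= Fraction(red, total)
            red -= 1
        else:
            prob *= Fraction(black, total)
            black -= 1
    return prob

# Event A = "at least one black": sum the 7 branches that contain a 'B'.
p_A = sum(sequence_probability(seq)
          for seq in product('RB', repeat=3) if 'B' in seq)
print(p_A)  # 11/12
```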
Let \(B_i\), \(i=1,\cdots,n\), be mutually exclusive events that together cover all possible outcomes (a partition of the sample space). Then \[P(A) = \sum_{i=1}^{n} P(A \cap B_i) = \sum_{i=1}^{n} P(A|B_i)P(B_i)\] This is the law of total probability.
We also have seen the complement rule. \[P(A) + P(A^C) = 1\]
Note that the \(B_i\) notation here is slightly different from our previous example, where \(B_i\) meant a black outcome on the \(i\)th draw. We will see more of this law next week.
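As a concrete illustration of the law of total probability (my own example, reusing the same urn), we can compute the probability that the second draw is black by conditioning on the color of the first draw.

```python
from fractions import Fraction

# Same urn: 5 red, 5 black. Compute P(second draw is black) by
# conditioning on the color of the first draw.
p_R1 = Fraction(5, 10)              # first draw red
p_B1 = Fraction(5, 10)              # first draw black
p_black2_given_R1 = Fraction(5, 9)  # 5 black left among 9 balls
p_black2_given_B1 = Fraction(4, 9)  # 4 black left among 9 balls

# Law of total probability: P(A) = sum_i P(A | B_i) P(B_i)
p_black2 = p_black2_given_R1 * p_R1 + p_black2_given_B1 * p_B1
print(p_black2)  # 1/2
```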
For events \(A\) and \(B\), the notation \(P(A | B)\) with symbol “\(|\)” is the probability of \(A\) given \(B\) has already occurred. \[P(A|B) = \frac{P(A \cap B)}{P(B)}, \text{ where } P(B) \ne 0\] \[P(B|A) = \frac{P(A \cap B)}{P(A)}, \text{ where } P(A) \ne 0\]
Conditional probabilities are not commutative: in general, \[P(A|B) \ne P(B|A)\]
Solving for \(P(A \cap B)\) in the definition of \(P(B|A)\) gives \[P(A \cap B) = P(B|A)P(A)\]
Substituting this into the definition of \(P(A|B)\) yields Bayes’ theorem: \[P(A|B) = \frac{P(B|A)P(A)}{P(B)}, \text{ where } P(B) \ne 0\]
By the same argument, \(P(B|A) = \frac{P(A|B)P(B)}{P(A)}\), where \(P(A) \ne 0\).
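Putting the pieces together, here is a small sketch (my own example, not from the original notes) applying Bayes’ theorem to the urn: given that the second draw is black, what is the probability that the first draw was red? It reuses \(P(\text{second draw black}) = 1/2\) from the law-of-total-probability sketch above.

```python
from fractions import Fraction

# Bayes' theorem on the urn:
# P(R_1 | black_2) = P(black_2 | R_1) * P(R_1) / P(black_2)
p_R1 = Fraction(5, 10)
p_black2_given_R1 = Fraction(5, 9)
p_black2 = Fraction(1, 2)  # from the law-of-total-probability sketch above

p_R1_given_black2 = p_black2_given_R1 * p_R1 / p_black2
print(p_R1_given_black2)  # 5/9
```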
Today, we discussed the following:
Law of total probability
Introduction to Bayes’ theorem
Next, we will discuss:
More on Bayes’ theorem
Examples using Bayes’ theorem