4 - Basic Probability Continued

Alex John Quijano

09/22/2021

Previously on Probability…


Last session, we discussed the following:

Basic Probability Continued


In this lecture, we will learn about the following:

Random Variable

A Random Variable (R.V.) is a function that assigns a numerical value to each potential outcome of a random process.

Tossing a Fair Coin ONE Time


\(P(Y = 1) = \frac{1}{2} \longrightarrow\) the probability of landing a head

\(P(Y = 0) = \frac{1}{2} \longrightarrow\) the probability of landing a tail

Tossing a Fair Coin THREE Times
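The outcomes of three tosses can be listed by brute force. Below is a minimal R sketch (base R only, an assumption since no code accompanies this slide) that enumerates the \(2^3 = 8\) equally likely outcomes and tabulates the random variable \(X\) = number of heads, which gives its PMF.

```r
# Enumerate all 2^3 = 8 equally likely outcomes of three tosses
outcomes <- expand.grid(toss1 = c("H", "T"),
                        toss2 = c("H", "T"),
                        toss3 = c("H", "T"))

# Random variable X = number of heads in each outcome
X <- rowSums(outcomes == "H")

# PMF of X: proportion of the 8 outcomes giving each value of X
table(X) / nrow(outcomes)   # 1/8, 3/8, 3/8, 1/8 for X = 0, 1, 2, 3
```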

Discrete R.V. vs Continuous R.V.

A discrete R.V. corresponds to a Probability Mass Function (PMF).

Example: the number of heads when tossing a fair coin \(n\) times.

A continuous R.V. corresponds to a Probability Density Function (PDF).

Example:
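As an illustration of the difference, here is a minimal R sketch (base R; the standard normal is used purely as an example of a continuous distribution). A PMF evaluated at a value returns an actual probability, while a PDF returns a density, and probabilities for a continuous R.V. come from areas under the PDF.

```r
# Discrete: the PMF of a binomial R.V. evaluated at a point is a probability
dbinom(2, size = 3, prob = 0.5)   # P(X = 2) = 0.375

# Continuous: the PDF (standard normal here) gives a density, not a probability
dnorm(0)                          # density at 0, about 0.399

# Continuous probabilities come from areas under the PDF, i.e., the CDF
pnorm(1) - pnorm(-1)              # P(-1 <= Z <= 1), about 0.683
```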

Tossing a Fair Coin Once
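One compact way to write the distribution of a single toss is the Bernoulli PMF, where \(p\) is the probability of heads:

\[P(Y = y) = p^y (1-p)^{1-y}, \quad y \in \{0, 1\}\]

For a fair coin, \(p = \frac{1}{2}\), which gives \(P(Y = 1) = P(Y = 0) = \frac{1}{2}\) as on the earlier slide.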


Tossing a Fair Coin \(\mathbf{n}\) Times

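The binomial PMF used on the following slides gives the probability of observing exactly \(x\) heads in \(n\) independent tosses when each toss lands heads with probability \(p\):

\[P(X = x) = \binom{n}{x} p^x (1-p)^{n-x}, \quad x = 0, 1, \dots, n\]

For a fair coin, \(p = \frac{1}{2}\).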

Bernoulli’s Beau


Tossing a Fair Coin \(\mathbf{n = 20}\) Times

Using the binomial PMF, we can answer probability questions much more easily.

The probability of observing exactly 7 heads is \[P(X = 7) = 0.074\]
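In R, this is a single call to dbinom (a minimal sketch, assuming a fair coin so that prob = 0.5):

```r
# P(X = 7) for X ~ Binomial(n = 20, p = 0.5)
dbinom(7, size = 20, prob = 0.5)   # approximately 0.074
```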

Tossing a Fair Coin \(\mathbf{n = 20}\) Times


The probability of observing at least 7 heads is \[P(X \ge 7) = \sum_{x=7}^{20} P(X = x) = 0.942\]

Note: We are adding the probabilities because the events are disjoint (mutually exclusive).
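This sum can be checked in R either by summing dbinom over 7 through 20 or with the upper tail of pbinom (again a sketch for the fair-coin case):

```r
# P(X >= 7) by summing the PMF over x = 7, ..., 20
sum(dbinom(7:20, size = 20, prob = 0.5))              # approximately 0.942

# Equivalently, the upper tail of the CDF: P(X > 6)
pbinom(6, size = 20, prob = 0.5, lower.tail = FALSE)
```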

Tossing a Fair Coin \(\mathbf{n = 20}\) Times


The probability of observing at most 7 heads is \[P(X \le 7) = \sum_{x=0}^{7} P(X = x) = 0.132\]

Note: We are adding the probabilities because the events are disjoint (mutually exclusive).
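In R, this is the binomial CDF evaluated at 7 (again assuming a fair coin):

```r
# P(X <= 7) via the binomial CDF
pbinom(7, size = 20, prob = 0.5)           # approximately 0.132

# Equivalently, summing the PMF over x = 0, ..., 7
sum(dbinom(0:7, size = 20, prob = 0.5))
```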

Tossing Coins Simulations

Source Code: R - Law of Large Numbers Simulations
If a fair coin is tossed a large number of times, the proportion of heads (and likewise the proportion of tails) should be close to \(\frac{1}{2}\). This is called the law of large numbers.
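The linked source code is the reference for the simulations; the following is only a minimal sketch of such a simulation, tracking the running proportion of heads as the number of tosses grows.

```r
set.seed(123)   # for reproducibility
n <- 10000      # number of tosses

# Simulate n fair coin tosses (1 = head, 0 = tail)
tosses <- rbinom(n, size = 1, prob = 0.5)

# Running proportion of heads after each toss
running_prop <- cumsum(tosses) / seq_len(n)

# The running proportion settles near 0.5 as the number of tosses grows
plot(running_prop, type = "l",
     xlab = "Number of tosses", ylab = "Proportion of heads")
abline(h = 0.5, lty = 2)
```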

The Law of Large Numbers



The law of large numbers is a theorem that describes the result of repeating the same experiment a large number of times. As more trials are performed, the mean of the results tends to get closer to the expected value. We will come back to this idea in a few weeks.

Summary

In this lecture, we talked about the following:

In the next lecture, we will talk about the following:

Today’s Activity

Work within your groups to discuss the answers to the following problems.

Part A: Consider rolling a six-sided die two times.

  1. How many possible outcomes are there? Make a table of the possible outcomes of rolling a six-sided die twice, and make another table showing the sum of the two numbers for each outcome.

  2. What is the probability of rolling a (3,4) pair - in order?

  3. What is the probability of rolling a sum of 8 or a sum of 6?

Part B: Consider a binomial experiment where observing a six is a success (not observing a six is a failure). Suppose that the number of rolls is 3.

  1. What is the probability of observing a six at least 2 times? You can use the dbinom function in R.

Part C: The binomial distribution models the number of successes in \(n\) independent trials, each of which is a “success” with probability \(p\) and a “failure” with probability \(1-p\). Discuss real-life applications of the binomial experiment.
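As a quick illustration (separate from the activity questions), a binomial experiment can be simulated in R with rbinom; the values \(n = 10\) and \(p = 0.3\) below are chosen only for demonstration.

```r
set.seed(1)

# Simulate 1000 repetitions of a binomial experiment with n = 10 independent
# trials, each a "success" with probability p = 0.3
successes <- rbinom(1000, size = 10, prob = 0.3)

# Empirical distribution of the number of successes vs. the theoretical PMF
table(successes) / 1000
round(dbinom(0:10, size = 10, prob = 0.3), 3)
```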