3. In deterministic processes, the outcome can be
predicted exactly in advance
• E.g. Force = mass × acceleration. If we are given values
for mass and acceleration, we know the value of
force exactly
In random processes, the outcome is not known
exactly, but we can still describe the probability
distribution of possible outcomes
• E.g. 10 coin tosses: we don’t know exactly how many
heads we will get, but we can calculate the probability
of getting a certain number of heads
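The coin-toss example can be made concrete with a short Python sketch (Python is not part of the slides; the helper name `prob_heads` is mine). The number of heads in 10 fair tosses follows a binomial distribution:

```python
from math import comb

def prob_heads(k, n=10, p=0.5):
    """Binomial probability of exactly k heads in n fair coin tosses."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# P(exactly 5 heads in 10 tosses) = 252/1024
print(prob_heads(5))  # 0.24609375
```

We cannot predict the outcome, but we can assign a probability to each possible number of heads, and those probabilities sum to 1.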
June 3, 2008 Stat 111 - Lecture 6 - Probability
4. A random experiment is an action or process that
leads to one of several possible outcomes. For example:
Experiment       Outcomes
Flip a coin      Heads, Tails
Exam marks       Numbers: 0, 1, 2, ..., 100
Assembly time    t > 0 seconds
Course grades    F, D, C, B, A, A+
5. List the outcomes of a random experiment…
List: “Called the Sample Space”
Outcomes: “Called the Simple Events”
This list must be exhaustive, i.e. ALL possible
outcomes included.
Die roll {1,2,3,4,5} is not exhaustive; die roll {1,2,3,4,5,6} is.
The list must be mutually exclusive, i.e. no two
outcomes can occur at the same time:
Die roll {odd number, even number} is mutually exclusive;
die roll {number less than 4, even number} is not (2 is both less than 4 and even).
6. A list of exhaustive [don’t leave anything out] and mutually
exclusive [impossible for two different outcomes to
occur in the same experiment] outcomes is called a sample space
and is denoted by S.
The outcomes are denoted by O1, O2, …, Ok
Using notation from set theory, we can represent the
sample space and its outcomes as:
S = {O1, O2, …, Ok}
7. Given a sample space S = {O1, O2, …, Ok}, the
probabilities assigned to the outcomes must satisfy
these requirements:
(1) The probability of any outcome is between 0 and 1
i.e. 0 ≤ P(Oi) ≤ 1 for each i, and
(2) The sum of the probabilities of all the outcomes
equals 1
i.e. P(O1) + P(O2) + … + P(Ok) = 1
where P(Oi) represents the probability of outcome Oi
8. The probability of an event is the sum of the
probabilities of the simple events that constitute the
event.
E.g. (assuming a fair die) S = {1, 2, 3, 4, 5, 6} and
P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6
Then:
P(EVEN) = P(2) + P(4) + P(6) = 1/6 + 1/6 + 1/6 = 3/6 = 1/2
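The rule above — the probability of an event is the sum of the probabilities of its simple events — can be checked with a small Python sketch (Python and the variable names are mine, not part of the slides):

```python
# Fair die: probability of an event = sum of its simple-event probabilities.
sample_space = {1, 2, 3, 4, 5, 6}
p = {outcome: 1/6 for outcome in sample_space}

even = {2, 4, 6}
p_even = sum(p[o] for o in even)
print(p_even)  # 0.5
```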
9. Conditional probability is used to determine how
two events are related; that is, we can determine the
probability of one event given the occurrence of
another related event.
Experiment: randomly select one student in class.
P(randomly selected student is male) =
P(randomly selected student is male | student is in the 3rd
row) =
Conditional probabilities are written as P(A | B), read
as “the probability of A given B”, and calculated
as:
P(A | B) = P(A and B) / P(B)
10. Again, the probability of an event given that another event
has occurred is called a conditional probability…
P(A and B) = P(A)*P(B|A) = P(B)*P(A|B); both are true
Keep this in mind!
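The two factorizations of P(A and B) can be verified by enumeration. A Python sketch (not from the slides; the events A and B here are my own illustrative choices) with two fair dice:

```python
from itertools import product

# Enumerate two fair dice; check P(A and B) = P(A)*P(B|A) = P(B)*P(A|B).
outcomes = list(product(range(1, 7), repeat=2))  # 36 equally likely pairs
A = {o for o in outcomes if o[0] % 2 == 0}       # first die is even
B = {o for o in outcomes if sum(o) == 7}         # sum equals 7

p = lambda e: len(e) / len(outcomes)
p_b_given_a = len(A & B) / len(A)
p_a_given_b = len(A & B) / len(B)

print(p(A & B))            # 3/36 ≈ 0.0833
print(p(A) * p_b_given_a)  # same value (up to rounding)
print(p(B) * p_a_given_b)  # same value (up to rounding)
```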
11. An event is an outcome or a set of outcomes of a
random process
Example: Tossing a coin three times
Event A = getting exactly two heads = {HTH, HHT, THH}
Example: Picking real number X between 1 and 20
Event A = chosen number is at most 8.23 = {X ≤ 8.23}
Example: Tossing a fair die
Event A = result is an even number = {2, 4, 6}
• Notation: P(A) = Probability of event A
• Probability Rule 1:
0 ≤ P(A) ≤ 1 for any event A
12. Events A and B are independent if knowing that A
occurs does not affect the probability that B occurs
Example: tossing two coins
Event A = first coin is a head
Event B = second coin is a head
Disjoint events cannot be independent!
• If A and B cannot occur together (disjoint), then knowing that A
occurs does change the probability that B occurs
Probability Rule 5: If A and B are independent
P(A and B) = P(A) x P(B)
This is the multiplication rule for independent events.
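The two-coin example from this slide can be checked exhaustively in Python (the code and names are mine, a sketch of the slides' example):

```python
from itertools import product

# Two fair coins: A = first coin is a head, B = second coin is a head.
outcomes = set(product("HT", repeat=2))  # {HH, HT, TH, TT}
A = {o for o in outcomes if o[0] == "H"}
B = {o for o in outcomes if o[1] == "H"}

p_a = len(A) / len(outcomes)       # 0.5
p_b = len(B) / len(outcomes)       # 0.5
p_ab = len(A & B) / len(outcomes)  # 0.25
print(p_ab == p_a * p_b)  # True: A and B are independent
```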
13. One of the objectives of calculating conditional probability
is to determine whether two events are related.
In particular, we would like to know whether they are
independent, that is, if the probability of one event is not
affected by the occurrence of the other event.
Two events A and B are said to be independent if
P(A|B) = P(A)
and
P(B|A) = P(B)
P(you have a flat tire going home | your radio quits working)
14. One event has no influence on the outcome of
another event:
if events A & B are independent,
then P(A & B) = P(A)*P(B)
if P(A & B) = P(A)*P(B),
then events A & B are independent
Coin flipping: if P(H) = P(T) = .5, then
P(HTHTH) = P(HHHHH) =
.5*.5*.5*.5*.5 = .5^5 ≈ .03
17. A graduate statistics course has seven male and three
female students. The professor wants to select two students
at random to help her conduct a research project. What is
the probability that the two students chosen are female?
P(F1 and F2) = ?
Let F1 represent the event that the first student is female:
P(F1) = 3/10 = .30
What about the second student?
P(F2 | F1) = 2/9 ≈ .22
P(F1 and F2) = P(F1) * P(F2 | F1) = (3/10)*(2/9) = 6/90 ≈ .067
NOTE: the two events are NOT independent.
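The two-female-students answer can be confirmed by listing every possible pair directly. A Python sketch (not part of the slides; the list layout is my own):

```python
from itertools import combinations

# 7 male (M) and 3 female (F) students; two are chosen at random.
students = ["M"] * 7 + ["F"] * 3
pairs = list(combinations(range(10), 2))  # 45 equally likely pairs

both_female = [pr for pr in pairs
               if students[pr[0]] == "F" and students[pr[1]] == "F"]
p_both = len(both_female) / len(pairs)
print(p_both)          # 3/45 = 1/15 ≈ 0.0667
print((3/10) * (2/9))  # same value via the multiplication rule (up to rounding)
```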
18. The professor in Example 6.5 is unavailable. Her
replacement will teach two classes. His style is to select
one student at random in each class and call on him or
her. What is the probability that the two students
chosen are female?
Both classes have 3 female and 7 male students.
P(F1 and F2) = P(F1) * P(F2 | F1) = P(F1) * P(F2)
= (3/10) * (3/10) = 9/100 = 0.09
NOTE: the two events ARE independent.
19. The Addition Rule
The probability that at least one of the events A or B
will occur, P(A or B), is given by:
P(A or B) = P(A) + P(B) − P(A and B)
If events A and B are mutually exclusive, then the
addition rule simplifies to:
P(A or B) = P(A) + P(B)
This simplified rule can be extended to any number of
mutually exclusive events.
20. If A and B are mutually exclusive, the occurrence of
one event makes the other one impossible. This means
that
P(A and B) = P(A * B) = 0
The addition rule for mutually exclusive events is
P(A or B) = P(A) + P(B)
Only if A and B are Mutually Exclusive.
21. In a large city, two newspapers are published, the Sun and
the Post. The circulation departments report that 22% of
the city’s households have a subscription to the Sun and
35% subscribe to the Post. A survey reveals that 6% of all
households subscribe to both newspapers. What
proportion of the city’s households subscribe to either
newspaper?
That is, what is the probability of selecting a household at
random that subscribes to the Sun or the Post or both?
P(Sun or Post) = P(Sun) + P(Post) – P(Sun and Post)
= .22 + .35 – .06 = .51
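The newspaper calculation above is a direct application of the addition rule; a minimal Python sketch with the figures from the example (the variable names are mine):

```python
# Addition rule with the newspaper subscription figures from the example.
p_sun, p_post, p_both = 0.22, 0.35, 0.06
p_either = p_sun + p_post - p_both  # P(Sun or Post)
print(round(p_either, 2))  # 0.51
```

Subtracting P(Sun and Post) prevents double-counting the households that subscribe to both papers.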
22. Bayes’ Law is named for Thomas Bayes, an eighteenth
century mathematician.
In its most basic form, if we know P(B | A),
we can apply Bayes’ Law to determine P(A | B)
P(A | B) = P(B | A) * P(A) / P(B)
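A worked Bayes' Law computation as a Python sketch. The slides give no numbers here, so the figures below (a rare condition and an imperfect test) are purely illustrative assumptions of mine:

```python
# Bayes' Law: P(A|B) = P(B|A) * P(A) / P(B), with
# P(B) = P(B|A)*P(A) + P(B|not A)*P(not A).
# The numbers below are illustrative only (not from the slides):
p_a = 0.01          # P(A): prior probability of the condition
p_b_given_a = 0.95  # P(B|A): test positive given the condition
p_b_given_not_a = 0.05

p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 3))  # ≈ 0.161
```

Knowing P(B | A) and the prior P(A) is enough to reverse the conditioning and obtain P(A | B).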
24. Conditional Probability
Discrete Random Variables
Continuous Random Variables
Properties of Random Variables
• Means of Random Variables
• Variances of Random Variables
June 4, 2008 Stat 111 - Lecture 6 - Random Variables
25. Conditional Probability
Conditional probability is the probability of an event
occurring, given that another event has already
occurred.
Conditional probability restricts the sample space.
The conditional probability of event B occurring, given
that event A has occurred, is denoted by P(B|A) and is
read as “probability of B, given A.”
We use conditional probability when two events
occurring in sequence are not independent. In other
words, the fact that the first event (event A) has occurred
affects the probability that the second event (event B)
will occur.
26. A random variable is a numerical outcome of a
random process or random event
Example: three tosses of a coin
• S = {HHH,THH,HTH,HHT,HTT,THT,TTH,TTT}
• Random variable X = number of observed tails
• Possible values for X = {0,1, 2, 3}
• Why do we need random variables?
• We use them as a model for our observed data
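The three-toss example can be turned into a full probability distribution for X by enumeration. A Python sketch (Python is not part of the slides; the names are mine):

```python
from itertools import product
from collections import Counter

# X = number of tails in three tosses of a fair coin.
outcomes = list(product("HT", repeat=3))  # 8 equally likely outcomes
counts = Counter(o.count("T") for o in outcomes)
dist = {x: counts[x] / 8 for x in sorted(counts)}
print(dist)  # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}
```

Each outcome in S maps to a number, and grouping outcomes by that number gives the distribution of X.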
27. A random variable is a real-valued function whose
domain is the sample space of a random experiment
Let us consider the experiment of tossing a coin two
times in succession.
The sample space of the experiment is S = {HH, HT,
TH, TT}.
For example, X = number of heads is a random variable
on S: X(HH) = 2, X(HT) = X(TH) = 1, X(TT) = 0.
28. A discrete random variable has a finite or
countable number of distinct values
Discrete random variables can be summarized by
listing all values along with the probabilities
• Called a probability distribution
Example: number of members in US families
X 2 3 4 5 6 7
P(X) 0.413 0.236 0.211 0.090 0.032 0.018
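The family-size table above is a valid probability distribution; a quick Python sketch (my own check, not from the slides) verifies the two requirements from slide 7:

```python
# Probability distribution of X = number of members in US families (table above).
dist = {2: 0.413, 3: 0.236, 4: 0.211, 5: 0.090, 6: 0.032, 7: 0.018}

# Check the two probability requirements: each P(x) in [0, 1], total equals 1.
print(all(0 <= p <= 1 for p in dist.values()))  # True
print(abs(sum(dist.values()) - 1.0) < 1e-9)     # True
```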
29. Continuous random variables have an uncountable
number of values
Can’t list the entire probability distribution, so we
use a density curve instead of a histogram
E.g. Normal density curve:
30. Average of all possible values of a random
variable (often called expected value)
Notation: we don’t want to confuse random
variables with our collected data variables:
μ = mean of a random variable
x̄ = mean of a data variable
For continuous r.v, we again need integration to
calculate the mean
For discrete r.v., we can calculate the mean by
hand since we can list all probabilities
32. Mean is the sum of all possible values, with
each value weighted by its probability:
μ = Σ xi*P(xi) = x1*P(x1) + … + xk*P(xk)
Example: X = sum of two dice
μ = 2⋅ (1/36) + 3⋅ (2/36) + 4 ⋅ (3/36) +…+12⋅ (1/36)
= 252/36 = 7
X 2 3 4 5 6 7 8 9 10 11 12
P(X) 1/36 2/36 3/36 4/36 5/36 6/36 5/36 4/36 3/36 2/36 1/36
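The mean of the two-dice sum can be computed from the table above in a few lines of Python (a sketch of mine; the closed-form `(6 - |x - 7|)/36` just reproduces the table's probabilities):

```python
# Distribution of X = sum of two fair dice (table above), and its mean.
dist = {x: (6 - abs(x - 7)) / 36 for x in range(2, 13)}  # 1/36, 2/36, ..., 1/36

mu = sum(x * p for x, p in dist.items())
print(round(mu, 6))  # 7.0
```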
33. Spread of all possible values of a random
variable around its mean
Again, we don’t want to confuse random
variables with our collected data variables:
σ² = variance of a random variable
s² = variance of a data variable
For continuous r.v, again need integration to
calculate the variance
For discrete r.v., can calculate the variance by
hand since we can list all probabilities
37. Variance is the sum of the squared deviations from
the mean of all possible values, weighted by
each value’s probability:
σ² = Σ (xi − μ)²*P(xi) = (x1 − μ)²*P(x1) + … + (xk − μ)²*P(xk)
Example: X = sum of two dice
σ2 = (2 - 7)2⋅(1/36) + (3− 7)2⋅(2/36) +…+(12 - 7)2⋅(1/36)
= 210/36 = 5.83
X 2 3 4 5 6 7 8 9 10 11 12
P(X) 1/36 2/36 3/36 4/36 5/36 6/36 5/36 4/36 3/36 2/36 1/36
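The variance calculation for the two-dice sum follows the same pattern as the mean; a Python sketch of mine using the same distribution:

```python
# Variance of X = sum of two fair dice: sigma^2 = sum of (x - mu)^2 * P(x).
dist = {x: (6 - abs(x - 7)) / 36 for x in range(2, 13)}

mu = sum(x * p for x, p in dist.items())               # 7
var = sum((x - mu) ** 2 * p for x, p in dist.items())  # 210/36
print(round(var, 2))  # 5.83
```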
38. A tossed coin shows a ‘head’ or a ‘tail’, and
a manufactured item can be ‘defective’ or ‘non-
defective’; it is customary to call one of the outcomes
a ‘success’ and the other a ‘failure’.
Each time we toss a coin or roll a die, we call it a trial.
Trials with two outcomes, usually
referred to as ‘success’ or ‘failure’, are called Bernoulli
trials.
Trials of a random experiment are called Bernoulli
trials, if they satisfy