Title: Binomial Distribution
1. Binomial Distribution and Bayes Theorem
2. Questions
- What is a probability?
- What is the probability of obtaining 2 heads in 4 coin tosses? What is the probability of obtaining 2 or more heads in 4 coin tosses?
- Give a concrete illustration of p(D|H) and p(H|D). Why might these be different?
3. Probability of Binary Events
- Probability of success: p
- p(success) = p
- Probability of failure: q
- p(failure) = q
- p + q = 1
- q = 1 - p
- Probability = long-run relative frequency
4. Permutations and Combinations 1
- Suppose we flip a coin 2 times
- H H
- H T
- T H
- T T
- The sample space shows 4 possible outcomes or sequences. Each sequence is a permutation. Order matters.
- There are 2 ways to get a total of one head (HT and TH). These are combinations. Order does NOT matter.
5. Perm and Comb 2
- HH, HT, TH, TT
- Suppose our interest is Heads. If the coin is fair, p(Heads) = .5 and q = 1 - p = .5.
- The probability of any permutation for 2 trials is 1/4: pp, pq, qp, or qq. All permutations are equally probable.
- The probability of exactly 1 head in any order is 2/4 = .5, i.e., (HT + TH)/(HH + HT + TH + TT). What is the probability of at least 1 head?
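The counts above can be checked by enumerating the sample space; a minimal Python sketch (variable names are illustrative):

```python
from itertools import product

# All permutations (ordered sequences) of 2 fair-coin flips.
space = list(product("HT", repeat=2))  # HH, HT, TH, TT

# Each permutation has probability 1/4; combinations group permutations
# by number of heads, ignoring order.
p_exactly_1 = sum(1 for s in space if s.count("H") == 1) / len(space)
p_at_least_1 = sum(1 for s in space if s.count("H") >= 1) / len(space)
print(p_exactly_1, p_at_least_1)  # 0.5 0.75
```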
6. Perm and Comb 3
- 3 flips
- HHH,
- HHT, HTH, THH
- HTT, THT, TTH
- TTT
- All permutations are equally likely: p·p·p = .5^3 = .125 = 1/8
- p(1 head) = 3/8
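The same enumeration extends to 3 flips; a short sketch (`p_one_head` is an illustrative name):

```python
from itertools import product

# All 8 permutations of 3 fair-coin flips; each has probability .5^3 = 1/8.
space = list(product("HT", repeat=3))
p_one_head = sum(1 for s in space if s.count("H") == 1) / len(space)
print(p_one_head)  # 0.375 = 3/8 (HTT, THT, TTH)
```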
7. Perm and Comb 4
- Factorials: N!
- 4! = 4·3·2·1 = 24
- 3! = 3·2·1 = 6
- Combinations: NCr
- The number of ways of selecting r combinations of N objects, regardless of order: NCr = N!/(r!(N-r)!). Say 2 heads from 5 trials: 5C2 = 10.
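Python's standard library exposes both operations directly, for instance:

```python
import math

print(math.factorial(4))  # 24 = 4*3*2*1
print(math.factorial(3))  # 6 = 3*2*1

# NCr: ways to choose r of N objects, order ignored,
# e.g. 2 heads from 5 trials.
print(math.comb(5, 2))    # 10 = 5!/(2!*3!)
```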
8. Binomial Distribution 1
- A binomial distribution has parameters N and p: N is the number of trials, p is the probability of success.
- Suppose we flip a fair coin 5 times; then p = q = .5.
9. Binomial 2
Heads  p(Heads)
5      .03125
4      .15625
3      .3125
2      .3125
1      .15625
0      .03125
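The table can be reproduced from the binomial formula p(k) = NCk · p^k · q^(N-k); a sketch:

```python
import math

N, p = 5, 0.5
probs = {k: math.comb(N, k) * p**k * (1 - p)**(N - k) for k in range(N + 1)}
for k in range(N, -1, -1):
    print(k, probs[k])  # 5 -> 0.03125, 4 -> 0.15625, 3 -> 0.3125, ...
```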
10. Binomial 3
- Flip coins and compare observed to expected
frequencies
11. Binomial 4
- Find expected frequencies for number of 1s from a
6-sided die in five rolls.
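Here the binomial applies with N = 5 and p = 1/6. A sketch of the probabilities (multiply each by the number of 5-roll experiments to get expected frequencies):

```python
import math

N, p = 5, 1 / 6  # five rolls; "success" = rolling a 1
probs = {k: math.comb(N, k) * p**k * (1 - p)**(N - k) for k in range(N + 1)}
for k, pr in probs.items():
    print(k, round(pr, 4))  # 0 and 1 ones are the most likely counts
```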
12. Binomial 5
- When p is .5, as N increases, the binomial
approximates the Normal.
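A quick check of that approximation at N = 100, p = .5: compare binomial probabilities with the normal density at mean Np and SD sqrt(Npq). A sketch; function names are illustrative:

```python
import math

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def normal_pdf(x, mu, sigma):
    return math.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

n, p = 100, 0.5
mu, sigma = n * p, math.sqrt(n * p * (1 - p))  # 50, 5
for k in (40, 45, 50, 55, 60):
    # The two columns agree to about 3 decimal places.
    print(k, round(binom_pmf(k, n, p), 4), round(normal_pdf(k, mu, sigma), 4))
```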
13. Review
- What is a probability?
- What is the probability of obtaining 2 heads in 4
coin tosses? What is the probability of
obtaining 2 or more heads in 4 coin tosses?
14. Bayes Theorem (1)
Bayesian statistics are about the revision of
belief. Bayesian statisticians look into
statistically optimal ways of combining new
information with old beliefs.
- Prior probability: personal belief or data. Input.
- Likelihood: likelihood of the data given the hypothesis.
- Posterior probability: probability of the hypothesis given the data.
Scientists are interested in substantive hypotheses, e.g., does Nicorette help people stop smoking? The p level that comes from the study is the probability of the sample data given the hypothesis, not the probability of the hypothesis given the data. That is, the study gives p(D|H), not p(H|D).
15. Bayes Theorem (2)
Bayes' theorem is old and mathematically correct, but its use is controversial. Suppose you have a hunch about the null (H0) and the alternative (H1) that specifies the probability of each before you do a study. The probabilities p(H0) and p(H1) are priors. The likelihoods are p(y|H0) and p(y|H1), the standard p values. The posterior is given by
p(H0|y) = p(y|H0)p(H0) / [p(y|H0)p(H0) + p(y|H1)p(H1)]
p(H1|y) = 1 - p(H0|y)
16. Bayes Theorem (3)
Suppose, before a study is done, that the two hypotheses are H0: p = .80 and H1: p = .40 for the proportion of male grad students. Before the study, we figure that the probability is .75 that H0 is true and .25 that H1 is true. We grab 10 grad students at random and find that 6 of 10 are male. The binomial applies.
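Working the example through, with binomial likelihoods for 6 males in 10 under each hypothesis and priors of .75 and .25; a sketch:

```python
import math

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

prior_h0, prior_h1 = 0.75, 0.25
lik_h0 = binom_pmf(6, 10, 0.80)  # p(y|H0): 6 of 10 male if p = .80
lik_h1 = binom_pmf(6, 10, 0.40)  # p(y|H1): 6 of 10 male if p = .40

post_h0 = lik_h0 * prior_h0 / (lik_h0 * prior_h0 + lik_h1 * prior_h1)
print(round(post_h0, 2))  # 0.7 -- belief in H0 revised from .75 to .70
```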
17. Bayes Theorem (4)
Bayes' theorem says we should revise our belief in the probability that H0 is true from .75 to .70 based on the new data. That is a small change here, but it can be quite large depending on the data and the prior. There are problems with the choice of prior; these are handled by empirical data or by flat priors. There are Bayesian applications to more complicated situations (e.g., means and correlations). Bayesian methods are not used much in psychology yet, except in meta-analysis (empirical Bayes estimates) and judgment studies (taxis, etc.). Rules for exchangeability (admissible data) need to be worked out.
18. Review
Give a concrete illustration of p(D|H) and p(H|D). Why might these be different?