Title: Bivariate Populations
1. Bivariate Populations
2. Today's Plan
- Bivariate populations and conditional probabilities
- Joint and marginal probabilities
- Bayes' Theorem
3. A Simple E.C.P. Example
- Introduce bivariate probability with an example of empirical classical probability (E.C.P.).
- Consider a fictitious computer company. We might ask the following questions:
- What is the probability that consumers will actually buy a new computer?
- What is the probability that consumers are planning to buy a new computer?
- What is the probability that consumers are planning to buy and actually will buy a new computer?
- Given that a consumer is planning to buy, what is the probability of a purchase?
4. A Simple E.C.P. Example (2)
- Think of probability as relating to the outcome of a random event (recap).
- All probabilities fall between 0 and 1, where 0 means the event is impossible (null) and 1 means the event is certain.
- The probability of any event A is P(A) = m/n, where m is the number of outcomes in which A occurs and n is the total number of possible outcomes (a short sketch of this calculation follows).
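- A minimal Python sketch of empirical classical probability, P(A) = m/n; the outcome list is a hypothetical stand-in for the survey data, not the lecture's sample:

    # Empirical classical probability: P(A) = m / n,
    # where m = number of outcomes in which A occurred and n = total outcomes observed.
    # The outcomes below are hypothetical illustration data.
    outcomes = ["buy", "no buy", "buy", "no buy", "no buy", "buy", "no buy", "no buy"]

    m = sum(1 for o in outcomes if o == "buy")  # outcomes where event A ("buy") occurred
    n = len(outcomes)                           # total number of observed outcomes

    print(f"P(buy) = {m}/{n} = {m / n:.3f}")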
5. A Simple E.C.P. Example (3)
- The cumulative frequency is shown on the slide.
- The sample space (of 1,000 observations) looks like this.
- Before we move on, we'll look at some simple definitions.
6. A Simple E.C.P. Example (4)
- If we have an event A, there will be a complement to A, which we'll call A′ (here labeled B).
- We'll start by computing marginal probabilities.
- Event A consists of two outcomes, a1 and a2.
- The complement B also consists of two outcomes, b1 and b2.
- Two events are mutually exclusive if both events cannot occur at the same time.
- A set of events is collectively exhaustive if one of the events must occur.
7. A Simple E.C.P. Example (5)
- Computing marginal probabilities: P(A) = P(A and B1) + P(A and B2) + ... + P(A and Bk), where B1, B2, ..., Bk are k mutually exclusive and collectively exhaustive events.
- If A = planned to purchase and B = actually purchased, then
  P(planned to buy) = P(planned and did buy) + P(planned and did not buy).
- A short sketch of this calculation follows.
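- A minimal sketch of the marginal probability rule; the joint probabilities below are hypothetical placeholders, not the lecture's table values:

    # Marginal probability rule: P(A) = sum of the joint probabilities P(A and Bi)
    # over mutually exclusive, collectively exhaustive events B1, ..., Bk.
    # These joint probabilities are hypothetical illustration values.
    joint = {
        ("planned", "did buy"): 0.20,
        ("planned", "did not buy"): 0.05,
        ("did not plan", "did buy"): 0.10,
        ("did not plan", "did not buy"): 0.65,
    }

    p_planned = sum(p for (plan, _), p in joint.items() if plan == "planned")
    print(f"P(planned to buy) = {p_planned:.2f}")  # 0.20 + 0.05 = 0.25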
8. A Simple E.C.P. Example (6)
- If the two events, A and B, are mutually exclusive, then P(A or B) = P(A) + P(B).
- The general rule is written as P(A or B) = P(A) + P(B) - P(A and B).
- Example: the probability that you draw a heart or a spade from a deck of cards. They're mutually exclusive events, so
  P(Heart or Spade) = P(Heart) + P(Spade) - P(Heart and Spade) = P(Heart) + P(Spade), since P(Heart and Spade) = 0 (a short sketch follows).
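- A quick sketch of the addition rule for the card example; the suit probabilities (13/52 each) are standard-deck facts rather than values from the slide:

    # Addition rule: P(A or B) = P(A) + P(B) - P(A and B).
    # For a standard 52-card deck, hearts and spades are mutually exclusive suits.
    p_heart = 13 / 52
    p_spade = 13 / 52
    p_heart_and_spade = 0.0  # a single card cannot be both suits

    p_heart_or_spade = p_heart + p_spade - p_heart_and_spade
    print(f"P(Heart or Spade) = {p_heart_or_spade:.2f}")  # 0.50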
9. A Simple E.C.P. Example (7)
- Probability that someone planned to buy or actually did buy: use the general addition rule.
- If A is planning to purchase and B is actually purchasing, we can plug the marginal probabilities into
  P(A or B) = P(A) + P(B) - P(A and B),
  where the joint probability P(A and B) is the probability of planned and actually purchased (a sketch follows).
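- A sketch of the general addition rule applied to the marketing events; only the joint probability 200/1000 = 0.2 appears in the lecture, so the two marginal probabilities below are hypothetical placeholders:

    # General addition rule: P(A or B) = P(A) + P(B) - P(A and B).
    p_planned = 0.25                # hypothetical marginal P(planned to purchase)
    p_purchased = 0.30              # hypothetical marginal P(actually purchased)
    p_planned_and_purchased = 0.20  # joint probability from the example (200/1000)

    p_planned_or_purchased = p_planned + p_purchased - p_planned_and_purchased
    print(f"P(planned or purchased) = {p_planned_or_purchased:.2f}")  # 0.35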
10. Conditional Probabilities
- Let's leave the example for a while and consider conditional probabilities.
- Conditional probabilities are represented as P(Y|X).
- This looks similar to the conditional mean function.
- We'll use this to lead into regression line inference, and then we'll look at Bayes' theorem.
11. Conditional Probabilities (2)
- Probabilities will be defined as f(Xj, Yk), the joint probability that X takes the value Xj and Y takes the value Yk.
- If we sum over j and k, we will get 1: the sum over j and k of f(Xj, Yk) equals 1.
- We define the conditional probability as f(X|Y).
- This is read as "a function of X given Y."
- We can define this as f(X|Y) = f(X, Y) / f(Y) (a short sketch follows).
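- A minimal sketch of how a conditional probability comes out of a joint distribution; the joint probabilities here are hypothetical, not the L5_1.XLS values:

    # Conditional probability from a joint distribution: f(X|Y) = f(X, Y) / f(Y).
    # Hypothetical joint probabilities over two X values and two Y values (sum to 1).
    joint = {
        ("x1", "y1"): 0.10, ("x1", "y2"): 0.30,
        ("x2", "y1"): 0.20, ("x2", "y2"): 0.40,
    }
    assert abs(sum(joint.values()) - 1.0) < 1e-9  # summing over j and k gives 1

    f_y1 = sum(p for (_, y), p in joint.items() if y == "y1")  # marginal f(Y = y1)
    f_x1_given_y1 = joint[("x1", "y1")] / f_y1                 # f(X = x1 | Y = y1)
    print(f"f(x1 | y1) = {f_x1_given_y1:.3f}")  # 0.10 / 0.30 = 0.333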
12. Conditional Probabilities (3)
- Similarly, we can define f(Y|X) = f(X, Y) / f(X).
- Looking at our example spreadsheet (L5_1.XLS), we have a sample of weekly earnings and years of education.
- There are two statements on the spreadsheet that will clarify the difference between joint and conditional probabilities.
13. Conditional Probabilities (4)
- The joint probability is a relative frequency, and it asks: how many people earn between 600 and 799 and have 10 years of education?
- The conditional probability asks: how many people earn between 600 and 799, given that they have 10 years of education?
- On the spreadsheet I've outlined the cells that contain the highest probability in each completed year of education.
- There's a pattern you should notice (a sketch of the two calculations follows).
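- A sketch contrasting the two questions; the counts are hypothetical and do not come from L5_1.XLS:

    # Joint vs. conditional probability on (earnings bracket, years of education).
    # Hypothetical counts standing in for the spreadsheet's cross-tabulation.
    n_total = 1000                 # total people in the (hypothetical) sample
    n_educ10 = 150                 # people with 10 years of education
    n_600_799_and_educ10 = 45      # people earning 600-799 AND having 10 years of education

    p_joint = n_600_799_and_educ10 / n_total         # P(600-799 and 10 years of education)
    p_conditional = n_600_799_and_educ10 / n_educ10  # P(600-799 | 10 years of education)
    print(f"joint = {p_joint:.3f}, conditional = {p_conditional:.3f}")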
14. Conditional Probabilities (5)
- We can use the same data to graph the conditional mean function.
- The graph shows the same pattern we saw in the outlined cells.
- The conditional probability table gives us a small distribution around each year of education.
15. Conditional Probabilities (6)
- To summarize, conditional probabilities can be written as P(X|Y).
- This is read as "the probability of X given Y."
- For example: the probability that someone earns between 200 and 300, given that he/she has completed 10 years of education.
- Joint probabilities are written as P(X and Y).
- This is read as "the probability of X and Y."
- For example: the probability that someone earns between 200 and 300 and has 10 years of education.
16. A Marketing Example
- Now we'll look at joint probabilities again using the marketing example from earlier in the lecture.
- We will look at:
- Marginal probabilities, P(A) or P(B)
- Joint probabilities, P(A and B)
- Conditional probabilities, P(A|B)
17. Marketing Example (2)
- Let's look at the probability that you purchased a computer given that you planned to purchase.
- The joint probability that you purchased and planned to purchase is 200/1000 = 0.2 = 20% (see the sketch below).
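- A sketch of the conditional probability this slide leads toward; the 200/1000 joint count is from the example, but the 250 planned-to-buy total is a hypothetical placeholder:

    # P(purchased | planned) = P(planned and purchased) / P(planned).
    n_total = 1000
    n_planned_and_purchased = 200   # from the example: joint probability 200/1000 = 0.2
    n_planned = 250                 # hypothetical marginal count of planners

    p_joint = n_planned_and_purchased / n_total
    p_planned = n_planned / n_total
    p_purchased_given_planned = p_joint / p_planned
    print(f"P(purchased | planned) = {p_purchased_given_planned:.2f}")  # 0.80 with these counts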
18. Marketing Example (3)
- We can also represent this in a decision tree.
19. Statistical Independence
- Two events exhibit statistical independence if P(A|B) = P(A).
- We can change our marketing matrix to create a situation of statistical independence (a check is sketched below).
- Note: all we did was change the joint probabilities.
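- A sketch of checking statistical independence from a joint probability; the values are hypothetical numbers chosen so that the check passes:

    # Statistical independence: P(A|B) = P(A), i.e. P(A and B) = P(A) * P(B).
    p_a = 0.25             # hypothetical P(planned to purchase)
    p_b = 0.30             # hypothetical P(actually purchased)
    p_a_and_b = p_a * p_b  # joint probability constructed to equal P(A) * P(B)

    p_a_given_b = p_a_and_b / p_b
    independent = abs(p_a_given_b - p_a) < 1e-9
    print(f"P(A|B) = {p_a_given_b:.3f}, P(A) = {p_a:.3f}, independent = {independent}")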
20. Sampling w/ and w/o Replacement
- How would sampling with and without replacement change our probabilities?
- Suppose we have 20 pens (14 blue and 6 red).
- What's the probability that we pick a red pen? P(Red) = 6/20 = 0.3.
- If we replace the pen after every draw, what's the probability that we pick red twice in a row? (6/20)(6/20) = 36/400 = 0.09 = 9%.
- What's the probability of drawing two reds in a row if we don't replace after each draw? (6/20)(5/19) = 30/380 ≈ 0.079 = 7.9% (see the sketch below).
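- A minimal sketch of the two-draw calculations; the counts (14 blue, 6 red) come straight from the slide:

    # Probability of drawing two reds in a row, with and without replacement.
    from fractions import Fraction

    n_red, n_total = 6, 20

    # With replacement: the composition of the pool is the same on both draws.
    p_with = Fraction(n_red, n_total) * Fraction(n_red, n_total)

    # Without replacement: one red pen is gone before the second draw.
    p_without = Fraction(n_red, n_total) * Fraction(n_red - 1, n_total - 1)

    print(f"with replacement:    {p_with} = {float(p_with):.3f}")     # 9/100 = 0.090
    print(f"without replacement: {p_without} = {float(p_without):.3f}")  # 3/38 ≈ 0.079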
21. Bayes' Theorem
- With decision trees we had to know the probabilities of each event beforehand.
- Using Bayes' theorem we can update probabilities using the complement.
- Consider the multiplication rule: P(A and B) = P(A|B) P(B).
- The marginal probability rule says P(A) = P(A and B) + P(A and B′), where B′ is the complement of B.
22. Bayes' Theorem (2)
- Because B and B′ are mutually exclusive and collectively exhaustive, we can write P(A) another way:
  P(A) = P(A|B) P(B) + P(A|B′) P(B′).
- We can now write our conditional probability function as
  P(B|A) = P(A and B) / P(A) = P(A|B) P(B) / P(A).
- Plugging in our expression for P(A) gives us Bayes' Theorem:
  P(B|A) = P(A|B) P(B) / [P(A|B) P(B) + P(A|B′) P(B′)].
23. Bayes' Theorem (3)
- Think of Bayes' Theorem as probability in reverse.
- You can update your probabilities in light of new information.
- Suppose you have a product with a known probability of success:
  P(success) = P(S) = 0.4
  P(failure) = P(S′) = 0.6
- We also know that a consumer group will write either a favorable (F) or unfavorable report on the product:
  P(F|S) = 0.8, P(F|S′) = 0.3
24. Bayes' Theorem (4)
- Given our information, we want to find the probability that the product will be successful given a favorable report, P(S|F).
- In this case, Bayes says
  P(S|F) = P(F|S) P(S) / [P(F|S) P(S) + P(F|S′) P(S′)].
- We can plug values into the above equation to find
  P(S|F) = (0.8)(0.4) / [(0.8)(0.4) + (0.3)(0.6)] = 0.32 / 0.50 = 0.64.
- We can use the theorem to update the probability of a successful product given that the product gets a favorable report (a short sketch follows).
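- A minimal sketch of the Bayes update with the slide's numbers (P(S) = 0.4, P(F|S) = 0.8, P(F|S′) = 0.3):

    # Bayes' Theorem: P(S|F) = P(F|S) P(S) / [P(F|S) P(S) + P(F|S') P(S')].
    p_s = 0.4            # prior probability of success
    p_s_c = 1 - p_s      # prior probability of failure (complement)
    p_f_given_s = 0.8    # P(favorable report | success)
    p_f_given_s_c = 0.3  # P(favorable report | failure)

    p_f = p_f_given_s * p_s + p_f_given_s_c * p_s_c  # marginal P(favorable report)
    p_s_given_f = p_f_given_s * p_s / p_f            # posterior P(success | favorable)
    print(f"P(S|F) = {p_s_given_f:.2f}")  # 0.32 / 0.50 = 0.64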
25. Recap
- We've seen how to calculate marginal, joint, and conditional probabilities.
- Computer company example.
- Spreadsheet L5_1.XLS.
- We talked about statistical independence.
- We've seen how Bayes' Theorem allows us to update our priors.