Title: MBA 7020 Business Analysis Foundations Decision Tree
1 MBA 7020 Business Analysis Foundations
Decision Trees and Bayes' Theorem. July 25, 2005
2 Agenda
Decision Trees
Bayes' Theorem
Problems
3 Decision Trees
- A method of visually structuring the problem
- Effective for sequential decision problems
- Two types of branches
- Decision nodes
- Chance nodes
- Terminal points
- Solving the tree involves pruning all but the best decisions
- The completed tree forms a decision rule
4 Decision Nodes
- Decision nodes are represented by Squares
- Each branch refers to an Alternative Action
- The expected return (ER) for the branch is
- The payoff if it is a terminal node, or
- The ER of the following node
- The ER of a decision node is the maximum ER over its alternative branches
5 Chance Nodes
- Chance nodes are represented by Circles
- Each branch refers to a State of Nature
- The expected return (ER) for the branch is
- The payoff if it is a terminal node, or
- The ER of the following node
- The ER of a chance node is the sum of the probability-weighted ERs of its branches: ER = Σi P(Si) Vi (see the sketch below)
6 Terminal Nodes
- Terminal nodes are optionally represented by Triangles
- The node refers to a payoff
- The value for the node is the payoff
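The rollback rules on slides 4-6 translate directly into code. A minimal Python sketch, using made-up payoffs and probabilities purely for illustration (none of these numbers come from the deck):

    # Expected-return rollback (hypothetical numbers for illustration only)
    def chance_node_er(branches):
        """ER of a chance node: sum of probability-weighted branch values."""
        return sum(p * v for p, v in branches)

    def decision_node_er(alternatives):
        """ER of a decision node: the maximum ER among its alternatives."""
        return max(alternatives.values())

    # One chance node feeding a decision node (made-up values)
    risky = chance_node_er([(0.4, 100_000), (0.6, -20_000)])   # 0.4*100,000 + 0.6*(-20,000) = 28,000
    safe = 25_000
    print(decision_node_er({"risky": risky, "safe": safe}))    # keeps the larger ER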
7 Problem 1
- Jenny Lind is a writer of romance novels. A movie company and a TV network both want exclusive rights to one of her more popular works. If she signs with the network, she will receive a single lump sum, but if she signs with the movie company, the amount she will receive depends on the market response to her movie.
- Jenny Lind's Potential Payouts
- Movie company
- Small box office: $200,000
- Medium box office: $1,000,000
- Large box office: $3,000,000
- TV network
- Flat rate: $900,000
- Questions
- How can we represent this problem?
- What decision criterion should we use?
8 Jenny Lind Payoff Table
9 Jenny Lind Decision Tree
10 Problem 2: Solving the Tree
- Start at the terminal nodes at the end and work backward
- Using the ER calculation for decision nodes, prune branches (alternative actions) that do not have the maximum ER
- When completed, the remaining branches form the sequential decision rules for the problem (a rollback sketch for the Jenny Lind problem follows below)
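As a concrete illustration of this rollback, here is a minimal Python sketch for the Jenny Lind problem. The payoffs come from slide 7; the box-office probabilities are not given in this outline, so the values below are hypothetical placeholders:

    # Rollback for the Jenny Lind decision
    # Payoffs from slide 7; probabilities are assumed placeholders, not from the deck
    p_small, p_medium, p_large = 0.3, 0.5, 0.2

    # Chance node: sign with the movie company
    er_movie = p_small * 200_000 + p_medium * 1_000_000 + p_large * 3_000_000

    # Terminal payoff: sign with the TV network (flat rate)
    er_tv = 900_000

    # Decision node: keep the alternative with the maximum ER, prune the other
    best = max([("movie company", er_movie), ("TV network", er_tv)], key=lambda x: x[1])
    print(best)

With a different set of assumed probabilities the pruned branch can change; the solved tree records the choice under the probabilities actually used.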
11 Jenny Lind Decision Tree (Solved)
12 Decision Tree Activation Test (Source: Delta Airlines SkyMiles Program)
[Decision tree diagram. Branch labels: SkyMiles enrollment; Message A; returned within xx days / did not return within xx days; Message B; Message C; graduate to SOW; if Vc >= xx, send Message D; if Vc < xx, no more messages.]
13 Probability
- The Three Requirements of Probabilities (stated symbolically below)
- All probabilities must lie in the range 0 to 1.
- The sum of the probabilities of mutually exclusive events is equal to the probability of their union.
- The total probability of a complete set of outcomes must be equal to 1.
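In symbols, for events A and B and a complete set of mutually exclusive outcomes O_1, ..., O_n:

\[
0 \le P(A) \le 1, \qquad
P(A \cup B) = P(A) + P(B) \ \text{when } A \cap B = \varnothing, \qquad
\sum_{i=1}^{n} P(O_i) = 1 .
\]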
14 Direct Marketing Campaign Platform
15 Communication Variables
- Vehicles: E-mail, Kits, Statement, Telephone, Direct Mail (USPS)
- Message / Offer (incentive)
- Hurdle (SOW)
- Trip x, get y
- Next trip (Re-Activation)
- Rate of trip triggers
- Points (double/flat?)
- Miles (front/back-end)
- Other
- Creative Execution
- Can test several executions tailored to clusters/segments
- Timing/Frequency
- Monthly (statements)
- Repeat/Follow-up Mailings
16 Measuring Effectiveness: Lift/Gains Chart
[Gains chart: percent of potential responders captured (y-axis) versus percent of population targeted (x-axis), comparing the targeting model against a random mailing baseline.]
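A gains curve of this kind can be computed directly from model scores and observed responses. A minimal Python sketch; the score and response arrays below are hypothetical, purely for illustration:

    # Cumulative gains curve from hypothetical model scores and 0/1 responses
    import numpy as np

    scores = np.array([0.9, 0.8, 0.75, 0.6, 0.4, 0.3, 0.2, 0.1])   # assumed model scores
    responded = np.array([1, 1, 0, 1, 0, 0, 1, 0])                  # assumed responses

    order = np.argsort(-scores)                        # target best-scored customers first
    captured = np.cumsum(responded[order])             # responders captured so far
    pct_targeted = np.arange(1, len(scores) + 1) / len(scores) * 100
    pct_captured = captured / responded.sum() * 100

    for x, y in zip(pct_targeted, pct_captured):
        print(f"target {x:5.1f}% of population -> capture {y:5.1f}% of responders")

A random mailing captures responders in proportion to the population targeted, so the gap between this curve and the diagonal is the lift from targeting.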
17 Example: Direct Mail Optimization (Source: InterContinental Hotels Group Priority Club Rewards Program)
- Using a multivariate model, we are able to maximize profit while minimizing costs
- In comparison to the methodology used last year, the model saved XXX
- Savings attributable to reduced mailing to achieve last year's result (variable cost savings)
- Other benefits: customer behavior insight, planning tool
18 Agenda
Bayes' Theorem
Decision Tree
Problems
19 Bayes' Theorem
- Bayes' Theorem is used to revise the probability of a particular event happening based on the fact that some other event has already happened.
- Probabilities involved
- P(Event)
- Prior probability of this particular situation
- P(Prediction | Event)
- Predictive power (likelihood) of the information source
- P(Prediction ∩ Event)
- Joint probability that both Prediction and Event occur
- P(Prediction)
- Marginal probability that this prediction is made
- P(Event | Prediction)
- Posterior probability of Event given Prediction
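These quantities fit together through the following standard identities (writing E for the Event and D for the Prediction):

\[
P(D \cap E) = P(D \mid E)\,P(E), \qquad
P(D) = P(D \mid E)\,P(E) + P(D \mid \bar{E})\,P(\bar{E}), \qquad
P(E \mid D) = \frac{P(D \cap E)}{P(D)} .
\]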
20 Bayes' Theorem
- Bayes' Theorem begins with a statement of knowledge prior to performing the experiment. Usually this prior is in the form of a probability density. It can be based on physics, on the results of other experiments, on expert opinion, or on any other source of relevant information. Now it is desirable to improve this state of knowledge, and an experiment is designed and executed to do this. Bayes' Theorem is the mechanism used to update the state of knowledge to provide a posterior distribution. The mechanics of Bayes' Theorem can sometimes be overwhelming, but the underlying idea is very straightforward: both the prior (often a prediction) and the experimental results have a joint distribution, since they are both different views of reality.
21 Bayes' Theorem
- Let the experiment be A and the prediction be B. Both have occurred. The probability of both A and B together is P(A ∩ B). The law of conditional probability says that this probability can be found as the product of the conditional probability of one, given the other, times the probability of the other. That is
- P(A ∩ B) = P(A | B) P(B) = P(B | A) P(A)
- if both P(A) and P(B) are non-zero.
- Simple algebra shows that
- P(B | A) = P(A | B) P(B) / P(A)   (equation 1)
- This is Bayes' Theorem. In words, it says that the posterior probability of B (the updated prediction) is the product of the conditional probability of the experiment, given the influence of the parameters being investigated, times the prior probability of those parameters. (Division by the total probability of A assures that the resulting quotient falls in the [0, 1] interval, as all probabilities must.)
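Equation 1 translates directly into code. A minimal sketch of a generic two-hypothesis update (this helper and its example numbers are illustrative, not part of the deck):

    # Bayes' Theorem (equation 1): posterior = likelihood * prior / marginal
    def bayes_posterior(prior_b, p_a_given_b, p_a_given_not_b):
        """Return P(B|A) from the prior P(B) and the likelihoods P(A|B), P(A|not B)."""
        p_a = p_a_given_b * prior_b + p_a_given_not_b * (1 - prior_b)  # total probability of A
        return p_a_given_b * prior_b / p_a

    # Example with made-up numbers: prior 0.5, likelihoods 0.9 and 0.2
    print(bayes_posterior(0.5, 0.9, 0.2))   # 0.45 / 0.55 = 0.818...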
22 Bayes' Theorem
23 Conditional Probability
24 Bayes' Theorem
25 Probability Information
- Prior Probabilities
- Initial beliefs or knowledge about an event (frequently subjective probabilities)
- Likelihoods
- Conditional probabilities that summarize the known performance characteristics of events (frequently objective, based on relative frequencies)
26 Circumstances for Using Bayes' Theorem
- You have the opportunity, usually at a price, to get additional information before you commit to a choice
- You have likelihood information that describes how well you should expect that source of information to perform
- You wish to revise your prior probabilities
27 Problem
- A company is planning to market a new product. The company's marketing vice-president is particularly concerned about the product's superiority over the closest competitive product, which is sold by another company. The marketing vice-president assessed the probability of the new product's superiority to be 0.7. This executive then ordered a market survey to determine the product's superiority over the competition.
- The results of the survey indicated that the product was superior to its competitor.
- Assume the market survey has the following reliability:
- If the product is really superior, the probability that the survey will indicate "superior" is 0.8.
- If the product is really worse than the competitor, the probability that the survey will indicate "superior" is 0.3.
- After completion of the market survey, what should be the vice-president's revised probability assignment to the event "new product is superior to its competitors"?
28 Joint Probability Table
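The table can be built directly from the numbers on the previous slide. A short Python sketch (variable names are illustrative; the figures come from the problem statement):

    # Joint probability table and revised probability for the market-survey problem
    p_superior, p_not_superior = 0.7, 0.3          # prior assessment
    p_says_sup_given_sup = 0.8                     # P(survey says "superior" | truly superior)
    p_says_sup_given_not = 0.3                     # P(survey says "superior" | truly worse)

    joint_sup = p_superior * p_says_sup_given_sup        # 0.56
    joint_not = p_not_superior * p_says_sup_given_not    # 0.09
    p_says_sup = joint_sup + joint_not                   # 0.65 (marginal)

    posterior = joint_sup / p_says_sup                   # about 0.862
    print(round(posterior, 3))

So the favorable survey result should raise the vice-president's probability that the product is superior from 0.7 to roughly 0.86.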
29 Agenda
Bayes' Theorem
Decision Tree
Problems
30 What kinds of problems?
- Alternatives known
- States of nature and their probabilities are known
- Payoffs computable under different possible scenarios
31 Basic Terms
- Decision Alternatives
- States of Nature (e.g., condition of the economy)
- Payoffs (outcome of a choice assuming a state of nature)
- Criteria (e.g., Expected Value)
32 Example Problem 1: Expected Value Decision Tree
33 Expected Value
34 Decision Tree
35 Example Problem 2: Sequential Decisions
- Would you hire a consultant (or a psychic) to get more information about the states of nature?
- How would additional information cause you to revise your probabilities of states of nature occurring?
- Draw a new tree depicting the complete problem.
- Consultant's Track Record
36 Example Problem 2: Sequential Decisions (Ans)
Open MBA7020Joint_Probabilities_Table.xls
- The first thing you want to do is get the information (track record) from the consultant in order to make a decision.
- This track record can be converted to look like this:
- P(F|S1) = 0.2, P(U|S1) = 0.8
- P(F|S2) = 0.6, P(U|S2) = 0.4
- P(F|S3) = 0.7, P(U|S3) = 0.3
- F = Favorable, U = Unfavorable
- Next, you take this information and apply the prior probabilities to get the Joint Probability Table (Bayes' Theorem).
37 Example Problem 2: Sequential Decisions (Ans)
Open MBA7020Joint_Probabilities_Table.xls
- The next step is to create the posterior probabilities (you will need this information to compute your expected values):
- P(S1|F) = 0.06/0.49 = 0.122
- P(S2|F) = 0.36/0.49 = 0.735
- P(S3|F) = 0.07/0.49 = 0.143
- P(S1|U) = 0.24/0.51 = 0.47
- P(S2|U) = 0.24/0.51 = 0.47
- P(S3|U) = 0.03/0.51 = 0.06
- Solve the decision tree using the posterior probabilities just computed (a verification sketch follows below).
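The joint and posterior probabilities above can be reproduced in a few lines of Python. The likelihoods come from slide 36; the prior probabilities used below are the ones implied by the joint values shown on this slide (0.06/0.2 = 0.3, 0.36/0.6 = 0.6, 0.07/0.7 = 0.1):

    # Reproduce the joint probability table and the posterior probabilities
    priors = {"S1": 0.3, "S2": 0.6, "S3": 0.1}              # implied by the joint values
    p_fav = {"S1": 0.2, "S2": 0.6, "S3": 0.7}               # P(F|Si) from the track record

    joint_f = {s: priors[s] * p_fav[s] for s in priors}         # P(F and Si): 0.06, 0.36, 0.07
    joint_u = {s: priors[s] * (1 - p_fav[s]) for s in priors}   # P(U and Si): 0.24, 0.24, 0.03

    p_f = sum(joint_f.values())    # 0.49
    p_u = sum(joint_u.values())    # 0.51

    post_f = {s: joint_f[s] / p_f for s in priors}   # P(Si|F): ~0.122, 0.735, 0.143
    post_u = {s: joint_u[s] / p_u for s in priors}   # P(Si|U): ~0.471, 0.471, 0.059
    print(post_f)
    print(post_u)

These match the posterior probabilities listed on this slide (the deck rounds P(S1|U) and P(S2|U) to 0.47 and P(S3|U) to 0.06).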