Title: Applications of Approximate Reasoning in Decision Analysis
1 Applications of Approximate Reasoning in Decision Analysis
- Terry Bott and Steve Eisenhawer
- Probabilistic Risk and Hazard Analysis, D-11
- Decision Applications Division
- Los Alamos National Laboratory
- Workshop on Novel Approaches to Uncertainty Quantification - Los Alamos, New Mexico
- February 28, 2002
LA-UR-02-1628
2 LANL Better descriptions, title
Sample of Problems Addressed During Development of Methodology
- Risk of Sabotage to Space Shuttle (1986)
- State of Knowledge for Paths to Inadvertent Nuclear Detonation
- Rank HE Aging Paths Requiring HE Replacement
- Risk of Information Loss During Foreign Visits
- Prioritize Spare Parts Procurement
- Likelihood Ranking of Item Loss Scenarios (2002)
3 Elements of the Decision Process
- Determine possibilities
- Select metric to rank the possibilities
- Design an inferential model for the metric
- Rank the possibilities
- Express uncertainty in the results
- Make results useful to the customer
Our approach is based upon the theory of approximate reasoning (AR). We call our version of AR Logic-Evolved Decision (LED) models.
4 LANL Check colors, look at animation, mention uncertainty
5 LANL Can just say this one
Identifying Possibilities
Sponsor Supplied
- Well understood problem
- Limited scope of interest
Logically Evolved
- Possibilities developed deductively
- Process aided by a visual interface with equations
- Simultaneous set of logic equations produced
- Solution is a disjoint set of paths, implicants, etc.
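As a toy illustration of this step (not the LED tooling itself), the Python sketch below expands a nested AND/OR logic tree over basic events into its paths. The event names are invented, and a further absorption/disjointing pass would be needed to guarantee a truly disjoint set.

```python
def expand(node):
    """Return the paths (sets of basic events) that satisfy `node`."""
    if isinstance(node, str):                       # a basic event
        return [frozenset([node])]
    op, children = node                             # ("AND" or "OR", [children])
    child_paths = [expand(c) for c in children]
    if op == "OR":                                  # any child path satisfies an OR
        return [p for paths in child_paths for p in paths]
    paths = [frozenset()]                           # AND: combine one path per child
    for child in child_paths:
        paths = [p | q for p in paths for q in child]
    return paths

# Toy top event: the item is unaccounted for AND it was either thrown out
# OR deliberately targeted and collected.
top = ("AND", ["ITEM_UNACCOUNTED",
               ("OR", ["THROWN_OUT", ("AND", ["TARGETED", "COLLECTED"])])])

for path in expand(top):
    print(sorted(path))
```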
6 Problem Statement
An item turns up missing in an inventory. Initial investigation produces conflicting and confusing evidence. Management wishes to narrow down the possible loss scenarios, identify actions to reduce the likelihood of future events, and conduct a damage assessment.
7 We interpreted these needs as: What can be inferred, in a short time and with limited effort, about the relative likelihood of the possible loss scenarios, given a limited amount and quality of evidence?
8 LANL Check colors, look at animation, mention uncertainty
9 Something is missing...
It's somebody else's. It's the one without a barcode. It's lost, but there's nothing important on it.
What is it?
There's something important on it. It's compromised.
10
- Thrown out and retrieved
- Targeted collection
- Opportunistic collection
- No compromise of information
11 The underlying logical equation that describes the possibilities...
The tree is a GUI to deduce this equation...
12 Solution of the logical equation yields multiple paths. These paths are the loss scenarios.
UNACCOUNTEDFCeog, ITEMISIDENTIFIEDag,
FCCAMERAbe, UNMARKEDbe, BARCODEASSIGNEDbe,
BARCODEMISSINGog, BARCODECAMEOFFbe
UNACCOUNTEDFCeog, CLASSIFIEDLOSSca,
CLASSIFIEDFCag, FLASHCARDCLASSog,
OWNERCLASSIFIEDbe, PROVENANCEog,
OLYMPUSFLASHCARDog, LEGACYog, UAPLACEDINSAFEbe,
LOSSPROCESSog, WITHCOMPROMISEog,
LLWASTERETREIVALag, INWASTEog, SWEPTUPBYCLEANERag,
DROPPEDONFLOORbe, UNMARKEDbe, BARCODEINEFFECTog,
BARCODEMISSINGog, BCNOTATTACHEDbe, PUTINTRASHbe,
THROWNOUTINLOCbe, LLWASTEACCESSbe,
LLWASTERETREIVEDbe
How should we rank order them?
13 Solution of the logical equation yields multiple paths. These paths are the loss scenarios.
UNACCOUNTEDFCeog, ITEMISIDENTIFIEDag,
FCCAMERAbe, UNMARKEDbe, BARCODEASSIGNEDbe,
BARCODEMISSINGog, BARCODECAMEOFFbe
Top Event, Connective Element, Active Element
UNACCOUNTEDFCeog, CLASSIFIEDLOSSca,
CLASSIFIEDFCag, FLASHCARDCLASSog,
OWNERCLASSIFIEDbe, PROVENANCEog,
OLYMPUSFLASHCARDog, LEGACYog, UAPLACEDINSAFEbe,
LOSSPROCESSog, WITHCOMPROMISEog,
LLWASTERETREIVALag, INWASTEog, SWEPTUPBYCLEANERag,
DROPPEDONFLOORbe, UNMARKEDbe, BARCODEINEFFECTog,
BARCODEMISSINGog, BCNOTATTACHEDbe, PUTINTRASHbe,
THROWNOUTINLOCbe, LLWASTEACCESSbe,
LLWASTERETREIVEDbe
We use the active elements as the primary inputs
to our inferential model.
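A minimal sketch, with invented names and explicit type tags, of pulling the active elements out of a solved path so they can be fed to the inferential model. In the paths shown above the element type is encoded as a suffix on each identifier; here it is carried separately to avoid guessing that encoding.

```python
scenario = [
    ("UNACCOUNTED_FC", "top event"),
    ("ITEM_IS_IDENTIFIED", "connective"),
    ("FC_CAMERA", "active"),
    ("UNMARKED", "active"),
    ("BARCODE_CAME_OFF", "active"),
]

def active_elements(path):
    """Active elements are the primary inputs to the inferential model."""
    return [name for name, kind in path if kind == "active"]

print(active_elements(scenario))
```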
14 LANL Check colors, look at animation, mention uncertainty
15 LANL Just say it
Determining Precision
Precision should be based on
- Customer's needs
- Available knowledge
- Available time and resources
In mixed-precision situations, our rule is ultimately to convert precise knowledge into imprecise knowledge.
16 Inference Precision
Precision describes how knowledge is expressed: numerical knowledge is precise; linguistic knowledge is imprecise.
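As an illustration of converting a precise value into the imprecise form used downstream, here is a minimal Python sketch assuming triangular membership functions; the descriptor names and break points are invented.

```python
def triangular(x, lo, peak, hi):
    """Degree of membership of x in a triangular fuzzy set (lo, peak, hi)."""
    if x <= lo or x >= hi:
        return 0.0
    if x <= peak:
        return (x - lo) / (peak - lo)
    return (hi - x) / (hi - peak)

# Hypothetical descriptors for a 0-1 likelihood scale.
DESCRIPTORS = {
    "Very Low":  (-0.01, 0.0, 0.25),
    "Low":       (0.0, 0.25, 0.5),
    "High":      (0.25, 0.5, 1.0),
    "Very High": (0.5, 1.0, 1.01),
}

value = 0.4   # a precise input, e.g. a measured frequency
memberships = {name: round(triangular(value, *abc), 2)
               for name, abc in DESCRIPTORS.items()}
print(memberships)   # the imprecise restatement of the precise value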
17 Inferential Evaluation
Factors: linguistic variables that influence an inference
Combinant Equations: how factors combine to produce an inference
Descriptors: linguistic values for each factor
Implicant Equations: rules for combining linguistic values to infer resultant linguistic values. This includes simple modus ponens as well as more complex rules.
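A minimal sketch of the four ingredients just listed, using a made-up two-factor inference; the factor names, descriptors, and rules below are illustrative, not taken from the slides.

```python
FACTORS = ["Ease of Access", "Attractiveness"]        # linguistic variables

DESCRIPTORS = {                                        # linguistic values
    "Ease of Access":  ["Low", "Moderate", "High"],
    "Attractiveness":  ["Low", "High"],
    "Loss Likelihood": ["Very Low", "Low", "High"],    # the inferred metric
}

# Implicant rules: IF Ease of Access is X AND Attractiveness is Y
# THEN Loss Likelihood is Z.  (A full rule base covers every combination.)
RULES = {
    ("Low", "Low"):       "Very Low",
    ("Low", "High"):      "Low",
    ("Moderate", "Low"):  "Low",
    ("Moderate", "High"): "High",
    ("High", "Low"):      "Low",
    ("High", "High"):     "High",
}

def infer(ease, attract):
    """Combinant equation: look the factor values up in the implicant rules."""
    return RULES[(ease, attract)]

print(infer("Moderate", "High"))   # -> High
```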
18 Linked Inferences
D_x ∧ E_y → F_z
(A_i ∧ B_j) ∨ (A_k ∧ B_m) → C_n
(A_i ∧ B_j) → C_p
(F_z ∧ C_n) ∨ (F_z ∧ C_p) → G_a
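A minimal sketch of the chaining shown above: the linguistic value inferred by one rule base becomes a factor in the next. The variable names follow the equations, but the rule bases are reduced to single-entry placeholder lookup tables.

```python
RULES_F = {("Dx", "Ey"): "Fz"}            # D and E -> F
RULES_C = {("Ai", "Bj"): "Cn"}            # A and B -> C
RULES_G = {("Fz", "Cn"): "Ga"}            # F and C -> G

def chain(d, e, a, b):
    f = RULES_F[(d, e)]                   # first-level inferences
    c = RULES_C[(a, b)]
    return RULES_G[(f, c)]                # second-level inference uses their outputs

print(chain("Dx", "Ey", "Ai", "Bj"))      # -> Ga
```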
19 LANL This needs to be clarified
Types of Uncertainty We Consider
Set Assignment Uncertainty
- Ambiguity: linguistic variables are not precise
- Set assignment: the informant is uncertain which linguistic value is best
Outcome Uncertainty
- Probability: appropriate for describing uncertainty in precise knowledge
- Possibility: appropriate for describing uncertainty in imprecise knowledge
LANL We're set up to discuss inference and uncertainty together in the rest of the example
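A minimal sketch of the contrast drawn above; the numbers are illustrative. A probability distribution over precise outcomes must sum to one, while a possibility distribution over linguistic values only needs a maximum of one (the informant may find several values fully possible).

```python
probability_of_failures = {0: 0.6, 1: 0.3, 2: 0.1}         # precise outcomes
possibility_of_likelihood = {"Very Low": 0.2, "Low": 1.0,   # imprecise values
                             "High": 0.7}

assert abs(sum(probability_of_failures.values()) - 1.0) < 1e-9
assert max(possibility_of_likelihood.values()) == 1.0
print("probability sums to one; possibility peaks at one")
```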
20 Inference diagram: Supporting Evidence and Contradictory Evidence are combined by Rule 2 into the Aggregate Evidence; Rule 1 then combines the Aggregate Evidence with the Prior Likelihood to give the Posterior Likelihood.
LANL Likelihood - imprecise outcome possibility; Evidence - imprecise assignment and more...
21 Individual Element Likelihood
Scenario Likelihood
22 Scenario Likelihood Change
Scenario Prior Likelihood
Scenario Posterior Likelihood
23 LANL Intro to look in more detail
24 Aggregate Evidence
Rule 2
25 Rule 2
ESmax: None, Weakly Supporting, Strongly Supporting
ECmax: None, Weakly Contradictory, Strongly Contradictory
Eagg: None, Strongly Contradictory, Weakly Contradictory, Conflicting, Weakly Supporting, Strongly Supporting
IF ESmax is Weakly Supporting AND ECmax is Strongly Contradictory THEN Eagg is Weakly Contradictory
LANL Evaluation with min-max, more on this later
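The note above mentions evaluation with min-max; the following sketch illustrates it on a fragment of Rule 2. Only the quoted rule comes from the slide; the other two entries and all the input possibility values are invented for illustration.

```python
ES_MAX = {"None": 0.0, "Weakly Supporting": 0.8, "Strongly Supporting": 0.3}
EC_MAX = {"None": 0.1, "Weakly Contradictory": 0.4, "Strongly Contradictory": 0.9}

RULE_2 = [   # (ESmax descriptor, ECmax descriptor) -> Eagg descriptor
    (("Weakly Supporting", "Strongly Contradictory"), "Weakly Contradictory"),
    (("Strongly Supporting", "None"), "Strongly Supporting"),
    (("None", "Strongly Contradictory"), "Strongly Contradictory"),
]

e_agg = {}
for (es, ec), out in RULE_2:
    strength = min(ES_MAX[es], EC_MAX[ec])           # min across the antecedents
    e_agg[out] = max(e_agg.get(out, 0.0), strength)  # max across rules with the same consequent
print(e_agg)
```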
26 LANL Rulebase here
L_E(Pr): Very Low, Low, High
Eagg: None, Strongly Contradictory, Weakly Contradictory, Conflicting, Weakly Supporting, Strongly Supporting
L_E(Po): Very Low, Low, High, Very High
The rule base shifts the posterior likelihood based upon the specific forensic evidence available.
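A minimal sketch of the kind of shift this rule base performs, moving the prior L_E(Pr) up or down to a posterior L_E(Po) according to the aggregate evidence. The shift amounts below are invented; the real rule base enumerates every (prior, aggregate evidence) pair explicitly.

```python
PRIOR_SCALE = ["Very Low", "Low", "High"]
POSTERIOR_SCALE = ["Very Low", "Low", "High", "Very High"]

SHIFT = {"Strongly Contradictory": -1, "Weakly Contradictory": -1,
         "None": 0, "Conflicting": 0,
         "Weakly Supporting": +1, "Strongly Supporting": +2}

def posterior(prior, e_agg):
    """Move the prior up or down the posterior scale according to the evidence."""
    assert prior in PRIOR_SCALE
    idx = POSTERIOR_SCALE.index(prior) + SHIFT[e_agg]
    idx = max(0, min(idx, len(POSTERIOR_SCALE) - 1))
    return POSTERIOR_SCALE[idx]

print(posterior("Low", "Strongly Supporting"))   # -> Very High
```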
27 LANL How do we deduce the model form?
28 (No transcript)
29 Element Posterior Likelihood is High
A single entry in the rule base
All of the information in the inferential model is represented in this logical equation...
30 How does this work in practice?
31 LANL Add the actual data and discuss for one element
Inferring Element Posterior Likelihood
UNACCOUNTEDFCeog, ITEMISIDENTIFIEDag,
FCCAMERAbe, UNMARKEDbe, BARCODEASSIGNEDbe,
BARCODEMISSINGog, BARCODECAMEOFFbe
32 Inferring Scenario Posterior Likelihood
LANL Do the scenario likelihood the same way; show the posterior results; mention or add a slide on defuzzification
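A minimal sketch of one way to go from element posteriors to a scenario posterior, assuming (our illustrative reading, not stated on the slide) that a scenario is only as likely as its least likely element, followed by mapping the linguistic result to a representative number as a simple stand-in for defuzzification. The scale values and element labels are invented.

```python
SCALE = {"Very Low": 0.1, "Low": 0.35, "High": 0.65, "Very High": 0.9}

element_posteriors = ["High", "Low", "Very High"]     # illustrative values

scenario_posterior = min(element_posteriors, key=lambda v: SCALE[v])
print("scenario posterior:", scenario_posterior)      # weakest-link element

# Defuzzified value, if a single number is wanted for presentation.
print("defuzzified:", SCALE[scenario_posterior])
```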
33 Inferring Likelihood Change
LANL Results here
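A minimal sketch of one way to express the change, assuming it is read off as the shift between a scenario's prior and posterior positions on the likelihood scale; this reading and the scale below are illustrative, not taken from the slides.

```python
SCALE = ["Very Low", "Low", "High", "Very High"]

def likelihood_change(prior, posterior):
    """Report how far the posterior moved relative to the prior."""
    shift = SCALE.index(posterior) - SCALE.index(prior)
    if shift > 0:
        return "Increased by %d step(s)" % shift
    if shift < 0:
        return "Decreased by %d step(s)" % -shift
    return "Unchanged"

print(likelihood_change("Low", "Very High"))   # -> Increased by 2 step(s)
```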
34 LANL How do we deduce the model form?
35 LANL FMM here
Solving AR Inference Models Quickly: The Fast Min-Max Procedure
Digraph of the inferential logical equation
36 LANL FMM here
Digraph detail: subsets for the Element Likelihood in Rule 1; Rule 2 produces the Aggregate Evidence, which feeds Rule 1 together with the Prior Likelihood.
37 The FMM Algorithm
- Solve the inference equation for the implicant chains linking the active elements and the output metric linguistic subsets
- For each subset, take the min over each of the associated implicant chains
- Take the max over all of the results of the min operation
This procedure is of order 10 times faster per node relative to the SMM. Since it is only performed once per evaluation, it is of order 10N times faster per scenario evaluation, typically a factor of 100 to 1000. A sketch of the procedure follows.
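A minimal sketch of the three steps above: the implicant chains ending in each output linguistic subset are solved for once, then each chain is evaluated with min and the chains are combined with max. The chains and possibility values here are invented for illustration.

```python
INPUTS = {"A_high": 0.7, "B_low": 0.4, "C_none": 0.9}   # active-element possibilities

# Implicant chains linking active-element subsets to each output linguistic
# subset (precomputed once per model, not once per scenario).
CHAINS = {
    "Likelihood High": [["A_high", "B_low"], ["A_high", "C_none"]],
    "Likelihood Low":  [["B_low", "C_none"]],
}

def fast_min_max(inputs, chains):
    result = {}
    for subset, chain_list in chains.items():
        mins = [min(inputs[e] for e in chain) for chain in chain_list]  # min per chain
        result[subset] = max(mins)                                      # max over chains
    return result

print(fast_min_max(INPUTS, CHAINS))
```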
38 Conclusions
We have repeatedly encountered problems whose solution entailed
- Identifying an exhaustive set of possibilities
- Ranking the possibilities according to some metric(s)
when
- All or critical parts of the knowledge base are imprecise
- The uncertainty is imprecise as well
- The knowledge base is scattered among informants
Using a logic model to generate the possibilities and an AR-based inference engine to rank them while capturing uncertainty has proven to be a flexible and efficient analysis technique.