FROM SEARCH ENGINES TO QUESTION-ANSWERING SYSTEMS


Transcript and Presenter's Notes

Title: FROM SEARCH ENGINES TO QUESTION-ANSWERING SYSTEMS


1
FROM SEARCH ENGINES TO QUESTION-ANSWERING
SYSTEMS: NEED FOR NEW TOOLS
Lotfi A. Zadeh
Computer Science Division, Department of EECS, UC Berkeley
URL: http://www-bisc.cs.berkeley.edu
URL: http://zadeh.cs.berkeley.edu/
Email: Zadeh@cs.berkeley.edu
2
BACKDROP
3
BASIC GOAL
CONCEPTION, DESIGN AND IMPLEMENTATION OF
INTELLIGENT DECISION SYSTEMS
4
BASIC STRUCTURE
acquisition of information
communication of information
processing of information (extracting
decision-relevant information)
decision
execution
assessment
5
INTELLIGENT DECISION SYSTEM
INFORMATION-ON-DEMAND MODULE
INFORMATION PROFERRAL MODULE
INFORMATION ALERT MODULE
INFORMATION-ON-DEMAND MODULE = Q/A SYSTEM
Q/A SYSTEM = SEARCH ENGINE + DEDUCTION MODULE
6
ADDING DEDUCTION CAPABILITY TO SEARCH ENGINES
  • deduction capability = the ability to provide an
    answer to a query by drawing on information which
    resides in various parts of the knowledge base
  • existing search engines have many remarkable
    capabilities but deduction capability is not
    among them
  • adding deduction capability to search engines is
    an open-ended, complex problem
  • incremental progress is achievable through use of
    existing methods
  • substantial progress is beyond the reach of
    methods based on bivalent logic and probability
    theory

7
ADDING SUBSTANTIAL DEDUCTION CAPABILITY TO SEARCH
ENGINES IS BEYOND THE REACH OF METHODS BASED ON
BIVALENT LOGIC AND PROBABILITY THEORY
  • principal reason much of the information in the
    knowledge-base of a search engine is
    perception-based
  • perceptions are intrinsically imprecise
  • perceptions are f-granular
  • methods based on bivalent logic and probability
    theory do not provide a machinery for processing
    perception-based information

8
TEST QUERIES
  • How many Ph.D.s in Computer Science were
    produced by European universities in 1996?
  • Name of the President of Finland?
  • Telephone numbers of the President of Finland?
  • Name of the king of Finland?
  • How many horses received the Ph.D. degree in
    1996?
  • How many lakes are there in Finland?

9
NEW TOOLS
computing with words (CW)
computing with numbers (CN)
computing with intervals (IA)
computing with granules (GrC)
precisiated natural language (PNL)

CTP = computational theory of perceptions
PTp = perception-based probability theory
THD = theory of hierarchical definability
  • a granule is defined by a generalized
    constraint

10
COMPUTATIONAL THEORY OF PERCEPTIONS (CTP) PREAMBLE
  • It is a deep-seated tradition in science to
    equate scientific progress to progression from
    perceptions to measurements
  • But what humans have, and machines have not, is a
    remarkable capability to perform a wide variety
    of physical and mental tasks without any
    measurements and any computations. A canonical
    example of this capability is driving in heavy
    city traffic. Another example is summarizing a
    book.

11
CONTINUED
  • To endow machines with this capability it is
    necessary to progress, countertraditionally, from
    measurements to perceptions
  • This is the objective of the computational theory
    of perceptions (CTP), a theory in which
    perceptions are objects of computation

12
FROM MEASUREMENTS TO PERCEPTIONS
[Diagram: a wine expert assesses a sample of wine and forms the
perception "excellent"; a neural network NN maps the crisp
measurements of a chemical analysis of the same wine to the
fuzzy output "excellent".]
  • NN is a neurofuzzy neural network
  • with crisp input and fuzzy output

13
COMPUTATIONAL THEORY OF PERCEPTIONS
  • the point of departure in the computational
    theory of perceptions is the assumption that
    perceptions are described by propositions
    expressed in a natural language
  • examples
  • economy is improving
  • Robert is very honest
  • it is not likely to rain tomorrow
  • it is very warm
  • traffic is heavy
  • in general, perceptions are summaries
  • perceptions are intrinsically imprecise

14
CONTINUED
  • imprecision of perceptions is a manifestation of
    the bounded ability of sensory organs and,
    ultimately, the brain, to resolve detail and
    store information
  • perceptions are f-granular in the sense that (a)
    the boundaries of perceived classes are fuzzy
    and (b) the values of perceived attributes are
    granular, with a granule being a clump of values
    drawn together by indistinguishability,
    similarity, proximity or functionality
  • it is not possible to construct a computational
    theory of perceptions within the conceptual
    structure of bivalent logic and probability theory

15
EXAMPLES OF F-GRANULATION (LINGUISTIC VARIABLES)
color: red, blue, green, yellow, …
age: young, middle-aged, old, very old, …
size: small, big, very big, …
distance: near, far, very far, not very far, …

[Plot: membership functions of young, middle-aged and old over
the age range 0-100, with grade of membership µ from 0 to 1.]
  • humans have a remarkable capability to perform a
    wide variety of physical and mental tasks, e.g.,
    driving a car in city traffic, without any
    measurements and any computations
  • one of the principal aims of CTP is to develop a
    better understanding of how this capability can
    be added to machines

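The granules above can be made computable by attaching membership functions. A minimal Python sketch of f-granulation for the linguistic variable age; the trapezoidal breakpoints (25, 40, 55, 65, …) are illustrative assumptions, not values from the slides:

```python
def trapezoid(a, b, c, d):
    """Trapezoidal membership function with support (a, d) and core [b, c]."""
    def mu(x):
        if x <= a or x >= d:
            return 0.0
        if b <= x <= c:
            return 1.0
        if x < b:
            return (x - a) / (b - a)   # rising edge
        return (d - x) / (d - c)       # falling edge
    return mu

# granules of "age" on [0, 100] -- breakpoints are assumed for illustration
young       = trapezoid(-1, 0, 25, 40)
middle_aged = trapezoid(30, 40, 55, 65)
old         = trapezoid(55, 65, 100, 101)

# the boundaries of the granules are fuzzy: a 60-year-old belongs
# partially to both "middle-aged" and "old"
print(middle_aged(60), old(60))   # prints 0.5 0.5
```

Note how overlap between adjacent granules encodes the fuzziness of perceived class boundaries described on the slide.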
16
MEASUREMENT-BASED VS. PERCEPTION-BASED INFORMATION
INFORMATION
measurement-based (numerical):
  • it is 35 °C
  • Eva is 28
  • Tandy is three years older than Dana
perception-based (linguistic):
  • it is very warm
  • Eva is young
  • Tandy is a few years older than Dana
  • it is cloudy
  • traffic is heavy
  • Robert is very honest

17
MEASUREMENT-BASED VS. PERCEPTION-BASED (version 1)
measurement-based:
  • a box contains 20 black and white balls
  • over seventy percent are black
  • there are three times as many black balls as
    white balls
  • what is the number of white balls?
  • what is the probability that a ball picked at
    random is white?
perception-based:
  • a box contains about 20 black and white balls
  • most are black
  • there are several times as many black balls as
    white balls
  • what is the number of white balls?
  • what is the probability that a ball drawn at
    random is white?

18
COMPUTATION (version 1)
  • measurement-based
  • X = number of black balls
  • Y = number of white balls
  • X ≥ 0.7 × 20 = 14
  • X + Y = 20
  • X = 3Y
  • X = 15, Y = 5
  • p = 5/20 = 0.25
  • perception-based
  • X = number of black balls
  • Y = number of white balls
  • X = most × 20
  • X = several × Y
  • X + Y = 20
  • P = Y/N

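Both columns of this computation can be sketched in code. The crisp solve mirrors the measurement-based column exactly; for the perception-based column, reading "about 20" and "most" as intervals is a deliberately coarse stand-in for the fuzzy constraints, and the interval bounds below are assumptions:

```python
from fractions import Fraction

# measurement-based: X + Y = 20, X = 3Y  ->  Y = 5, p = 5/20
def crisp_white_balls(total=20, ratio=3):
    y = Fraction(total, ratio + 1)   # from X = ratio*Y and X + Y = total
    return y, y / total              # number of white balls, P(white)

y, p = crisp_white_balls()
print(y, p)   # prints 5 1/4

# perception-based, interval approximation (bounds are assumptions):
#   N is "about 20"          -> [18, 22]
#   black fraction is "most" -> [0.6, 0.9]
def interval_white_balls(n=(18, 22), black_frac=(0.6, 0.9)):
    lo = n[0] * (1 - black_frac[1])  # fewest balls, largest black share
    hi = n[1] * (1 - black_frac[0])  # most balls, smallest black share
    return lo, hi                    # bounds on the number of white balls

print(interval_white_balls())
```

A full treatment would replace the intervals by fuzzy numbers and propagate them with the extension principle; the interval version shows the shape of the answer (a range rather than a point).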
19
PERCEPTION OF A FUNCTION
[Diagram: a crisp function f from X to Y is granulated over the
granules S (small), M (medium), L (large) on both axes.]

perception:
if X is small then Y is small
if X is medium then Y is large
if X is large then Y is small

f → f* (fuzzy graph): the union of Cartesian granules such as
medium × large

[Plot: the fuzzy graph f* as overlapping granules in the X-Y
plane.]
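A perceived function represented as a fuzzy graph can be evaluated by sup-min combination of its granules: mu_Y(y) = max over rules i of min(mu_Ai(x), mu_Bi(y)). A sketch, with triangular granules chosen purely for illustration:

```python
def tri(a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

small, medium, large = tri(-1, 0, 5), tri(2, 5, 8), tri(5, 10, 11)

# the perception: if X is small then Y is small;
#                 if X is medium then Y is large;
#                 if X is large then Y is small
rules = [(small, small), (medium, large), (large, small)]

def fuzzy_graph_output(x, y):
    # sup-min over the Cartesian granules Ai x Bi
    return max(min(A(x), B(y)) for A, B in rules)

# x = 5 fires only the "medium -> large" rule fully
print(fuzzy_graph_output(5, 10))   # prints 1.0
```

The output is a fuzzy subset of Y rather than a point value, which is exactly what "perception of a function" means here.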
20
PERCEPTION-BASED GRANULAR PROBABILITY
DISTRIBUTION
[Plot: a granular probability distribution over X, with fuzzy
probabilities P1, P2, P3 assigned to fuzzy events A1, A2, A3.]

P = Pi(1)\A1 + Pi(2)\A2 + Pi(3)\A3
Prob {X is Ai} is Pj(i)
21
BASIC STRUCTURE OF PROBABILITY THEORY
PROBABILITY THEORY
[Diagram: probability theory divides into a measurement-based,
frequentist, objective branch and a perception-based, Bayesian,
subjective branch; bivalent-logic-based PT generalizes to
fuzzy-logic-based PTp.]
  • In PTp everything is or is allowed to be
    perception-based

22
FUNDAMENTAL POINTS
  • the point of departure in perception-based
    probability theory (PTp) is the postulate:
  • subjective probability = perception of likelihood
  • perception of likelihood is similar to
    perceptions of time, distance, speed, weight,
    age, taste, mood, resemblance and other
    attributes of physical and mental objects
  • perceptions are intrinsically imprecise,
    reflecting the bounded ability of sensory organs
    and, ultimately, the brain, to resolve detail and
    store information
  • perceptions and subjective probabilities are
    f-granular

23
COMPUTING WITH WORDS (CW)
  • in computing with words, the objects of
    computation are words and propositions in a
    natural language
  • example: a box contains N balls of various sizes
  • a few are small
  • most are medium
  • a few are large
  • how many are neither small nor large?
  • example: A is near B
  • B is near C
  • how far is A from C?

24
KEY POINT
  • words are less precise than numbers
  • computing with words (CW) is less precise than
    computing with numbers (CN)
  • CW serves two major purposes
  • provides a machinery for dealing with problems in
    which precise information is not available
  • provides a machinery for dealing with problems in
    which precise information is available, but there
    is a tolerance for imprecision which can be
    exploited to achieve tractability, robustness,
    simplicity and low solution cost

25
CW AND PNL
  • a concept which plays a central role in CW is
    that of PNL (Precisiated Natural Language)
  • basically, a natural language, NL, is a system
    for describing perceptions
  • perceptions are intrinsically imprecise
  • imprecision of natural languages is a reflection
    of the imprecision of perceptions
  • the primary function of PNL is that of serving as
    a part of NL which admits precisiation
  • PNL has a much higher expressive power than any
    language that is based on bivalent logic

26
COMPUTING WITH WORDS AND PERCEPTIONS (CWP)
Key points
  • In computing with words and perceptions, the
    objects of computation are words, propositions,
    and perceptions described in a natural language
  • A natural language is a system for describing
    perceptions
  • In CWP, a perception is equated to its
    description in a natural language

27
CONTINUED
  • in science, it is a deep-seated tradition to
    strive for the ultimate in rigor and precision
  • words are less precise than numbers
  • why and where, then, would words be used in
    preference to numbers?

28
CONTINUED
  • One of the major aims of CWP is to serve as a
    basis for equipping machines with a capability to
    operate on perception-based information. A key
    idea in CWP is that of dealing with perceptions
    through their descriptions in a natural language.
    In this way, computing and reasoning with
    perceptions is reduced to operating on
    propositions drawn from a natural language.

29
CONTINUED
  • when the available information is not precise
    enough to justify the use of numbers
  • when precision carries a cost and there is a
    tolerance for imprecision which can be exploited
    to achieve tractability, robustness and reduced
    cost
  • when the expressive power of words is greater
    than the expressive power of numbers

30
BASIC PERCEPTIONS
attributes of physical objects
  • distance
  • time
  • speed
  • direction
  • length
  • width
  • area
  • volume
  • weight
  • height
  • size
  • temperature

sensations and emotions
  • color
  • smell
  • pain
  • hunger
  • thirst
  • cold
  • joy
  • anger
  • fear

concepts
  • count
  • similarity
  • cluster
  • causality
  • relevance
  • risk
  • truth
  • likelihood
  • possibility

31
DEEP STRUCTURE OF PERCEPTIONS
  • perception of likelihood
  • perception of truth (compatibility)
  • perception of possibility (ease of attainment or
    realization)
  • perception of similarity
  • perception of count (absolute or relative)
  • perception of causality

subjective probability = quantification of
perception of likelihood
32
KEY POINTS
  • decisions are based on information
  • in most realistic settings, decision-relevant
    information is a mixture of measurements and
    perceptions
  • examples buying a house buying a stock
  • existing methods of decision analysis are
    measurement-based and do not provide effective
    tools for dealing with perception-based
    information
  • a decision is strongly influenced by the
    perception of likelihoods of outcomes of a choice
    of action

33
KEY POINTS
  • in most realistic settings
  • the outcomes of a decision cannot be predicted
    with certainty
  • decision-relevant probability distributions are
    f-granular
  • decision-relevant events, functions and relations
    are f-granular
  • perception-based probability theory, PTp, is
    basically a calculus of f-granular probability
    distributions, f-granular events, f-granular
    functions, f-granular relations and f-granular
    counts

34
CONTINUED
  • In CWP, what is employed for this purpose is PNL
    (Precisiated Natural Language). In PNL, a
    proposition, p, drawn from a natural language,
    NL, is represented as a generalized constraint,
    with the language of generalized constraints,
    GCL, serving as a precisiation language for
    computation and reasoning. PNL is equipped with
    two dictionaries and a modular multiagent
    deduction database. The rules of deduction are
    expressed in what is referred to as the Protoform
    Language (PFL).

35
TEST PROBLEM
  • A function, Y = f(X), is defined by its fuzzy
    graph f*, expressed as
  • f*: if X is small then Y is small
  •     if X is medium then Y is large
  •     if X is large then Y is small
  • (a) what is the value of Y if X is not large?
  • (b) what is the maximum value of Y?

[Plot: the fuzzy graph f* drawn over the granules S, M, L on
the X and Y axes.]
36
BALLS-IN-BOX EXAMPLE (version 2)
  • a box contains about N balls of various sizes
  • most are large
  • there are many more large balls than small balls
  • what is the number of small balls?
  • what is the probability that a ball drawn at
    random is neither large nor small?

37
MEASUREMENT-BASED VS. PERCEPTION-BASED CONCEPTS
measurement-based / perception-based:
  • expected value / usual value
  • stationarity / regularity
  • continuous / smooth
Example of a regular process: T(i) = travel time from home to
office on day i.
38
THERE IS A FUNDAMENTAL CONFLICT BETWEEN BIVALENCE
AND REALITY
  • we live in a world in which almost everything is
    a matter of degree
  • but
  • in bivalent logic, every proposition is either
    true or false, with no shades of gray allowed
  • in fuzzy logic, everything is or is allowed to be
    a matter of degree
  • in bivalent-logic-based probability theory only
    certainty is a matter of degree
  • in perception-based probability theory,
    everything is or is allowed to be a matter of
    degree

39
BASIC PERCEPTIONS / F-GRANULARITY
  • temperature: warm, cold, very warm, much warmer, …
  • time: soon, about one hour, not much later, …
  • distance: near, far, much farther, …
  • speed: fast, slow, much faster, …
  • length: long, short, very long, …

[Plot: membership functions of small, medium and large over
size, with grade of membership µ from 0 to 1.]
40
CONTINUED
  • similarity: low, medium, high, …
  • possibility: low, medium, high, almost
    impossible, …
  • likelihood: likely, unlikely, very likely, …
  • truth (compatibility): true, quite true, very
    untrue, …
  • count: many, few, most, about 5, …
  • subjective probability = perception of likelihood

41
CONTINUED
  • function: if X is small then Y is large;
    (X is small, Y is large)
  • probability distribution: low\small +
    low\medium + high\large
  • count\attribute-value distribution: 5\small +
    8\large
  • PRINCIPAL RATIONALES FOR F-GRANULATION
  • detail not known
  • detail not needed
  • detail not wanted

42
PRECISIATED NATURAL LANGUAGE
PNL
43
GENERALIZED CONSTRAINT
  • standard constraint: X ∈ C
  • generalized constraint: X isr R

X isr R (GC-form, the generalized constraint form of type r)
  • X: constrained variable
  • isr: copula, with type identifier r
  • R: constraining relation

  • X = (X1, …, Xn)
  • X may have a structure: X = Location(Residence(Carol))
  • X may be a function of another variable: X = f(Y)
  • X may be conditioned: (X/Y)

44
WHAT IS PRECISIATED NATURAL LANGUAGE (PNL)?
PRELIMINARIES
  • a proposition, p, in a natural language, NL, is
    precisiable if it is translatable into a
    precisiation language
  • in the case of PNL, the precisiation language is
    the Generalized Constraint Language, GCL
  • precisiation of p, p*, is an element of GCL
    (GC-form)

45
THE CONCEPT OF PRECISIABILITY
[Diagram: a proposition p in NL (natural language) is translated
into p* in PL (precisiable language); p* = translate of p =
precisiation of p.]

  • p is precisiable w/r to PL = p is translatable
    into PL
  • criterion of precisiability: p* is an object of
    computation
  • PL: propositional logic
  • predicate logic
  • modal logic
  • Prolog
  • LISP
  • Generalized Constraint Language (GCL): p* =
    GC-form

46
EXAMPLES
  • PL: propositional logic
  • Robert is taller than Alan → taller(Robert, Alan)
  • Height(Robert) > Height(Alan)
  • PL: first-order predicate logic
  • all men are mortal
  • most Swedes are tall: not precisiable
  • PL: PNL
  • most Swedes are tall →
  • ΣCount(tall.Swedes/Swedes) is most
  • principal distinguishing features of PNL are:
  • PL = GCL (Generalized Constraint Language)
  • DL (Deduction Logic) = FL (fuzzy logic)
  • PNL is maximally expressive

47
THE CONCEPT OF A GENERALIZED CONSTRAINT (1985)
GC-form: X isr R
  • X: constrained variable
  • r: modal variable (defines modality)
  • R: constraining relation (granular value of X)

  • principal modalities
  • possibilistic (r = blank): X is R,
    R = possibility distribution of X
  • probabilistic (r = p): X isp R,
    R = probability distribution of X
  • veristic (r = v): X isv R,
    R = verity (truth) distribution of X
  • usuality (r = u): X isu R, R = usual value of X
  • random set (r = rs): X isrs R,
    R = fuzzy-set-valued distribution of X
  • fuzzy graph (r = fg): X isfg R, R = fuzzy graph of X
  • bimodal (r = bm): X isbm R,
    R = bimodal distribution of X
  • Pawlak set (r = ps): X isps R, R = upper and
    lower approximation to X

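A generalized constraint X isr R is just a triple (X, r, R), so it can be modeled directly. The class below is an illustrative data model, not an API from any existing library; the modality codes follow the slide:

```python
from dataclasses import dataclass

MODALITIES = {
    "":   "possibilistic (R = possibility distribution of X)",
    "p":  "probabilistic (R = probability distribution of X)",
    "v":  "veristic (R = verity distribution of X)",
    "u":  "usuality (R = usual value of X)",
    "rs": "random set (R = fuzzy-set-valued distribution of X)",
    "fg": "fuzzy graph (R = fuzzy graph of X)",
    "bm": "bimodal (R = bimodal distribution of X)",
    "ps": "Pawlak set (R = upper/lower approximation to X)",
}

@dataclass
class GeneralizedConstraint:
    variable: str       # constrained variable X
    relation: str       # constraining relation R
    modality: str = ""  # type identifier r; blank means possibilistic

    def __str__(self):
        copula = f" is{self.modality}" if self.modality else " is"
        return f"{self.variable}{copula} {self.relation}"

gc = GeneralizedConstraint("Age(Eva)", "young")
print(gc)                        # prints Age(Eva) is young
print(MODALITIES[gc.modality])
```

Elements of GCL would then be built by combining, qualifying and propagating such objects.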
48
CONSTRAINT QUALIFICATION
  • constraint qualification: (X isr R) is q
  • q = qualifier: possibility, probability, or
    verity (truth)
  • example: (X is small) is unlikely
49
GENERALIZED CONSTRAINT LANGUAGE (GCL)
  • GCL is generated by combination, qualification
    and propagation of generalized constraints
  • in GCL, rules of deduction are the rules
    governing generalized constraint propagation
  • examples of elements of GCL
  • (X isp R) and ((X, Y) is S)
  • ((X isr R) is unlikely) and ((X iss S) is likely)
  • if X is small then Y is large
  • the language of fuzzy if-then rules is a
    sublanguage of PNL

50
WHAT IS PNL?
  • PNL is a sublanguage of precisiable propositions
    in NL which is equipped with (1) a dictionary from
    NL to GCL, (2) a dictionary from GCL to PFL
    (Protoform Language), and (3) a modular multiagent
    database of rules of deduction (rules of
    generalized constraint propagation) expressed in
    PFL.

51
INFORMATION PRINCIPAL MODALITIES
  • possibilistic: r = blank
  • X is R (R = possibility distribution of X)
  • probabilistic: r = p
  • X isp R (R = probability distribution of X)
  • veristic: r = v
  • X isv R (R = verity (truth) distribution of X)
  • if r is not specified, the default modality is
    possibilistic

52
EXAMPLES (POSSIBILISTIC)
  • Eva is young → Age(Eva) is young
    (X = Age(Eva), R = young)
  • Eva is much younger than Maria →
    (Age(Eva), Age(Maria)) is much.younger
  • most Swedes are tall →
    ΣCount(tall.Swedes/Swedes) is most
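The relative ΣCount used in the precisiation of "most Swedes are tall" is simply an average of membership grades. A sketch in which the sample heights and the membership function of tall are illustrative assumptions:

```python
def mu_tall(height_cm):
    # piecewise-linear: not tall below 170 cm, fully tall above 185 cm
    return min(1.0, max(0.0, (height_cm - 170) / 15))

def relative_sigma_count(mu, population):
    # SCount(B/A) = sum of membership grades over A, normalized by |A|
    return sum(mu(x) for x in population) / len(population)

heights = [165, 170, 175, 180, 185, 190]   # assumed sample
ratio = relative_sigma_count(mu_tall, heights)
print(ratio)
# the degree to which "most Swedes are tall" holds is mu_most(ratio)
```

Feeding the resulting ratio into the membership function of the fuzzy quantifier most yields the truth degree of the whole proposition.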
53
EXAMPLES (PROBABILISITIC)
  • X is a normally distributed random variable with
    mean m and variance σ²
  • X isp N(m, σ²)
  • X is a random variable taking the values u1, u2,
    u3 with probabilities p1, p2 and p3, respectively
  • X isp (p1\u1 + p2\u2 + p3\u3)

54
EXAMPLES (VERISTIC)
  • Robert is half German, quarter French and quarter
    Italian
  • Ethnicity(Robert) isv (0.5|German +
    0.25|French + 0.25|Italian)
  • Robert resided in London from 1985 to 1990
  • Reside(Robert, London) isv [1985, 1990]

55
THE BASIC IDEA
[Diagram: a perception is described in NL as p = NL(p),
precisiated in GCL as GC(p), and abstracted in PFL as PF(p).]

GCL (Generalized Constraint Language) is maximally
expressive
56
DICTIONARIES
Dictionary 1: proposition p in NL → precisiation p* (GC-form)
  • most Swedes are tall →
    ΣCount(tall.Swedes/Swedes) is most

Dictionary 2: precisiation p* (GC-form) → protoform PF(p)
  • ΣCount(tall.Swedes/Swedes) is most →
    Q As are Bs
57
THE CONCEPT OF PROTOFORM
KEY POINTS
  • protoform = abbreviation of prototypical form
  • PF(p) = protoform of p
  • PF(p) = deep semantic structure of p
  • PF(p) = abstraction of precisiation of p
  • abstraction is a form of summarization
  • if p has a logical form, LF(p), then PF(p) is an
    abstraction of LF(p)
  • all men are mortal → LF: ∀x(man(x) → mortal(x))
    → PF: ∀x(A(x) → B(x))
58
CONTINUED
  • if p does not have a logical form but has a
    generalized constraint form, GC(p), then PF(p) is
    an abstraction of GC(p)

most Swedes are tall → GC(p):
ΣCount(tall.Swedes/Swedes) is most → PF(p):
Q As are Bs
59
CONTINUED
  • abstraction has levels, just as summarization
    does
  • p and q are PF-equivalent at level α if, at level
    of abstraction α, PF(p) = PF(q)

  • most Swedes are tall → Q As are Bs
  • a few professors are rich → Q As are Bs
60
BASIC STRUCTURE OF PNL
[Diagram: p in NL is precisiated (a) to GC(p) in GCL, then
abstracted (b) to PF(p) in PFL; PF(p) feeds the deduction
database DDB.]

  • In PNL, deduction = generalized constraint
    propagation
  • DDB = deduction database = collection of rules
    governing generalized constraint propagation
  • DDB rules are protoformal

61
EXAMPLE OF TRANSLATION
  • p: usually Robert returns from work at about 6 pm
  • p*: Prob {Time(Return(Robert)) is about 6 pm} is
    usually
  • PF(p): Prob {X is A} is B
  • X = Time(Return(Robert))
  • A = about 6 pm
  • B = usually
  • p ∈ NL
  • p* ∈ GCL
  • PF(p) ∈ PFL

62
BASIC STRUCTURE OF PNL
[Diagram: Dictionary 1 maps p in NL to GC(p) in GCL;
Dictionary 2 maps GC(p) to PF(p) in PFL. These feed a modular
deduction database with possibility, probability, extension
principle and random set modules, each maintained by an agent.]
63
INTERPOLATION OF BIMODAL DISTRIBUTIONS
[Plot: a bimodal distribution assigning granular probabilities
p1, p2, …, pn to fuzzy events A1, A2, …, An over X, where g(u)
is the probability density of X.]

pi is Pi = granular value of pi, i = 1, …, n
(Pi, Ai), i = 1, …, n are given; A is given
find the granular probability P of A
64
CONTINUED
  • Prob {X is Ai} is Pj(i), i = 1, …, m, j = 1, …, n
  • ∫ g(u) du = 1
  • G is small = ∀u (g(u) is small)

construct Prob {X is A} = ∫ g(u) µA(u) du from the
given granular values of Prob {X is Ai} =
∫ g(u) µAi(u) du
65
INTERPOLATION
[Slide formula: the possibility distribution of Prob {X is A}
is obtained by maximizing over the densities g subject to the
given granular constraints.]
66
CONTINUED
  • µ(g) = possibility distribution of g

extension principle: µ(g) → µ(f(g))
µ(v) = sup over g of µ(g)
subject to v = f(g)
67
EXPECTED VALUE
[Slide formula: the expected value is computed by the extension
principle, subject to the given granular constraints.]
68
USUALITY SUBMODULE
69
USUALITY QUALIFIED RULES
X isu A ⊢ X isun (not A)
X isu A, Y = f(X) ⊢ Y isu f(A)
70
USUALITY QUALIFIED RULES
X isu A, Y isu B, Z = f(X, Y) ⊢ Z isu f(A, B)
71
PNL AS A DEFINITION LANGUAGE
72
DEFINITION OF OPTIMALITY: OPTIMIZATION = MAXIMIZATION?
[Plots: four gain functions of X illustrating whether X = a
maximizes gain: "yes", "unsure", "no", and "hard to tell".]
  • definition of optimal X requires use of PNL

73
MAXIMUM ?
Y = f(X) with maximum m at X = a:
  • ∀x (f(x) ≤ f(a))
  • ¬∃x (f(x) > f(a))

[Plots: extension-principle and Pareto-maximum versions of the
same definition.]
(b) ¬∃x (f(x) dominates f(a))
74
MAXIMUM ?
f(x) is A

f* = Σi Ai × Bi
f*: if X is Ai then Y is Bi, i = 1, …, n

[Plot: the fuzzy graph f* as a union of granules Ai × Bi in the
X-Y plane.]
75
DEFINITION OF p: ABOUT 20-25 MINUTES
  • c-definition: crisp interval [20, 25] on the time
    axis
  • f-definition: fuzzy interval with core [20, 25]
  • f.g-definition: fuzzy graph over the time axis
  • PNL-definition: Prob (Time is A) is B, with fuzzy
    event A and fuzzy probability B

[Plots: the four definitions as membership functions and
granular distributions over time.]
76
EXAMPLE
PNL definition of "about 20 to 25 minutes":
Prob {getting to the airport in less than about
20 min} is unlikely
Prob {getting to the airport in about 20 to 25 min}
is likely
Prob {getting to the airport in more than about
25 min} is unlikely

[Plot: the resulting granular probability distribution of
travel time, with "likely" over [20, 25] and "unlikely"
elsewhere.]
77
PNL-BASED DEFINITION OF STATISTICAL INDEPENDENCE
[Diagram: a 3 × 3 contingency table over X, Y ∈ {S, M, L}, with
granular relative counts ΣC(i/j) as entries.]

  • degree of independence of Y from X =
  • degree to which the columns of the contingency
    table are identical

PNL-based definition
78
PROTOFORM LANGUAGE
PFL
79
WHAT IS A PROTOFORM
  • p = proposition in a natural language
  • if p has a logical form, LF(p), then a protoform
    of p, PF(p), is an abstraction of LF(p)
  • all men are mortal → ∀x(man(x) → mortal(x))
    → ∀x(A(x) → B(x))

p → LF(p) → PF(p) (abstraction = deinstantiation)
all men are mortal → all men are A (deinstantiation)
80
CONTINUED
  • if p does not have a logical form but is in PNL,
    then a protoform of p is an abstraction
    (deinstantiation) of the generalized constraint
    form of p, GC(p)

most Swedes are tall → GC(p):
ΣCount(tall.Swedes/Swedes) is most → PF(p):
Q As are Bs
81
PROTOFORM AND PF-EQUIVALENCE
knowledge base (KB): PF-equivalence class (P), with
protoform (p): Q As are Bs
  • p: most Swedes are tall
  • q: few professors are rich
  • P is the class of PF-equivalent propositions
  • P does not have a prototype
  • P has an abstracted prototype: Q As are Bs
  • P is the set of all propositions whose protoform
    is Q As are Bs
82
REASONING WITH PERCEPTIONS DEDUCTION MODULE
IDS (initial data set: perceptions p)
→ translation / explicitation / precisiation →
IGCS (initial generalized constraint set: GC-forms GC(p))
→ abstraction / deinstantiation →
IPS (initial protoform set: protoforms PF(p))
→ goal-directed deduction →
TPS (terminal protoform set)
→ deinstantiation →
TDS (terminal data set)
83
COUNT-AND MEASURE-RELATED RULES
Q As are Bs ⊢ ant(Q) As are not Bs
(ant(Q)(r) = Q(1 − r))

Q As are Bs ⊢ Q^(1/2) As are ²Bs (²B = very B)

most Swedes are tall ⊢ ave(height) of Swedes is ?h
Q As are Bs ⊢ ave(B|A) is ?C

[Plots: Q, ant(Q) and Q^(1/2) as functions of the proportion
r ∈ [0, 1]; for crisp Q, the antonym mirrors Q about r = 1/2.]
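Pointwise versions of these count-and-measure rules are straightforward: ant(Q)(r) = Q(1 − r), (not Q)(r) = 1 − Q(r), and "very B" is standardly modeled by squaring the membership function. The piecewise-linear most below is an illustrative assumption:

```python
def mu_most(r):
    # degree to which a proportion r counts as "most":
    # 0 below r = 0.5, 1 above r = 0.75, linear in between (assumed shape)
    return min(1.0, max(0.0, (r - 0.5) / 0.25))

def antonym(q):
    # ant(Q)(r) = Q(1 - r): mirror image about r = 1/2
    return lambda r: q(1.0 - r)

def negation(q):
    # (not Q)(r) = 1 - Q(r)
    return lambda r: 1.0 - q(r)

def very(mu):
    # standard intensifier: squaring concentrates the membership function
    return lambda x: mu(x) ** 2

few = antonym(mu_most)   # "few" read as ant(most)
print(few(0.25), mu_most(0.75), negation(mu_most)(0.75))
```

With these, "ant(Q) As are not Bs" and "(not Q) As are Bs" become directly computable transformations of the quantifier.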
84
CONTINUED
not(Q As are Bs) ⊢ (not Q) As are Bs

Q1 As are Bs, Q2 (A and B)s are Cs ⊢
(Q1 × Q2) As are (B and C)s

Q1 As are Bs, Q2 As are Cs ⊢
(Q1 + Q2 − 1) As are (B and C)s
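The two syllogisms can be checked on interval-valued quantifiers, a coarse stand-in for fuzzy quantifiers; reading "most" as the interval [0.75, 1.0] is an illustrative assumption:

```python
def product_syllogism(q1, q2):
    # Q1 As are Bs, Q2 (A and B)s are Cs |- (Q1 x Q2) As are (B and C)s
    return (q1[0] * q2[0], q1[1] * q2[1])

def conjunction_syllogism(q1, q2):
    # Q1 As are Bs, Q2 As are Cs |- (Q1 + Q2 - 1) As are (B and C)s
    # (clipped to [0, 1]; this is the Lukasiewicz bound on the joint count)
    return (max(0.0, q1[0] + q2[0] - 1.0),
            min(1.0, q1[1] + q2[1] - 1.0))

most = (0.75, 1.0)   # assumed interval reading of "most"
print(product_syllogism(most, most))       # prints (0.5625, 1.0)
print(conjunction_syllogism(most, most))   # prints (0.5, 1.0)
```

Note how chaining weakens the quantifier: from two "most" premises the conclusion is only guaranteed for a smaller proportion.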
85
PROTOFORMAL CONSTRAINT PROPAGATION
p → GC(p) → PF(p)

Dana is young → Age(Dana) is young → X is A
Tandy is a few years older than Dana →
Age(Tandy) is (Age(Dana) + few) → Y is (X + B)

rule: X is A, Y is (X + B) ⊢ Y is A + B
hence: Age(Tandy) is (young + few)
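The sum young + few in the conclusion is computed by the extension principle: mu_{A+B}(z) = sup over x + y = z of min(mu_A(x), mu_B(y)). A discretized sketch, with assumed triangular memberships for young and few:

```python
def tri(a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

young = tri(15, 25, 35)   # Age(Dana) is young (assumed support)
few   = tri(1, 3, 5)      # "a few years" (assumed support)

def fuzzy_add(mu_a, mu_b, grid):
    # sup-min convolution of two fuzzy numbers over a discrete grid
    return {z: max(min(mu_a(x), mu_b(z - x)) for x in grid)
            for z in grid}

grid = range(0, 46)
tandy = fuzzy_add(young, few, grid)   # Age(Tandy) is (young + few)
print(max(tandy, key=tandy.get))      # peak of young + few; prints 28
```

The peak lands at 25 + 3 = 28, and the spread of the result combines the imprecision of both perceptions, as the protoformal rule requires.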
86
THE ROBERT EXAMPLE
87
THE ROBERT EXAMPLE
  • the Robert example relates to everyday
    commonsense reasoning, a kind of reasoning which
    is preponderantly perception-based
  • the Robert example is intended to serve as a test
    of the deductive capability of a reasoning system
    to operate on perception-based information

88
THE ROBERT EXAMPLE
  • the Robert example is a sequence of versions of
    increasing complexity in which what varies is the
    initial data-set (IDS)
  • version 1
  • IDS usually Robert returns from work at about 6
    pm
  • questions
  • q1: what is the probability that Robert is
    home at about t pm?
  • q2: what is the earliest time at which the
    probability that Robert is home is high?

89
CONTINUED
  • version 2
  • IDS: usually Robert leaves office at about
    5:30 pm, and usually it takes about 30 min to get
    home
  • q1, q2 same as in version 1
  • version 3 this version is similar to version 2
    except that travel time depends on the time of
    departure from office.
  • q1, q2 same as version 1

90
THE ROBERT EXAMPLE (VERSION 3)
  • IDS: Robert leaves office between 5:15 pm and
    5:45 pm. When the time of departure is about
    5:20 pm, the travel time is usually about 20 min;
    when the time of departure is about 5:30 pm, the
    travel time is usually about 30 min; when the time
    of departure is about 5:40 pm, the travel time is
    about 20 min
  • usually Robert leaves office at about 5:30 pm
  • What is the probability that Robert is home at
    about t pm?

91
THE ROBERT EXAMPLE
Version 4
  • Usually Robert returns from work at about 6 pm
  • Usually Ann returns from work about
    half-an-hour later
  • What is the probability that both Robert and
    Ann are home at about t pm?

[Plot: the probabilities P that Robert and Ann are home, as
functions of time from 6:00 on.]
92
THE ROBERT EXAMPLE
Version 1. My perception is that Robert usually
returns from work at about 6:00 pm.
q1: What is the probability that Robert is home at
about t pm?
q2: What is the earliest time at which the
probability that Robert is home is high?
93
PROTOFORMAL DEDUCTION
THE ROBERT EXAMPLE
  • IDS: p: usually Robert returns from work at about
    6 pm.
  • TDS: q: what is the probability that Robert is
    home at about t pm?
  • precisiation
  • p*: Prob {Time(Robert returns from work) is
    about 6 pm} is usually
  • q*: Prob {Time(Robert is home) is about t pm}
    is ?D
  • calibration: µusually and µabout t pm
  • abstraction
  • p**: Prob {X is A} is B
  • q**: Prob {Y is C} is ?D
94
CONTINUED
4. search in Probability module for applicable
rules:
Prob {X is A} is B ⊢ Prob {Y is C} is D: not found
Prob {X is A} is B ⊢ Prob {X is C} is D: found
Prob {X is A} is B ⊢ Prob {f(X) is C} is D: found
5. back to IDS and TDS; event equivalence: Robert
is home at t = Robert returns from work before t
95
THE ROBERT EXAMPLE
event equivalence:
Robert is home at about t pm = Robert returns from
work before about t pm

[Plot: the fuzzy event "before about t pm" as a membership
function of the time of return T.]
before about t pm = before ∘ (about t pm)
96
CONTINUED
6. back to Probability module:
Prob {X is A} is B ⊢ Prob {X is C} is D
7. instantiation: D = Prob {Robert is home at
about 6:15}; X = Time(Robert returns from work);
A = about 6:00; B = usually; C = about 6:15
97
DEDUCTION (COMPUTING) WITH PERCEPTIONS
deduction: p1, p2, …, pn ⊢ pn+1

example:
Dana is young
Tandy is a few years older than Dana
⊢ Age(Tandy) is (young + few)

deduction with perceptions involves the use of
protoformal rules of generalized constraint
propagation
98
THE CONCEPT OF PROTOFORM
  • protoform = abbreviation of prototypical form
  • protoform = abstracted summary

[Diagram: p is parsed syntactically into a syntax tree and
semantically into a logical form, semantic network, conceptual
graph or canonical form; successive abstraction of the semantic
parse yields protoform 1, protoform 2, protoform 3.]

99
THE CONCEPT OF PROTOFORM
  • a protoform is an abstracted prototype of a class
    of propositions
  • examples
  • most Swedes are tall → (P-abstraction)
    Q As are Bs
  • many Americans are foreign-born →
    (P-abstraction) Q As are Bs
  • overeating causes obesity → Q As are Bs
  • obesity is caused by overeating → Q Bs are As
100
WHAT IS A PROTOFORM
  • p = proposition in a natural language
  • if p has a logical form, LF(p), then a protoform
    of p, PF(p), is an abstraction of LF(p)
  • all men are mortal → ∀x(man(x) → mortal(x))
    → ∀x(A(x) → B(x))

p → LF(p) → PF(p) (abstraction = deinstantiation)
all men are mortal → all men are A (deinstantiation)
101
CONTINUED
  • if p does not have a logical form but is in PNL,
    then a protoform of p is an abstraction
    (deinstantiation) of the generalized constraint
    form of p, GC(p)

most Swedes are tall → GC(p):
ΣCount(tall.Swedes/Swedes) is most → PF(p):
Q As are Bs
102
THE CONCEPT OF PROTOFORM
NL → LOGICAL FORM → PROTOFORM
  • all men are mortal → ∀x(man(x) → mortal(x))
    → ∀x(A(x) → B(x))
  • most Swedes are tall → Q As are Bs
  • usually Robert returns from work at about 6 pm →
    Prob (A) is B, where A is a fuzzy event and B is
    a fuzzy probability
103
ORGANIZATION OF KNOWLEDGE
[Diagram: knowledge splits into a factual database (FDB, facts)
and a deduction database (DDB, rules); entries in each may be
measurement-based or perception-based.]
  • much of human knowledge is perception-based
  • examples of factual knowledge
  • height of Eiffel Tower is 324 m (with antenna)
    (measurement-based)
  • Berkeley is near San Francisco (perception-based)
  • icy roads are slippery (perception-based)
  • if Marina is a student then it is likely that
    Marina is young (perception-based)

104
PROTOFORM AND PF-EQUIVALENCE
knowledge base (KB): PF-equivalence class (P), with
protoform (p): Q As are Bs
  • p: most Swedes are tall
  • q: few professors are rich
  • P is the class of PF-equivalent propositions
  • P does not have a prototype
  • P has an abstracted prototype: Q As are Bs
  • P is the set of all propositions whose protoform
    is Q As are Bs
105
REASONING WITH PERCEPTIONS DEDUCTION MODULE
IDS (initial data set: perceptions p)
→ translation / explicitation / precisiation →
IGCS (initial generalized constraint set: GC-forms GC(p))
→ abstraction / deinstantiation →
IPS (initial protoform set: protoforms PF(p))
→ goal-directed deduction →
TPS (terminal protoform set)
→ deinstantiation →
TDS (terminal data set)
106
DEDUCTION MODULE
  • rules of deduction are rules governing
    generalized constraint propagation
  • rules of deduction are protoformal
  • examples
  • generalized modus ponens:
    X is A; if X is B then Y is C ⊢ Y is A ∘ (B → C)
  • Prob (A) is B ⊢ Prob (C) is D, subject to an
    extension-principle constraint
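The generalized modus ponens can be run on small discrete universes. The sketch below uses sup-min composition with the Lukasiewicz implication (B → C)(x, y) = min(1, 1 − mu_B(x) + mu_C(y)); the universes and membership grades are assumptions:

```python
X_dom = [0, 1, 2]
Y_dom = [0, 1, 2]

A = {0: 0.0, 1: 1.0, 2: 0.5}   # fact: X is A
B = {0: 0.0, 1: 1.0, 2: 1.0}   # antecedent of the rule
C = {0: 0.0, 1: 0.5, 2: 1.0}   # consequent of the rule

def implication(bx, cy):
    # Lukasiewicz implication on membership grades
    return min(1.0, 1.0 - bx + cy)

def gmp(A, B, C):
    # Y is A o (B -> C):  mu_Y(y) = sup_x min(mu_A(x), (B -> C)(x, y))
    return {y: max(min(A[x], implication(B[x], C[y])) for x in X_dom)
            for y in Y_dom}

Y = gmp(A, B, C)
print(Y)   # prints {0: 0.0, 1: 0.5, 2: 1.0}
```

Other implication operators (Goedel, product) plug into the same composition; the choice changes how much the conclusion is diluted when the fact A only partially matches the antecedent B.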
107
EXAMPLE OF DEDUCTION
most Swedes are tall ⊢ ?R Swedes are very tall
(s/a-transformation)

most Swedes are tall → Q As are Bs
Q As are Bs ⊢ Q^(1/2) As are ²Bs
hence: most^(1/2) Swedes are very tall

[Plot: membership functions of most and most^(1/2) over
r ∈ [0, 1], with breakpoints at 0.25 and 0.5.]
108
CONTINUED
not (Q As are Bs) ⊢ (not Q) As are Bs
Q1 As are Bs; Q2 (A and B)s are Cs ⊢ (Q1 · Q2) As are (B and C)s
Q1 As are Bs; Q2 As are Cs ⊢ (Q1 + Q2 - 1) As are (B and C)s
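The two syllogisms above can be sketched with interval arithmetic, reading each fuzzy quantifier as a crisp interval of proportions (a single alpha-cut). The interval values chosen for "most" and "almost all" are assumptions.

```python
def product_syllogism(q1, q2):
    """Q1 As are Bs; Q2 (A and B)s are Cs  =>  (Q1 * Q2) As are (B and C)s."""
    return (q1[0] * q2[0], q1[1] * q2[1])

def conjunction_syllogism(q1, q2):
    """Q1 As are Bs; Q2 As are Cs  =>  (Q1 + Q2 - 1) As are (B and C)s,
    clipped to the unit interval."""
    return (max(0.0, q1[0] + q2[0] - 1.0), min(1.0, q1[1] + q2[1] - 1.0))

most = (0.7, 1.0)        # assumed alpha-cut of "most"
almost_all = (0.9, 1.0)  # assumed alpha-cut of "almost all"

print(product_syllogism(most, almost_all))
print(conjunction_syllogism(most, almost_all))
```

On full fuzzy quantifiers the same arithmetic is applied cut-by-cut; the single-cut version above already shows how the conclusion quantifier is weaker than either premise.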
109
INFORMAL PROTOFORM-BASED REASONING
most As are Bs; X is A ⊢ it is likely that X is B
Q As are Bs; X is A ⊢ Prob(X is B) is Q
tacit assumption: X is picked at random from the As
110
COUNT- AND MEASURE-RELATED RULES
Q As are Bs ⊢ ant(Q) As are not Bs,  where ant(Q)(r) = Q(1 - r)
[figure: a crisp Q and its antonym ant(Q) over r in [0, 1]]
Q As are Bs ⊢ Q^½ As are ²Bs
[figure: Q and Q^½ over r in [0, 1]]
most Swedes are tall ⊢ ave(height) of Swedes is ?h
Q As are Bs ⊢ ave(B|A) is ?C
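The antonym rule ant(Q)(r) = Q(1 - r) is simple to sketch; the ramp chosen for "most" below is an assumed membership function.

```python
def mu_most(r):
    """Assumed ramp for the quantifier "most"."""
    return min(1.0, max(0.0, (r - 0.5) / 0.25))

def antonym(mu):
    """ant(Q)(r) = Q(1 - r): reflect the quantifier about r = 1/2."""
    return lambda r: mu(1.0 - r)

mu_ant_most = antonym(mu_most)   # plays the role of "few"
print(mu_most(0.9), mu_ant_most(0.1))
```

A proportion r of Bs among the As leaves a proportion 1 - r of not-Bs, which is exactly the reflection the antonym performs.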
111
WHAT IS PRECISIATED NATURAL LANGUAGE (PNL)?
PRELIMINARIES
  • a proposition, p, in a natural language, NL, is
    precisiable if it is translatable into a
    precisiation language
  • in the case of PNL, the precisiation language is
    the Generalized Constraint Language, GCL
  • precisiation of p, p*, is an element of GCL

112
PRECISIATED NATURAL LANGUAGE (PNL)
NL → translation (precisiation, explicitation) → GCL
p (precisiable proposition in NL) → GC-form GC(p): X isr R (generalized constraint form of type r)
precisiation language: GCL (CSNL: constraint-centered semantics of NL)
  • PNL is like a dictionary in which an entry, p, is
    a precisiable proposition in a natural
    language, and its meaning, p*, is a precisiation
    of p
113
CONTINUED
  • PNL is
  • a dictionary of propositions in which the main
    entry is p, and its meaning is p*
  • a collection of protoformal rules of deduction
  • dictionary

p                       p*
most Swedes are tall    ΣCount(tall.Swedes/Swedes) is most


114
KEY POINT
in the computational theory of perceptions
(CTP), perceptions are dealt with through their
descriptions in a natural language
perception = descriptor(s) of perception
  • a proposition, p, in NL qualifies to be an object
    of computation if p is in PNL (Precisiated
    Natural Language)

115
AUGMENTED PNL (PNL+)
  • PNL+ is a collection of ordered triples
  • PNL+: (p, p*, PF(p))
  • p: precisiable proposition in a natural
    language, NL
  • p*: precisiation of p (translation of p into GCL)
  • PF(p): protoform of p
  • example
  • (most Swedes are tall, ΣCount(tall.Swedes/Swedes)
    is most, Q As are Bs)
  • PNL+ is needed for reasoning with perceptions

116
EXAMPLE
  • I am driving to the airport. How long will it
    take me to get there?
  • Hotel clerk's perception-based answer: about
    20-25 minutes
  • "about 20-25 minutes" cannot be defined in the
    language of bivalent logic and probability theory
  • to define "about 20-25 minutes," what is needed is
    PNL

117
DEFINITION OF p: ABOUT 20-25 MINUTES
[figure: four definitions of "about 20-25 minutes" over the time axis]
c-definition: crisp interval [20, 25]
f-definition: fuzzy interval
f.g-definition: fuzzy graph
PNL-definition: Prob(Time is A) is B (A: fuzzy event; B: fuzzy probability)
118
EXAMPLE
PNL definition of "about 20 to 25 minutes":
  Prob {getting to the airport in less than about
  25 min} is unlikely; Prob {getting to the airport
  in about 20 to 25 min} is likely; Prob {getting
  to the airport in more than 25 min} is unlikely
[figure: granular probability distribution over Time, with "likely" on about 20 to 25 and "unlikely" elsewhere]
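The granular probability distribution above can be sketched as a small data structure pairing each fuzzy event with a fuzzy probability; the ramps chosen for "likely" and "unlikely" are assumptions.

```python
def rising(a, b):
    """Membership rising linearly from 0 at a to 1 at b."""
    return lambda x: min(1.0, max(0.0, (x - a) / (b - a)))

def falling(a, b):
    """Membership falling linearly from 1 at a to 0 at b."""
    return lambda x: min(1.0, max(0.0, (b - x) / (b - a)))

likely = rising(0.5, 0.8)     # assumed fuzzy probability "likely"
unlikely = falling(0.2, 0.5)  # assumed fuzzy probability "unlikely"

# Granular probability distribution: fuzzy event -> fuzzy probability.
granules = {
    "less than about 25 min": unlikely,
    "about 20 to 25 min": likely,
    "more than about 25 min": unlikely,
}

def compatibility(assignment):
    """Degree (min over granules) to which crisp probabilities fit."""
    return min(mu(assignment[e]) for e, mu in granules.items())

p = {"less than about 25 min": 0.1,
     "about 20 to 25 min": 0.8,
     "more than about 25 min": 0.1}
print(compatibility(p))
```

Any crisp probability assignment can thus be graded against the clerk's perception, rather than accepted or rejected outright.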
119
THE ROBERT EXAMPLE
  • the Robert example relates to everyday
    commonsense reasoning, a kind of reasoning which
    is preponderantly perception-based
  • the Robert example is intended to serve as a test
    of the deductive capability of a reasoning system
    to operate on perception-based information

120
THE ROBERT EXAMPLE
Version 1. My perception is that Robert usually
returns from work at about 6:00 pm.
q1: What is the probability that Robert is home
at about t pm?
q2: What is the earliest time at which the
probability that Robert is home is high?
121
THE ROBERT EXAMPLE (VERSION 3)
  • IDS: Robert leaves office between 5:15 pm and
    5:45 pm. When the time of departure is about
    5:20 pm, the travel time is usually about 20 min;
    when the time of departure is about 5:30 pm, the
    travel time is usually about 30 min; when the time
    of departure is about 5:40 pm, the travel time is
    about 20 min
  • usually Robert leaves office at about 5:30 pm
  • What is the probability that Robert is home at
    about t pm?

122
THE ROBERT EXAMPLE
Version 4
  • Usually Robert returns from work at about 6 pm
  • Usually Ann returns from work about
    half an hour later
  • What is the probability that both Robert and
    Ann are home at about t pm?
[figure: probability P vs. time for Robert and Ann, marked at 6:00 and t]
123
THE ROBERT EXAMPLE
event equivalence:
Robert is home at about t pm = Robert returns from
work before about t pm
[figure: membership functions of "about t pm" and "before about t pm" over T, the time of return]
before about t pm = composition of ≤ with (about t pm)
124
THE ROBERT EXAMPLE
backtracking from query to query-relevant
information
query (q): what is the probability, ?P, that
Robert is home at about t pm?
query (q'): what is the earliest time at which
the probability that Robert is home is
high?
Version 1
query-relevant information (q⁻¹): probability
distribution of the time, T, at which
Robert returns from work
relevant fact (f(q⁻¹)): usually Robert returns from
work at about 6 pm
125
CONTINUED (VERSION 1)
Q: what is the probability that Robert is home at
t?
CF(q): Prob (Robert is home at about t pm) is ?P
[figure: membership function of "about t pm" near 6 pm on the time axis]
PF(q): Prob(C) is ?D
126
CONTINUED
KB = FDB + DDB
q: P(C) is ?D;  q': P(A) is ?B
  • protoformal rule in DDB:
    Prob(A) is B ⊢ Prob(C) is ?D
instantiation:
  Prob(A) is B → Prob (Robert returns from
  work at about t) is usually
127
PROBABILISTIC CONSTRAINT PROPAGATION RULE (a
special case of the generalized extension
principle)
f(X) is P ⊢ g(X) is ?Q
subject to: μ_Q(v) = sup_u μ_P(f(u)), with v = g(u)
128
CONTINUATION
μ_P: membership function of P
generalized extension principle:
μ_Q(v) = sup_u μ_P(f(u))
subject to: v = g(u)
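On a finite universe the generalized extension principle reduces to a sup over preimages, which can be sketched directly. The universe, the memberships, and the mapping below are toy assumptions (with f taken as the identity for simplicity).

```python
# mu_Q(v) = sup { mu_P(u) : g(u) = v } on a finite universe (toy values).
U = [0, 1, 2, 3, 4]
mu_P = {0: 1.0, 1: 0.8, 2: 0.5, 3: 0.2, 4: 0.0}   # assumed constraint "X is P"

def g(u):
    return u % 3   # assumed mapping; several u collapse onto one v

mu_Q = {}
for u in U:
    v = g(u)
    # sup over the preimage of v: keep the largest membership seen so far.
    mu_Q[v] = max(mu_Q.get(v, 0.0), mu_P[u])

print(mu_Q)
```

Where several points of U map to the same v, the membership of v is the best (largest) membership among them, which is exactly the sup in the rule above.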
129
THE BALLS-IN-BOX EXAMPLE
  • a box contains N balls of various sizes
  • my perceptions are
  • a few are small
  • most are medium
  • a few are large
  • a ball is drawn at random
  • what is the probability that the ball is neither
    small nor large?

IDS (initial data set)
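The balls-in-box deduction can be sketched numerically: if the proportion of small balls is "few" and the proportion of large balls is "few", the membership of a candidate proportion q of "neither" is obtained by the extension principle as a sup of min(few(q1), few(q3)) subject to q = 1 - q1 - q3. The triangular "few" and the grid search are assumptions.

```python
def few(r):
    """Assumed triangular fuzzy number "few", centred at 0.1, width 0.1."""
    return max(0.0, 1.0 - abs(r - 0.1) / 0.1)

grid = [i / 100 for i in range(101)]  # proportions 0.00 .. 1.00

def mu_neither(q):
    """Possibility that the proportion of "neither small nor large" is q."""
    best = 0.0
    for q1 in grid:                # proportion of small balls
        q3 = 1.0 - q1 - q          # forced proportion of large balls
        if 0.0 <= q3 <= 1.0:
            best = max(best, min(few(q1), few(q3)))
    return best

print(round(mu_neither(0.8), 2))
```

The possibility peaks where both "few" constraints are simultaneously satisfied, i.e. around q = 0.8, matching the intuition that the drawn ball is most likely medium.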
130
PERCEPTION-BASED ANALYSIS
a few are small → (1/N) ΣCount(small) is few → Q1 As are Bs
most are medium → (1/N) ΣCount(medium) is most → Q2 As are Cs
a few are large → (1/N) ΣCount(large) is few → Q3 As are Ds
u_i: size of the i-th ball
[equations: possibility distribution function of (u_1, ..., u_N) induced by the protoform Q As are Bs]
131
CONTINUED
possibility distribution function induced by IDS
query: (proportion of balls which are neither
large nor small) is ?Q
protoformal deduction rule (extension principle):
[equations: ?Q computed as a sup over (u_1, ..., u_N), subject to the IDS-induced possibility distribution]
132
CONCLUSION
  • the goal of realization of an intelligent
    multi-agent decision system is beyond the
    capabilities of measurement-based systems
  • to achieve the goal, it is necessary to employ
    systems which have the capability to operate on
    perception-based information
  • CW, PNL and CTP are intended to provide tools for
    adding this capability to measurement-based
    systems