Title: Importance Sampling
Slide 1: Importance Sampling
- ICS 276
- Fall 2007
- Rina Dechter
Slide 2: Outline
- Gibbs Sampling
- Advances in Gibbs sampling
- Blocking
- Cutset sampling (Rao-Blackwellisation)
- Importance Sampling
- Advances in Importance Sampling
- Particle Filtering
Slide 3: Importance Sampling Theory
Slide 4: Importance Sampling Theory
- Given a distribution Q, called the proposal distribution, such that P(Z = z, e) > 0 implies Q(Z = z) > 0
- w(Z = z) = P(Z = z, e) / Q(Z = z) is called the importance weight
Slide 5: Importance Sampling Theory
- Underlying principle: approximate an average over a set of numbers by an average over a set of sampled numbers.
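A minimal sketch of this idea in Python, under the setup of slide 4 (the function names are assumptions, not from the slides): draw samples z from the proposal Q, weight each by w(z) = P(z, e) / Q(z), and average the weights to estimate P(e).

```python
import random

def importance_sampling_estimate(sample_q, q_prob, p_joint, n_samples=10000):
    """Estimate P(e) = sum_z P(z, e) by averaging the importance weights
    w(z) = P(z, e) / Q(z) over samples z drawn from the proposal Q."""
    total = 0.0
    for _ in range(n_samples):
        z = sample_q()                    # draw z ~ Q
        total += p_joint(z) / q_prob(z)   # requires Q(z) > 0 wherever P(z, e) > 0
    return total / n_samples

# Toy usage: P(z, e) = 0.3 for z = 1 and 0.1 for z = 0, so P(e) = 0.4;
# the proposal Q is uniform over {0, 1}.
p = lambda z: 0.3 if z == 1 else 0.1
q = lambda z: 0.5
print(importance_sampling_estimate(lambda: random.randint(0, 1), q, p))
```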
Slide 6: Importance Sampling (Informally)
- Express the problem as computing the average over a set of real numbers
- Sample a subset of the real numbers
- Approximate the true average by the sample average
- True average: the average of (0.11, 0.24, 0.55, 0.77, 0.88, 0.99) = 0.59
- Sample average over 2 samples: the average of (0.24, 0.77) = 0.505
Slide 7: How to generate samples from Q
- Express Q in product form:
  Q(Z) = Q(Z1) Q(Z2|Z1) ... Q(Zn|Z1,...,Zn-1)
- Sample along the order Z1,...,Zn
- Example:
  - Q(Z1) = (0.2, 0.8)
  - Q(Z2|Z1) = (0.2, 0.8, 0.1, 0.9)
  - Q(Z3|Z1,Z2) = Q(Z3|Z1) = (0.5, 0.5, 0.3, 0.7)
Slide 8: How to sample from Q
Q(Z1) = (0.2, 0.8), Q(Z2|Z1) = (0.2, 0.8, 0.1, 0.9), Q(Z3|Z1,Z2) = Q(Z3|Z1) = (0.5, 0.5, 0.3, 0.7)
The domain of each variable is {0, 1}.
- Generate a random number r between 0 and 1
Which value to select for Z1?
[Figure: the interval [0, 1] split at 0.2; if r falls in [0, 0.2) select Z1 = 0, otherwise select Z1 = 1.]
Slide 9: How to sample from Q?
- Each sample Z = z:
  - Sample Z1 = z1 from Q(Z1)
  - Sample Z2 = z2 from Q(Z2|Z1 = z1)
  - Sample Z3 = z3 from Q(Z3|Z1 = z1)
- Generate N such samples
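A minimal sketch of this forward-sampling procedure in Python, using the example tables from slides 7 and 8 (the dictionary encoding of the tables is an assumption):

```python
import random

Q_Z1 = [0.2, 0.8]                                   # Q(Z1 = 0), Q(Z1 = 1)
Q_Z2_given_Z1 = {0: [0.2, 0.8], 1: [0.1, 0.9]}      # rows indexed by the value of Z1
Q_Z3_given_Z1 = {0: [0.5, 0.5], 1: [0.3, 0.7]}      # Q(Z3|Z1,Z2) = Q(Z3|Z1)

def sample_categorical(probs):
    """Pick a value by comparing a uniform random number to the cumulative
    probabilities, as on slide 8 (r < 0.2 selects value 0, otherwise value 1)."""
    r, cum = random.random(), 0.0
    for value, p in enumerate(probs):
        cum += p
        if r < cum:
            return value
    return len(probs) - 1

def sample_from_Q():
    """Draw one sample z = (z1, z2, z3) along the order Z1, Z2, Z3."""
    z1 = sample_categorical(Q_Z1)
    z2 = sample_categorical(Q_Z2_given_Z1[z1])
    z3 = sample_categorical(Q_Z3_given_Z1[z1])
    return z1, z2, z3

samples = [sample_from_Q() for _ in range(1000)]    # generate N such samples
```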
Slide 10: Likelihood weighting
- Q = Prior distribution = the CPTs of the Bayesian network
Slide 11: Likelihood weighting example
[Figure: a Bayesian network over Smoking (S), lung Cancer (C), Bronchitis (B), X-ray (X), and Dyspnoea (D), with CPTs P(S), P(C|S), P(B|S), P(X|C,S), P(D|C,B).]
P(S, C, B, X, D) = P(S) P(C|S) P(B|S) P(X|C,S) P(D|C,B)
Slide 12: Likelihood weighting example
[Figure: the same network, with Bronchitis clamped to the evidence value B = 0.]
- Q = Prior: Q(S, C, D) = Q(S) Q(C|S) Q(D|C, B=0) = P(S) P(C|S) P(D|C, B=0)
- Sample S = s from P(S)
- Sample C = c from P(C|S = s)
- Sample D = d from P(D|C = c, B = 0)
Slide 13: The Algorithm
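The algorithm itself is not reproduced in this text version of the slide. Below is a hedged sketch of likelihood weighting for the example network, with made-up placeholder CPT values (the numbers are not from the slides): non-evidence variables are sampled from their CPTs in topological order, and the sample weight is multiplied by the CPT entry of each evidence variable given its sampled parents.

```python
import random

# Placeholder CPTs for the Smoking / lung Cancer / Bronchitis / X-ray / Dyspnoea
# network; the numbers are illustrative only.
P_S = [0.5, 0.5]                                            # P(S)
P_C_given_S = {0: [0.9, 0.1], 1: [0.7, 0.3]}                # P(C|S)
P_B_given_S = {0: [0.8, 0.2], 1: [0.4, 0.6]}                # P(B|S)
P_X_given_CS = {(0, 0): [0.9, 0.1], (0, 1): [0.8, 0.2],
                (1, 0): [0.2, 0.8], (1, 1): [0.1, 0.9]}     # P(X|C,S)
P_D_given_CB = {(0, 0): [0.9, 0.1], (0, 1): [0.3, 0.7],
                (1, 0): [0.2, 0.8], (1, 1): [0.1, 0.9]}     # P(D|C,B)

def sample_cat(probs):
    r, cum = random.random(), 0.0
    for v, p in enumerate(probs):
        cum += p
        if r < cum:
            return v
    return len(probs) - 1

def likelihood_weighting(evidence, n_samples=10000):
    """Estimate P(evidence): sample non-evidence variables from their CPTs in
    topological order (S, C, B, X, D); for evidence variables, clamp the value
    and multiply the weight by its CPT entry given the sampled parents."""
    total_weight = 0.0
    for _ in range(n_samples):
        w, a = 1.0, {}
        for var, cpt in [('S', lambda a: P_S),
                         ('C', lambda a: P_C_given_S[a['S']]),
                         ('B', lambda a: P_B_given_S[a['S']]),
                         ('X', lambda a: P_X_given_CS[(a['C'], a['S'])]),
                         ('D', lambda a: P_D_given_CB[(a['C'], a['B'])])]:
            probs = cpt(a)
            if var in evidence:
                a[var] = evidence[var]
                w *= probs[evidence[var]]
            else:
                a[var] = sample_cat(probs)
        total_weight += w
    return total_weight / n_samples

# Example matching slide 12: estimate P(B = 0).
print(likelihood_weighting({'B': 0}))
```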
Slide 14: How to solve belief updating?
Slide 15: Difference between estimating P(E = e) and P(Xi = xi | E = e)
- The estimate of P(E = e) is unbiased
- The estimate of P(Xi = xi | E = e) is asymptotically unbiased
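A hedged reconstruction of the two standard likelihood-weighting estimators behind these labels (the exact formulas are not in this text version of the slides):

```latex
\hat{P}(e) = \frac{1}{N} \sum_{k=1}^{N} w(z^{(k)}),
\qquad
\hat{P}(x_i \mid e) =
  \frac{\sum_{k=1}^{N} w(z^{(k)}) \,\delta\!\left(z^{(k)}_{X_i} = x_i\right)}
       {\sum_{k=1}^{N} w(z^{(k)})}
```

The first is a plain average of importance weights and is therefore unbiased; the second is a ratio of two such estimates, so it is only asymptotically unbiased (consistent).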
Slide 16: Proposal Distribution: Which is better?
Slide 17: Outline
- Gibbs Sampling
- Advances in Gibbs sampling
- Blocking
- Cutset sampling (Rao-Blackwellisation)
- Importance Sampling
- Advances in Importance Sampling
- Particle Filtering
Slide 18: Research Issues in Importance Sampling
- Better proposal distribution
  - Likelihood weighting (Fung and Chang, 1990; Shachter and Peot, 1990)
  - AIS-BN (Cheng and Druzdzel, 2000)
  - Iterative Belief Propagation (Yuan and Druzdzel, 2003)
  - Iterative Join Graph Propagation and variable ordering (Gogate and Dechter, 2005)
Slide 19: Research Issues in Importance Sampling (Cheng and Druzdzel, 2000)
- Adaptive Importance Sampling
Slide 20: Adaptive Importance Sampling
- General case:
  - Given k proposal distributions
  - Take N samples from each distribution
  - Approximate P(e)
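A minimal sketch of this general scheme (the slide does not specify how the k per-proposal estimates are combined; simple averaging is assumed here):

```python
def adaptive_is_estimate(proposals, p_joint, n_per_proposal=1000):
    """Given k proposal distributions, each given as a (sample, probability)
    pair of functions, take N samples from each and combine the k
    importance-sampling estimates of P(e) by averaging them."""
    estimates = []
    for sample_q, q_prob in proposals:
        weights = [p_joint(z) / q_prob(z)
                   for z in (sample_q() for _ in range(n_per_proposal))]
        estimates.append(sum(weights) / n_per_proposal)
    return sum(estimates) / len(estimates)
```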
Slide 21: Estimating Q'(z)
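The estimation details are not in this text version of the slide. As a simplified, hypothetical sketch of how adaptive schemes update a proposal from weighted samples (real algorithms such as AIS-BN update conditional tables and use a learning rate):

```python
def estimate_q_prime(samples, weights, domain_size=2):
    """Estimate a fully factored updated proposal Q'(z) = prod_i Q'(z_i) from
    importance-weighted samples: Q'(Z_i = v) is the normalized weighted count
    of value v among the samples."""
    n_vars = len(samples[0])
    q_prime = [[0.0] * domain_size for _ in range(n_vars)]
    for z, w in zip(samples, weights):
        for i, v in enumerate(z):
            q_prime[i][v] += w
    for i in range(n_vars):
        total = sum(q_prime[i]) or 1.0
        q_prime[i] = [p / total for p in q_prime[i]]
    return q_prime
```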
Slide 22: Cutset importance sampling
(Gogate and Dechter, 2005) and (Bidyuk and Dechter, 2006)
- Divide the set of variables into two parts:
  - the cutset (C) and the remaining variables (R)
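A hedged sketch of the estimator behind this split (the standard Rao-Blackwellised form; the cited papers differ in the details): only the cutset is sampled, and the sum over the remaining variables R is computed exactly by inference, which typically reduces the variance of the estimate.

```latex
\hat{P}(e) = \frac{1}{N} \sum_{k=1}^{N} \frac{P(c^{(k)}, e)}{Q(c^{(k)})},
\qquad
P(c^{(k)}, e) = \sum_{r} P(c^{(k)}, r, e) \ \text{ (computed exactly over } R\text{)}
```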
Slide 23: Outline
- Gibbs Sampling
- Advances in Gibbs sampling
- Blocking
- Cutset sampling (Rao-Blackwellisation)
- Importance Sampling
- Advances in Importance Sampling
- Particle Filtering
Slide 24: Dynamic Belief Networks (DBNs)
[Figure: two consecutive time slices of a DBN, with state variables X_t, X_t+1, observation variables Y_t, Y_t+1, and transition arcs between the slices; below it, the DBN unrolled for t = 0 to 10 (X_0, X_1, X_2, ..., X_10 with observations Y_0, Y_1, Y_2, ..., Y_10).]
Slide 25: Query
- Compute P(X_0:t | Y_0:t) or P(X_t | Y_0:t)
- Example: P(X_0:10 | Y_0:10) or P(X_10 | Y_0:10)
- Hard over a long time period!
- Approximate! Sample!
Slide 26: Particle Filtering (PF)
- Also known as: condensation, sequential Monte Carlo, survival of the fittest
- PF can handle any type of probability distribution, non-linearity, and non-stationarity
- PFs are powerful sampling-based inference/learning algorithms for DBNs
Slide 27: Particle Filtering
On the whiteboard
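The algorithm presented on the whiteboard is not reproduced here. Below is a hedged sketch of a basic bootstrap particle filter for a DBN, with hypothetical transition and observation models passed in as functions: at every time step, each particle is propagated through the transition model, weighted by the likelihood of the new observation, and the particle set is resampled in proportion to the weights.

```python
import random

def particle_filter(observations, sample_x0, transition, obs_likelihood,
                    n_particles=1000):
    """Bootstrap particle filter sketch.  `observations` is the sequence
    y_1, ..., y_T; each particle is a sampled value of the state X_t.
    Returns particles approximately distributed as P(X_T | Y_1:T)."""
    particles = [sample_x0() for _ in range(n_particles)]    # samples of X_0
    for y in observations:
        # 1. Propagate: sample x_t ~ P(X_t | X_{t-1} = particle)
        particles = [transition(x) for x in particles]
        # 2. Weight: w_i proportional to P(y_t | X_t = particle_i)
        weights = [obs_likelihood(y, x) for x in particles]
        if sum(weights) == 0:             # degenerate case: fall back to uniform
            weights = [1.0] * n_particles
        # 3. Resample particles in proportion to their weights
        particles = random.choices(particles, weights=weights, k=n_particles)
    return particles
```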