1
Importance Sampling
  • ICS 276
  • Fall 2007
  • Rina Dechter

2
Outline
  • Gibbs Sampling
  • Advances in Gibbs sampling
  • Blocking
  • Cutset sampling (Rao-Blackwellisation)
  • Importance Sampling
  • Advances in Importance Sampling
  • Particle Filtering

3
Importance Sampling Theory
4
Importance Sampling Theory
  • Given a distribution Q, called the proposal
    distribution (such that P(Z=z, e) > 0 implies Q(Z=z) > 0)

w(Z=z) = P(Z=z, e) / Q(Z=z) is called the importance weight
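Written out, the identity this rests on (standard importance sampling, with w as defined above):

  P(e) = sum_z P(z, e) = sum_z [P(z, e) / Q(z)] Q(z) = E_Q[w(Z)] ≈ (1/N) sum_{k=1..N} w(z^(k)),  z^(k) ~ Q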
5
Importance Sampling Theory
Underlying principle: approximate the average over a
set of numbers by the average over a sampled subset
of those numbers
6
Importance Sampling (Informally)
  • Express the problem as computing the average over
    a set of real numbers
  • Sample a subset of real numbers
  • Approximate the true average by the sample average
  • True average:
  • Average of (0.11, 0.24, 0.55, 0.77, 0.88, 0.99) = 0.59
  • Sample average over 2 samples:
  • Average of (0.24, 0.77) = 0.505

7
How to generate samples from Q
  • Express Q in product form
  • Q(Z) = Q(Z1) Q(Z2|Z1) ... Q(Zn|Z1,...,Zn-1)
  • Sample along the order Z1,...,Zn
  • Example:
  • Q(Z1) = (0.2, 0.8)
  • Q(Z2|Z1=0) = (0.2, 0.8), Q(Z2|Z1=1) = (0.1, 0.9)
  • Q(Z3|Z1,Z2) = Q(Z3|Z1): Q(Z3|Z1=0) = (0.5, 0.5), Q(Z3|Z1=1) = (0.3, 0.7)

8
How to sample from Q
Q(Z1) = (0.2, 0.8); Q(Z2|Z1) and Q(Z3|Z1) as on the previous slide
The domain of each variable is {0, 1}
  • Generate a random number r between 0 and 1
  • Which value to select for Z1?
[Figure: the interval from 0 to 1 split at 0.2; select Z1 = 0 if r < 0.2, otherwise Z1 = 1]
9
How to sample from Q?
  • Each sample Z = z:
  • Sample Z1 = z1 from Q(Z1)
  • Sample Z2 = z2 from Q(Z2|Z1 = z1)
  • Sample Z3 = z3 from Q(Z3|Z1 = z1)
  • Generate N such samples (a sketch follows below)
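A minimal sketch of this procedure in Python, using the example distributions from the previous slides; the helper sample_from implements the random-number selection of the previous slide, and all names here are ours:

import random

def sample_from(dist):
    """Pick a value from a discrete distribution using one uniform draw:
    return the first value whose cumulative probability exceeds r."""
    r = random.random()          # uniform in [0, 1)
    cum = 0.0
    for value, p in enumerate(dist):
        cum += p
        if r < cum:
            return value
    return len(dist) - 1         # guard against floating-point round-off

Q_Z1 = (0.2, 0.8)                        # Q(Z1)
Q_Z2 = {0: (0.2, 0.8), 1: (0.1, 0.9)}    # Q(Z2 | Z1)
Q_Z3 = {0: (0.5, 0.5), 1: (0.3, 0.7)}    # Q(Z3 | Z1), independent of Z2

def sample_Q():
    """Draw one full sample Z = (z1, z2, z3) along the order Z1, Z2, Z3."""
    z1 = sample_from(Q_Z1)
    z2 = sample_from(Q_Z2[z1])
    z3 = sample_from(Q_Z3[z1])
    return (z1, z2, z3)

samples = [sample_Q() for _ in range(1000)]   # generate N = 1000 samples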

10
Likelihood weighting
  • Q = prior distribution = the CPTs of the Bayesian network
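With this choice of Q, the prior factors cancel in w(z) = P(z, e) / Q(z), leaving the product of the evidence CPT entries (the standard likelihood weighting weight; E denotes the evidence variables):

  w(z) = P(z, e) / Q(z) = prod_{Ei in E} P(ei | pa(Ei))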

11
Likelihood weighting example
[Figure: Bayesian network with nodes Smoking (S), lung Cancer (C), Bronchitis (B), X-ray (X), Dyspnoea (D) and CPTs P(S), P(C|S), P(B|S), P(X|C,S), P(D|C,B)]
P(S, C, B, X, D) = P(S) P(C|S) P(B|S) P(X|C,S) P(D|C,B)
12
Likelihood weighting example
[Figure: the same network, with evidence B = 0]
Q = Prior: Q(S, C, D) = Q(S) Q(C|S) Q(D|C, B=0) = P(S) P(C|S) P(D|C, B=0)
Sample S = s from P(S)
Sample C = c from P(C|S = s)
Sample D = d from P(D|C = c, B = 0)
13
The Algorithm
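The algorithm slide itself is a figure; a minimal likelihood weighting sketch in Python, assuming a topologically ordered network representation and reusing sample_from from the earlier sketch (the representation and all names here are ours):

def likelihood_weighting(variables, evidence, N):
    """Generate N weighted samples from a Bayesian network.

    variables: topologically ordered list of (name, cpt) pairs, where
               cpt(assignment) returns the distribution P(X | parents)
               as a tuple of probabilities, indexed by value, given the
               partial assignment of the already-sampled parents.
    evidence:  dict mapping observed variable names to values.
    Returns a list of (assignment, weight) pairs.
    """
    samples = []
    for _ in range(N):
        assignment, weight = {}, 1.0
        for name, cpt in variables:
            dist = cpt(assignment)          # P(X | sampled parents)
            if name in evidence:
                value = evidence[name]      # clamp the evidence value...
                weight *= dist[value]       # ...and multiply in its likelihood
            else:
                value = sample_from(dist)   # sample non-evidence variables
            assignment[name] = value
        samples.append((assignment, weight))
    return samples

# P(e) is then estimated by the average weight:
# p_e = sum(w for _, w in samples) / len(samples)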
14
How to solve belief updating?
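A sketch of the standard estimators, assuming N weighted samples z^(1), ..., z^(N) drawn from Q:

  P(e) ≈ (1/N) sum_{k=1..N} w(z^(k))
  P(xi | e) ≈ [ sum_{k} w(z^(k)) I(z_i^(k) = xi) ] / [ sum_{k} w(z^(k)) ]

where I(.) is the indicator function.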
15
Difference between estimating P(E=e) and
P(Xi=xi | E=e)
  • P(E=e): the average of the sample weights is an unbiased estimate
  • P(Xi=xi | E=e): a ratio of two weighted averages, hence only asymptotically unbiased
16
Proposal Distribution: Which is better?
17
Outline
  • Gibbs Sampling
  • Advances in Gibbs sampling
  • Blocking
  • Cutset sampling (Rao-Blackwellisation)
  • Importance Sampling
  • Advances in Importance Sampling
  • Particle Filtering

18
Research Issues in Importance Sampling
  • Better Proposal Distribution
  • Likelihood weighting
  • Fung and Chang, 1990; Shachter and Peot, 1990
  • AIS-BN
  • Cheng and Druzdzel, 2000
  • Iterative Belief Propagation
  • Yuan and Druzdzel, 2003
  • Iterative Join Graph Propagation and variable
    ordering
  • Gogate and Dechter, 2005

19
Research Issues in Importance Sampling (Cheng and Druzdzel, 2000)
  • Adaptive Importance Sampling

20
Adaptive Importance Sampling
  • General case
  • Given k proposal distributions
  • Take N samples from each distribution
  • Approximate P(e) by combining the weighted samples (see the sketch below)
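One natural combined estimator, assuming proposals Q_1, ..., Q_k with N samples z^(j,1), ..., z^(j,N) drawn from each Q_j (a sketch of the general scheme, not necessarily the exact formula on the slide):

  P(e) ≈ (1/(kN)) sum_{j=1..k} sum_{i=1..N} P(z^(j,i), e) / Q_j(z^(j,i))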

21
Estimating Q'(z)
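In AIS-BN, Q is re-estimated from the samples drawn so far. A hedged sketch of the update (our rendering of the general scheme; η(k) is a learning rate and P'(xi | pa_i) is estimated from the current weighted samples):

  Q^(k+1)(xi | pa_i) = Q^(k)(xi | pa_i) + η(k) [ P'(xi | pa_i) - Q^(k)(xi | pa_i) ]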
22
Cutset importance sampling
(Gogate and Dechter, 2005) and (Bidyuk and Dechter, 2006)
  • Divide the set of variables into two parts:
  • Cutset (C) and Remaining variables (R)
23
Outline
  • Gibbs Sampling
  • Advances in Gibbs sampling
  • Blocking
  • Cutset sampling (Rao-Blackwellisation)
  • Importance Sampling
  • Advances in Importance Sampling
  • Particle Filtering

24
Dynamic Belief Networks (DBNs)
[Figure: two time slices connected by transition arcs; the Bayesian network at time t has state Xt with observation Yt, and the network at time t+1 has Xt+1 with Yt+1]
[Figure: unrolled DBN for t = 0 to 10, with states X0, X1, X2, ..., X10 and observations Y0, Y1, Y2, ..., Y10]
25
Query
  • Compute P(X0:t | Y0:t) or P(Xt | Y0:t)
  • Example: P(X0:10 | Y0:10) or P(X10 | Y0:10)
  • Hard over a long time period!
  • Approximate! Sample!

26
Particle Filtering (PF)
  • Also known as: condensation, sequential Monte Carlo, survival of the fittest
  • PF can handle arbitrary probability distributions, non-linearity, and non-stationarity
  • PFs are powerful sampling-based inference/learning algorithms for DBNs

27
Particle Filtering
On white board
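The slide defers the algorithm to the whiteboard; a minimal bootstrap particle filter sketch in Python, assuming user-supplied prior, transition, and likelihood models (all names here are ours):

import random

def particle_filter(prior, transition, likelihood, observations, N=1000):
    """Bootstrap particle filter approximating P(Xt | Y0:t) in a DBN.

    prior():          samples x0 from P(X0)
    transition(x):    samples x_{t+1} from P(X_{t+1} | Xt = x)
    likelihood(y, x): returns P(Yt = y | Xt = x)
    """
    particles = [prior() for _ in range(N)]
    for t, y in enumerate(observations):
        if t > 0:
            # Propagate each particle through the transition model
            particles = [transition(x) for x in particles]
        # Weight each particle by the likelihood of the observation
        weights = [likelihood(y, x) for x in particles]
        # Resample ("survival of the fittest"): high-weight particles
        # are duplicated, low-weight ones die; weights reset to uniform
        particles = random.choices(particles, weights=weights, k=N)
    return particles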