Selecting Observations against Adversarial Objectives - PowerPoint PPT Presentation

Transcript and Presenter's Notes

1
Selecting Observations against Adversarial
Objectives
  • Andreas Krause
  • Brendan McMahan
  • Carlos Guestrin
  • Anupam Gupta

2
Observation selection problems
Detect contaminations in water networks
Place sensors for building automation
Monitor rivers, lakes using robots
  • Set V of possible observations (sensor
    locations, ...)
  • Want to pick a subset A ⊆ V maximizing the
    utility F(A), subject to a budget |A| ≤ k
  • For most interesting utilities F, NP-hard!

3
Key observation: Diminishing returns
Placement B = {S1, ..., S5}
Placement A = {S1, S2}
Adding S will help a lot!
Adding S doesn't help much
New sensor S
Formalization (Submodularity): For A ⊆ B,
F(A ∪ {S}) − F(A) ≥ F(B ∪ {S}) − F(B)
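The definition is easy to check numerically for a simple coverage utility. Below is a minimal Python sketch; the coverage function and the sensor regions are made-up toy data used only to illustrate the diminishing-returns inequality, not anything from the talk.

```python
# Minimal sketch: diminishing returns for a coverage utility F(A) = "cells covered by A".
# The sensor coverage regions below are made-up toy data, only to illustrate the definition.

def coverage(selected, regions):
    """F(A) = number of distinct cells covered by the sensors in A."""
    covered = set()
    for s in selected:
        covered |= regions[s]
    return len(covered)

regions = {            # hypothetical coverage regions (grid cells) of four candidate sensors
    "S1": {1, 2, 3},
    "S2": {3, 4},
    "S3": {4, 5, 6},
    "S4": {6, 7},
}

A = {"S1"}                 # smaller placement A
B = {"S1", "S2", "S3"}     # larger placement B, with A ⊆ B
S = "S4"                   # new sensor S

gain_A = coverage(A | {S}, regions) - coverage(A, regions)   # F(A ∪ {S}) − F(A)
gain_B = coverage(B | {S}, regions) - coverage(B, regions)   # F(B ∪ {S}) − F(B)
assert gain_A >= gain_B    # submodularity: the smaller set benefits at least as much
print(gain_A, gain_B)      # prints 2 1
```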
4
Submodularity (with Guestrin, Singh, Leskovec,
VanBriesen, Faloutsos, Glance)
  • We prove submodularity for
  • Mutual information: F(A) = H(unobs) − H(unobs | A)
  • [UAI '05, JMLR '07] (Spatial prediction)
  • Outbreak detection: F(A) = impact reduction when
    sensing at A
  • [KDD '07] (Water monitoring, ...)
  • Also submodular:
  • Geometric coverage: F(A) = area covered
  • Variance reduction: F(A) = Var(Y) − Var(Y | A)

5
Why is submodularity useful?
Greedy Algorithm (forward selection)
  • Theorem [Nemhauser et al. '78]:
  • The greedy algorithm gives a constant-factor
    approximation: F(Agreedy) ≥ (1 − 1/e) F(Aopt)
    (see the sketch below)
  • Can get online (data-dependent) bounds for any
    algorithm
  • Can significantly speed up the greedy algorithm
  • Can use MIP / branch & bound for optimal solutions
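A minimal sketch of the forward-selection greedy algorithm referenced above, assuming F is any monotone submodular set function given as a Python callable (for example the toy coverage utility from the previous sketch) and k is the sensing budget.

```python
def greedy(F, V, k):
    """Forward selection for a monotone submodular F: repeatedly add the element
    with the largest marginal gain.  Nemhauser et al. '78: F(A_greedy) >= (1 - 1/e) F(A_opt)."""
    A = set()
    for _ in range(k):
        best, best_gain = None, 0.0
        for s in V - A:
            gain = F(A | {s}) - F(A)        # marginal benefit of adding s
            if gain > best_gain:
                best, best_gain = s, gain
        if best is None:                    # no element improves F any further
            break
        A.add(best)
    return A

# Usage with the toy coverage utility from the previous sketch:
# greedy(lambda A: coverage(A, regions), set(regions), k=2)
```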

6
Robust observation selection
  • What if
  • parameters θ of the model P(XV | θ) are unknown /
    change?
  • sensors fail?
  • an adversary selects the outbreak scenario?

(Figure: best placement for parameters θold; more variability here now under θnew; attack here!)
7
Robust prediction
(Figure: confidence bands for pH value over horizontal positions V)
Low average variance (MSE), but high maximum
variance (in the most interesting part!)
Typical objective: minimize average variance (MSE)
  • Instead: minimize the width of the confidence bands
  • For every location s ∈ V, define Fs(A) = Var(s) −
    Var(s | A)
  • Minimizing the width ⇔ simultaneously maximizing all
    Fs(A)
  • Each Fs(A) is (often) submodular! [Das & Kempe '07]

8
Adversarial observation selection
  • Given
  • Possible observations V,
  • Submodular functions F1, ..., Fm
  • Want to solve: max over |A| ≤ k of min_i Fi(A)
  • Can model many problems this way
  • Width of confidence bands: Fi is the variance
    reduction at location i
  • Unknown parameters: Fi is the information gain
    under parameters θi
  • Adversarial outbreak scenarios: Fi is the utility
    for scenario i
  • Unfortunately, min_i Fi(A) is not submodular!

(One Fi for each location i)
9
How does greedy do?
Set A    F1   F2   min_i Fi
{x}      1    0    0
{y}      0    2    0
{z}      ε    ε    ε
{x, y}   1    2    1
{x, z}   1    ε    ε
{y, z}   ε    2    ε
Greedy picks z first
Optimal solution (k = 2): {x, y}
Then, greedy can choose only x or y
  • ⇒ Greedy does arbitrarily badly. Is there
    something better?

Theorem: The problem max over |A| ≤ k of min_i Fi(A)
does not admit any approximation unless P = NP.
10
Alternative formulation
  • If somebody told us the optimal value c,
  • can we recover the optimal solution A?
  • Need to solve the dual problem: find the smallest
    set A with min_i Fi(A) ≥ c
  • Is this any easier?
  • Yes, if we relax the constraint |A| ≤ k

11
Solving the alternative problem
  • Trick: For each Fi and c, define the truncation
    F̂i,c(A) = min{Fi(A), c}

(Figure: Fi(A) and its truncation F̂i,c(A) plotted against A)
Set      F1   F2   F̂1,1  F̂2,1  Favg,1     min_i Fi
{x}      1    0    1     0     1/2        0
{y}      0    2    0     1     1/2        0
{z}      ε    ε    ε     ε     ε          ε
{x, y}   1    2    1     1     1          1
{x, z}   1    ε    1     ε     (1+ε)/2    ε
{y, z}   ε    2    ε     1     (1+ε)/2    ε
Lemma:
min_i Fi(A) ≥ c ⇔ Favg,c(A) = c,
where Favg,c(A) = (1/m) Σi F̂i,c(A)
Favg,c(A) is submodular!
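A minimal sketch of the truncation trick and the lemma, using a toy pair of submodular objectives chosen to match the example tables (each Fi takes the best value among the selected elements); the numeric value of ε is arbitrary.

```python
EPS = 1e-3   # the ε of the example tables (any small positive value)

VAL1 = {"x": 1.0, "y": 0.0, "z": EPS}   # element values defining the toy objective F1
VAL2 = {"x": 0.0, "y": 2.0, "z": EPS}   # element values defining the toy objective F2

def F1(A):   # "best selected element" utilities; monotone and submodular
    return max((VAL1[s] for s in A), default=0.0)

def F2(A):
    return max((VAL2[s] for s in A), default=0.0)

def F_avg(A, Fs, c):
    """F_avg,c(A) = (1/m) * sum_i min(F_i(A), c); submodular whenever each F_i is."""
    return sum(min(F(A), c) for F in Fs) / len(Fs)

Fs, c = [F1, F2], 1.0
for A in [{"x"}, {"y"}, {"z"}, {"x", "y"}, {"x", "z"}]:
    # Lemma: min_i F_i(A) >= c exactly when F_avg,c(A) == c
    print(sorted(A), F_avg(A, Fs, c), min(F(A) for F in Fs) >= c)
```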
12
Why is this useful?
  • Can use the greedy algorithm to find an
    (approximate) solution!
  • Proposition: The greedy algorithm finds
  • AG with |AG| ≤ α k and Favg,c(AG) ≥ c,
  • where α = 1 + log maxs Σi Fi({s})

13
Back to our example
Set      F1   F2   min_i Fi   Favg,1
{x}      1    0    0          1/2
{y}      0    2    0          1/2
{z}      ε    ε    ε          ε
{x, y}   1    2    1          1
{x, z}   1    ε    ε          (1+ε)/2
{y, z}   ε    2    ε          (1+ε)/2
  • Guess c = 1
  • First pick x
  • Then pick y
  • ⇒ Optimal solution!
  • How do we find c?

14
Submodular Saturation Algorithm
  • Given: set V, integer k, and functions F1, ..., Fm
  • Initialize cmin = 0, cmax = min_i Fi(V)
  • Do binary search: c = (cmin + cmax)/2
  • Use the greedy algorithm to find AG such that
    Favg,c(AG) = c
  • If |AG| > α k: decrease cmax
  • If |AG| ≤ α k: increase cmin
  • until convergence

|AG| ≤ α k ⇒ c too low
|AG| > α k ⇒ c too high
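A minimal sketch of the Saturate loop under the assumptions above: a greedy partial-cover subroutine adds elements until the truncated average Favg,c reaches c, and a binary search on c keeps the largest achievable value whose greedy cover is not too large. It reuses F1, F2, and F_avg from the previous sketch; α is left as a plain parameter (the slides' value is 1 + log maxs Σi Fi({s})).

```python
def greedy_partial_cover(Fs, V, c, tol=1e-12):
    """Greedily add elements until F_avg,c(A) reaches c (greedy partial cover)."""
    A = set()
    while F_avg(A, Fs, c) < c - tol:
        candidates = V - A
        if not candidates:
            return None           # exhausted V without reaching c
        gain = lambda s: F_avg(A | {s}, Fs, c) - F_avg(A, Fs, c)
        best = max(candidates, key=gain)
        if gain(best) <= 0:       # c cannot be reached; signal infeasibility
            return None
        A.add(best)
    return A

def saturate(Fs, V, k, alpha=1.0, iters=30):
    """Binary search on c; keep the best set whose size stays within alpha * k."""
    c_min, c_max = 0.0, min(F(V) for F in Fs)
    best = set()
    for _ in range(iters):
        c = (c_min + c_max) / 2.0
        A = greedy_partial_cover(Fs, V, c)
        if A is not None and len(A) <= alpha * k:
            best, c_min = A, c    # c is achievable with a small enough set: try a larger c
        else:
            c_max = c             # the cover is too large (or infeasible): try a smaller c
    return best

# On the toy example: saturate([F1, F2], {"x", "y", "z"}, k=2) returns {"x", "y"}.
```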
15
Theoretical guarantees
Theorem: The problem max over |A| ≤ k of min_i Fi(A)
does not admit any approximation unless P = NP.
Theorem: Saturate finds a solution AS such that
min_i Fi(AS) ≥ OPTk and |AS| ≤ α k,
where OPTk = max over |A| ≤ k of min_i Fi(A) and
α = 1 + log maxs Σi Fi({s}).
  • Theorem: If there were a polytime algorithm with
    a better constant β < α, then NP ⊆ DTIME(n^(log
    log n))

16
Experiments
  • Minimizing maximum variance in GP regression
  • Robust biological experimental design
  • Outbreak detection against adversarial
    contaminations
  • Goals
  • Compare against state of the art
  • Analyze appropriateness of worst-case assumption

17
Spatial prediction
(Figures: environmental monitoring and precipitation data)
  • Compare to the state of the art [Sacks et al. '88,
    Wiens '05, ...]
  • Highly tuned simulated annealing heuristics (7
    parameters)
  • Saturate is competitive and faster, and better on
    larger problems

18
Maximum vs. average variance
(Figures: environmental monitoring and precipitation data)
  • Minimizing the worst-case leads to good
    average-case score, not vice versa

19
Outbreak detection
(Figures: water networks)
  • Results even more prominent on water network
    monitoring (12,527 nodes)

20
Robust experimental design
  • Learn parameters θ of a nonlinear function:
  • yi = f(xi, θ) + w
  • Choose stimuli xi to facilitate MLE of θ
  • Difficult optimization problem!
  • Common approach: linearization!
  • yi ≈ f(xi, θ0) + ∇f_θ0(xi)^T (θ − θ0) + w
  • Allows a nice closed-form (fractional) solution!
  • How should we choose θ0? (See the linearization
    sketch below.)
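As an illustration of the linearization step, here is a minimal sketch using the Michaelis-Menten response f(x, θ) = θ1 x / (θ2 + x) that appears in the experimental setup; the values of θ0, θ, and the stimulus x are made up for the example.

```python
import numpy as np

def f(x, theta):
    """Michaelis-Menten response: f(x, θ) = θ1 * x / (θ2 + x)."""
    t1, t2 = theta
    return t1 * x / (t2 + x)

def grad_f(x, theta):
    """Jacobian of f w.r.t. θ: [x / (θ2 + x),  -θ1 * x / (θ2 + x)**2]."""
    t1, t2 = theta
    return np.array([x / (t2 + x), -t1 * x / (t2 + x) ** 2])

theta0 = np.array([1.0, 0.5])   # hypothetical initial parameter estimate θ0
theta = np.array([1.2, 0.4])    # hypothetical "true" parameters θ
x = 2.0                         # one candidate stimulus

# Linearization around θ0:  y ≈ f(x, θ0) + ∇f_θ0(x)^T (θ − θ0)   (plus noise w)
approx = f(x, theta0) + grad_f(x, theta0) @ (theta - theta0)
print(f(x, theta), approx)      # 1.0 vs. 0.992: close when θ is near θ0
```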

21
Robust experimental design
  • State of the art [Flaherty et al., NIPS '06]:
  • Assume perturbation on the Jacobian ∇f_θ0(xi)
  • Solve a robust SDP against the worst-case
    perturbation
  • Minimize the maximum eigenvalue of the estimation
    error (E-optimality)
  • This paper:
  • Assume perturbation of the initial parameter
    estimate θ0
  • Use Saturate to perform well against all initial
    parameter estimates
  • Minimize the MSE of the parameter estimate (Bayesian
    A-optimality, typically submodular!)

22
Experimental setup
  • Estimate parameters of the Michaelis-Menten model (to
    compare results)
  • Evaluate the efficiency of designs:

Efficiency = (loss of the optimal design, knowing the
true parameter θtrue) / (loss of the robust design,
assuming a possibly wrong initial parameter θ0)
23
Robust design results
(Figure: efficiency of designs A, B, C under low and high uncertainty in θ0)
  • Saturate is more efficient than the SDP approach
    when optimizing for high parameter uncertainty

24
Future (current) work
  • Incorporating complex constraints (communication,
    etc.)
  • Dealing with large numbers of objectives
  • Constraint generation
  • Improved guarantees for certain objectives
    (sensor failures)
  • Trading off worst-case and average-case scores

25
Conclusions
  • Many observation selection problems require
    optimizing an adversarially chosen submodular
    function
  • The problem is not approximable to any factor
    (unless P = NP)!
  • Presented an efficient algorithm: Saturate
  • Achieves optimal score, with bounded increase in
    cost
  • Guarantees are best possible under reasonable
    complexity assumptions
  • Saturate performs well on real-world problems
  • Outperforms state-of-the-art simulated annealing
    algorithms for sensor placement, with no parameters
    to tune
  • Compares favorably with SDP-based solutions for
    robust experimental design