Title: Graph Sparsification by Effective Resistances
1. Graph Sparsification by Effective Resistances
- Daniel Spielman
- Nikhil Srivastava
- Yale
2. Sparsification
- Approximate any graph G by a sparse graph H.
- A nontrivial statement about G.
- H is faster to compute with than G.
[Figure: a dense graph G and a sparse approximation H]
3. Cut Sparsifiers [Benczúr–Karger '96]
- H approximates G if, for every cut S ⊆ V, the sum of weights of edges leaving S is preserved:
  (1 − ε) · cut_G(S) ≤ cut_H(S) ≤ (1 + ε) · cut_G(S).
- Can find such an H with O(n log n / ε²) edges in nearly-linear time.
[Figure: a cut S separating the vertices, shown in both G and H]
4. The Laplacian (quick review)
- Quadratic form: x^T L_G x = Σ_{(u,v) ∈ E} c_{uv} (x_u − x_v)².
- Positive semidefinite: x^T L_G x ≥ 0 for all x.
- Ker(L_G) = span(1) if G is connected.
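The quadratic-form identity above can be checked directly. A minimal sketch in plain Python (no external libraries), using a small weighted triangle as the example graph:

```python
# Build the graph Laplacian and check x^T L x = sum_e c_uv (x_u - x_v)^2.

def laplacian(n, edges):
    """L[u][v] = -c_uv off-diagonal, weighted degree on the diagonal."""
    L = [[0.0] * n for _ in range(n)]
    for u, v, c in edges:
        L[u][u] += c
        L[v][v] += c
        L[u][v] -= c
        L[v][u] -= c
    return L

def quad_form(L, x):
    return sum(x[i] * sum(L[i][j] * x[j] for j in range(len(x)))
               for i in range(len(x)))

edges = [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 3.0)]  # a weighted triangle
L = laplacian(3, edges)
x = [1.0, -2.0, 0.5]

# x^T L x equals the sum over edges of c_uv (x_u - x_v)^2
direct = sum(c * (x[u] - x[v]) ** 2 for u, v, c in edges)
assert abs(quad_form(L, x) - direct) < 1e-9

# L * 1 = 0: the all-ones vector spans the kernel for a connected graph
assert all(abs(sum(row)) < 1e-9 for row in L)
```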
5. Cuts and the Quadratic Form
- For the characteristic vector x_S (x_u = 1 if u ∈ S, else 0): x_S^T L_G x_S = total weight of edges leaving S.
- So BK says: (1 − ε) x^T L_G x ≤ x^T L_H x ≤ (1 + ε) x^T L_G x for all x ∈ {0, 1}^V.
6. A Stronger Notion
- Require the same inequality for every real vector, not just characteristic vectors:
  (1 − ε) x^T L_G x ≤ x^T L_H x ≤ (1 + ε) x^T L_G x for all x ∈ R^V.
7–9. Consequence 1: All Eigenvalues Are Preserved
- By Courant–Fischer, λ_i(L_H) ∈ [(1 − ε) λ_i(L_G), (1 + ε) λ_i(L_G)] for every i.
- G and H have similar eigenvalues.
- For spectral purposes, G and H are equivalent.
- cf. matrix sparsifiers [AM01, FKV04, AHK05]
10. Consequence 2: Linear System Solvers
- Iterative methods solve Ax = b using only multiplications by A.
- Cost ≈ (number of iterations) × (time to multiply by A), so sparsity makes each step cheap.
11–13. Consequence 2: Preconditioning
- Find an easy matrix B that approximates A.
- Solve the preconditioned system B^{-1}A x = B^{-1}b instead.
- Cost ≈ (number of iterations) × (time to solve in B + time to multiply by A).
- Use B = L_H for a sparsifier H of G?
- Spielman–Teng [STOC '04]: nearly-linear-time solvers for Laplacian systems along these lines.
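As a toy illustration of the preconditioning idea, the sketch below solves a small linear system by iterating x ← x + B^{-1}(b − Ax) with an "easy" matrix B, here the Jacobi choice B = diag(A). The 2×2 matrix and iteration count are illustrative choices, not from the talk:

```python
# Preconditioned iteration sketch: each step costs one multiply by A plus
# one (trivial) solve in B = diag(A).
A = [[2.0, -1.0], [-1.0, 2.0]]   # a grounded triangle Laplacian
b = [1.0, 0.0]

x = [0.0, 0.0]
for _ in range(100):
    r = [b[i] - sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]
    x = [x[i] + r[i] / A[i][i] for i in range(2)]   # solve in B = diag(A)

# Converges because B approximates A well enough here.
assert abs(x[0] - 2.0 / 3.0) < 1e-9
assert abs(x[1] - 1.0 / 3.0) < 1e-9
```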
14–16. Example: Sparsify the Complete Graph by a Ramanujan Expander
- G is the complete graph on n vertices: every nonzero eigenvalue of L_G is n.
- H is a d-regular Ramanujan graph: every nonzero eigenvalue of L_H lies within 2√(d−1) of d.
- So (n/d) · H is a good sparsifier for G. Each edge has weight n/d.
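The eigenvalue claim for the complete graph is easy to verify without an eigensolver, since L_{K_n} = n·I − J: every vector orthogonal to the all-ones vector is an eigenvector with eigenvalue n. A small check (n = 6 is an arbitrary choice):

```python
# For K_n, L = n*I - J, so L x = n x for any x orthogonal to the ones vector.
n = 6
L = [[n - 1 if i == j else -1 for j in range(n)] for i in range(n)]

x = [1, -1, 2, -2, 3, -3]            # orthogonal to the all-ones vector
Lx = [sum(L[i][j] * x[j] for j in range(n)) for i in range(n)]
assert all(Lx[i] == n * x[i] for i in range(n))
```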
17–19. Example: Dumbbell
- G: two copies of K_n joined by a single edge e of weight 1.
- Sparsify each K_n as before: a d-regular Ramanujan graph with each edge weight n/d.
- But H must include the cut edge e. For x equal to 1 on one clique and 0 on the other, only e contributes to x^T L_G x; if e ∉ H, then x^T L_H x = 0 and no multiplicative approximation holds.
20–23. Main Theorem
- Every G = (V, E, c) contains a weighted subgraph H = (V, F, d) with O(n log n / ε²) edges such that, for all x,
  (1 − ε) x^T L_G x ≤ x^T L_H x ≤ (1 + ε) x^T L_G x.
- Can find H in nearly-linear time by random sampling.
- Improves BK96 (cuts only). Improves the O(n log^c n) sparsifiers of ST04.
24–30. Effective Resistance
- Identify each edge of G with a unit resistor.
- R_e is the effective resistance between the endpoints of e.
- Example: triangle on {u, a, v}. The path u–a–v has resistance 1 + 1 = 2; it is in parallel with the unit edge u–v, so R_uv = (1 · 2)/(1 + 2) = 2/3.
- Flowing one unit of current from u to v induces potentials φ_u = 1/3, φ_a = 0, φ_v = −1/3.
- Equivalently: R_e is the potential difference ΔV between the endpoints of e when one unit of current flows from one endpoint to the other.
- Effective resistances also govern commute times of random walks [Chandra et al., STOC '89].
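The triangle example can be reproduced numerically: solve L φ = b for the potentials with one vertex grounded, then read off the potential difference. A self-contained sketch in plain Python (the small Gaussian-elimination helper is ad hoc, not part of the talk; grounding v shifts the symmetric potentials by a constant, which changes nothing):

```python
# Effective resistance via a grounded Laplacian solve.

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Triangle on vertices u=0, a=1, v=2, unit resistors on all three edges.
L = [[2.0, -1.0, -1.0], [-1.0, 2.0, -1.0], [-1.0, -1.0, 2.0]]

# Flow one unit of current from u to v; ground v (delete its row/column).
phi = solve([row[:2] for row in L[:2]], [1.0, 0.0])  # potentials of u and a
R_uv = phi[0] - 0.0                                  # phi_v = 0 after grounding
assert abs(R_uv - 2.0 / 3.0) < 1e-9                  # path (2) parallel edge (1)
assert abs(phi[1] - 1.0 / 3.0) < 1e-9                # a sits midway on the path
```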
31. The Algorithm
- Sample edges of G with probability p_e proportional to c_e R_e.
- If e is chosen, include it in H with weight c_e / p_e.
- Take q = O(n log n / ε²) samples with replacement.
- Divide all weights by q.
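A sketch of the sampling scheme on K_5, where by symmetry every edge has the same effective resistance (R_e = 2/n = 2/5), so the sampling distribution is uniform. The value q = 5000 and the seed are illustrative choices standing in for q = O(n log n / ε²):

```python
# Sample-and-reweight sparsification of K_5, then compare quadratic forms.
import random

random.seed(1)
n = 5
edges = [(u, v) for u in range(n) for v in range(u + 1, n)]  # unit weights
p = {e: 1.0 / len(edges) for e in edges}       # p_e proportional to c_e R_e

q = 5000
weight = {e: 0.0 for e in edges}
for _ in range(q):
    e = random.choices(edges, weights=[p[f] for f in edges])[0]
    weight[e] += 1.0 / (q * p[e])              # weight c_e / p_e, divided by q

def quad(ws, x):
    """x^T L x for the weighted edge set ws."""
    return sum(w * (x[u] - x[v]) ** 2 for (u, v), w in ws.items())

x = [0.3, -1.2, 0.7, 2.0, -0.5]
qG = quad({e: 1.0 for e in edges}, x)
qH = quad(weight, x)
assert abs(qH - qG) / qG < 0.2                 # H approximates G's quadratic form
```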
32–38. An Algebraic Expression for R_e
- Orient G arbitrarily.
- Signed incidence matrix B (m × n): the row for edge e = (u, v) is b_e = χ_u − χ_v.
- Write the Laplacian as L = B^T B.
- Then R_e = b_e^T L^+ b_e, where L^+ is the Moore–Penrose pseudoinverse of L.
- Collect these into Π = B L^+ B^T, so that Π(e, e) = R_e.
- This reduces the theorem to a statement about Π.
39–42. Reduction to Π
- Goal: want (1 − ε) L_G ≼ L_H ≼ (1 + ε) L_G.
- Sampling edges of G corresponds to sampling columns of Π (with rescaling).
- New goal: with S the diagonal matrix of sampling weights, it suffices to show ‖Π S Π − Π Π‖₂ ≤ ε.
43–47. The Algorithm, Algebraically
- Sample columns of Π with probability p_e proportional to their squared norms: ‖Π(·, e)‖² = Π(e, e) = R_e.
- If chosen, include the column in the sample with weight 1/p_e.
- Take q = O(n log n / ε²) samples with replacement.
- Divide all weights by q.
- cf. low-rank approximation by column sampling [FKV04, RV07]
48–50. A Concentration Result
- [Rudelson '99, Rudelson–Vershynin]: let y be a random vector with ‖y‖ ≤ M and ‖E[y y^T]‖₂ ≤ 1. For i.i.d. samples y₁, …, y_q,
  E ‖(1/q) Σᵢ yᵢ yᵢ^T − E[y y^T]‖₂ ≤ O(M √(log q / q)).
- With M² = n − 1 (as arranged below), q = O(n log n / ε²) samples make the right side at most ε/2.
- So with probability at least ½, ‖(1/q) Σᵢ yᵢ yᵢ^T − E[y y^T]‖₂ ≤ ε.
51–53. The Algorithm (recap)
- Sample edges of G with probability p_e proportional to c_e R_e; if chosen, include with weight c_e / p_e; take q = O(n log n / ε²) samples with replacement; divide all weights by q.
- Remaining issue: computing the effective resistances R_e quickly.
54–59. Nearly Linear Time
- R_uv = ‖B L^+ (χ_u − χ_v)‖², so we care about distances between the columns of B L^+ (written BL^{-1} on the slides).
- Johnson–Lindenstrauss! Take a random Q of size O(log n) × m and set Z = Q B L^+.
- Distances between the columns of Z (a (log n) × n matrix) approximate the effective resistances.
- Find the rows of Z from Z L = Q B, i.e. z_i L = (Q B)_i:
  solve O(log n) linear systems in L using the Spielman–Teng '04 solver, which itself uses the combinatorial O(n log^c n) sparsifier.
- Can show approximate R_e suffice.
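The JL step can be sketched end to end on the triangle example. A dense grounded solve stands in for the Spielman–Teng solver, and k = 200 projection rows is an illustrative (deliberately generous) choice; column differences of Z are gauge-invariant, so the grounding does not matter:

```python
# Approximate all R_uv as squared distances between columns of Z = Q B L^+,
# computing each row of Z by one Laplacian solve.
import random

random.seed(7)
edges = [(0, 1), (1, 2), (0, 2)]          # unit triangle; exact R_e = 2/3
n, m, k = 3, 3, 200

def solve_grounded(L, b):
    """Solve L w = b fixing w[-1] = 0 (small dense elimination)."""
    d = len(b) - 1
    M = [L[i][:d] + [b[i]] for i in range(d)]
    for c in range(d):
        p = max(range(c, d), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, d):
            f = M[r][c] / M[c][c]
            for j in range(c, d + 1):
                M[r][j] -= f * M[c][j]
    w = [0.0] * (d + 1)
    for r in range(d - 1, -1, -1):
        w[r] = (M[r][d] - sum(M[r][j] * w[j] for j in range(r + 1, d))) / M[r][r]
    return w

L = [[2.0, -1.0, -1.0], [-1.0, 2.0, -1.0], [-1.0, -1.0, 2.0]]
Q = [[random.choice([-1.0, 1.0]) / k ** 0.5 for _ in range(m)] for _ in range(k)]

# Each row of Z satisfies z_i L = (Q B)_i, i.e. L z_i^T = (Q B)_i^T by symmetry.
Z = []
for i in range(k):
    qb = [0.0] * n
    for e, (u, v) in enumerate(edges):     # (QB)_i = sum_e Q[i][e] * b_e
        qb[u] += Q[i][e]
        qb[v] -= Q[i][e]
    Z.append(solve_grounded(L, qb))

for (u, v) in edges:
    approx = sum((z[u] - z[v]) ** 2 for z in Z)
    assert abs(approx - 2.0 / 3.0) < 0.3   # close to the exact R_uv = 2/3
```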
60. Main Conjecture
- Sparsifiers with O(n) edges.
61. Example: Another Edge to Include
[Figure: a chain of two k-by-k complete bipartite graphs joined by an edge; the surviving labels m−1, 1, 0, m mark the original diagram's vertex values.]
62. The Projection Matrix
- Lemma. Π = B L^+ B^T satisfies:
  - Π² = Π (Π is a projection matrix)
  - im(Π) = im(B)
  - Tr(Π) = n − 1
  - Π(e, e) = ‖Π(e, ·)‖²
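The lemma can be verified numerically for the unit triangle by building Π = B L^+ B^T one column at a time from grounded Laplacian solves (a grounded solution differs from L^+ b_f only by a multiple of the all-ones vector, which b_e annihilates). A self-contained sketch checking three of the four claimed properties:

```python
# Build Pi for the unit triangle and check the projection-matrix lemma.

def solve_grounded(L, b):
    """Solve L w = b fixing w[-1] = 0 (small dense elimination)."""
    d = len(b) - 1
    M = [L[i][:d] + [b[i]] for i in range(d)]
    for c in range(d):
        p = max(range(c, d), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, d):
            f = M[r][c] / M[c][c]
            for j in range(c, d + 1):
                M[r][j] -= f * M[c][j]
    w = [0.0] * (d + 1)
    for r in range(d - 1, -1, -1):
        w[r] = (M[r][d] - sum(M[r][j] * w[j] for j in range(r + 1, d))) / M[r][r]
    return w

edges = [(0, 1), (1, 2), (0, 2)]
n, m = 3, 3
L = [[2.0, -1.0, -1.0], [-1.0, 2.0, -1.0], [-1.0, -1.0, 2.0]]
B = [[0.0] * n for _ in range(m)]
for e, (u, v) in enumerate(edges):
    B[e][u], B[e][v] = 1.0, -1.0

# Pi(e, f) = b_e^T L^+ b_f, via one grounded solve per column.
Pi = [[0.0] * m for _ in range(m)]
for f in range(m):
    x = solve_grounded(L, B[f])
    for e in range(m):
        Pi[e][f] = sum(B[e][j] * x[j] for j in range(n))

tol = 1e-9
# 1. Projection: Pi^2 = Pi
for i in range(m):
    for j in range(m):
        assert abs(sum(Pi[i][k] * Pi[k][j] for k in range(m)) - Pi[i][j]) < tol
# 2. Trace = n - 1
assert abs(sum(Pi[i][i] for i in range(m)) - (n - 1)) < tol
# 3. Pi(e, e) = ||Pi(e, .)||^2, which here equals R_e = 2/3
for e in range(m):
    assert abs(Pi[e][e] - sum(v * v for v in Pi[e])) < tol
    assert abs(Pi[e][e] - 2.0 / 3.0) < tol
```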
63–67. Last Steps
- Apply the concentration result to the random vector y = Π(·, e)/√p_e, so that E[y y^T] = Σ_e p_e · Π(·, e) Π(·, e)^T / p_e = Π² = Π.
- We also have ‖y‖² = Π(e, e)/p_e = n − 1 for every e,
- since p_e = R_e/(n − 1) and ‖Π(·, e)‖² = Π(e, e) = R_e.
68–76. Reduction to Π
- Write L_H = B^T S B, where S is the diagonal matrix of rescaled sampling counts; then x^T L_H x = ‖S^{1/2} B x‖² and x^T L_G x = ‖B x‖².
- For y = Bx ∈ im(B) we have Π y = y, so
  |x^T L_H x − x^T L_G x| = |y^T Π S Π y − y^T Π Π y| ≤ ‖Π S Π − Π Π‖₂ · ‖y‖².
- Hence ‖Π S Π − Π Π‖₂ ≤ ε gives (1 − ε) x^T L_G x ≤ x^T L_H x ≤ (1 + ε) x^T L_G x.
- Lemma. Π y = y for all y ∈ im(B).
- Proof. Π is the projection onto im(B).