On the Capacity of Information Networks - PowerPoint PPT Presentation

1
On the Capacity of Information Networks
  • Nick Harvey
  • Collaborators: Micah Adler (UMass), Kamal Jain
    (Microsoft), Bobby Kleinberg (MIT/Berkeley/Cornell),
    and April Lehman (MIT/Google)

2
What is the capacity of a network?
3
What is the capacity of a network?
  • Send items from s1→t1 and s2→t2
  • Problem: no disjoint paths

4
An Information Network
  • If we are sending information, we can do better
  • Send the XOR b1⊕b2 on the bottleneck edge
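To make the XOR trick concrete, here is a minimal Python sketch (one-bit messages; all names are my own, not from the talk): the bottleneck edge carries b1⊕b2, and each sink combines it with the bit it receives directly from the other source.

```python
def butterfly_decode(b1: int, b2: int) -> tuple[int, int]:
    """Decode at both sinks of the butterfly network."""
    bottleneck = b1 ^ b2       # the single shared edge carries the XOR
    at_t1 = bottleneck ^ b2    # t1 also hears b2 directly, recovers b1
    at_t2 = bottleneck ^ b1    # t2 also hears b1 directly, recovers b2
    return at_t1, at_t2

# Both demands are satisfied simultaneously at rate 1:
for b1 in (0, 1):
    for b2 in (0, 1):
        assert butterfly_decode(b1, b2) == (b1, b2)
```

With plain routing the shared edge can carry only one of the two bits at a time, which is why the flow rate is stuck at 1/2.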

5
Moral of Butterfly
Network Flow Capacity ≠ Information Flow Capacity
6
Network Coding
  • New approach for information flow problems
  • Blend of combinatorial optimization, information
    theory
  • Multicast, k-Pairs
  • k-pairs problems: network coding when each
    commodity has one sink
  • Analogous to multicommodity flow
  • Definitions for cyclic networks are subtle

7
  • Multicommodity Flow
  • Efficient algorithms for computing maximum
    concurrent (fractional) flow.
  • Connected with metric embeddings via LP duality.
  • Approximate max-flow min-cut theorems.
  • Network Coding
  • Computing the max concurrent coding rate may be
    undecidable; whether it is decidable, let alone
    decidable in poly-time, is open.
  • No adequate duality theory.
  • No cut-based parameter is known to give a sublinear
    approximation in digraphs.

Directed and undirected problems behave quite
differently
8
Directed k-pairs
  • Coding rate can be much larger than flow rate!
  • Butterfly:
  • Coding rate = 1
  • Flow rate = 1/2
  • Thm [HKL04, LL04]: ∃ graphs G=(V,E) where
    Coding Rate = Ω(flow rate · |V|)

  • Thm: ∃ graphs G=(V,E) where Coding Rate =
    Ω(flow rate · |E|)
  • And this is optimal
  • Recurse on butterfly construction

9
Directed k-pairs
  • Coding rate can be much larger than flow rate!
  • and much larger than the sparsity (same
    example)

Flow Rate ≤ Sparsity < Coding Rate
in some graphs
10
Undirected k-pairs
  • No known undirected instance where coding rate >
    max flow rate!
  • (The undirected k-pairs conjecture)

Flow Rate ≤ Coding Rate ≤ Sparsity
The gap can be Ω(log n) when G is an expander
11
Undirected k-Pairs Conjecture
Flow Rate ≤ Coding Rate ≤ Sparsity
Flow Rate = Coding Rate is the undirected k-pairs
conjecture; Coding Rate ≤ Sparsity was unknown
until this work
12
Okamura-Seymour Graph
(Diagram: a cut in the Okamura-Seymour graph.)
Every cut has enough capacity to carry all
commodities separated by the cut
13
Okamura-Seymour Max-Flow
Flow Rate = 3/4
si is 2 hops from ti. At flow rate r, each
commodity consumes ≥ 2r units of bandwidth in a
graph with only 6 units of capacity.
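The slide's counting, as arithmetic (a sanity check of my own, assuming the 4 commodities shown, each at least 2 hops apart, and 6 units of total edge capacity):

```python
# Each of the 4 commodities must travel >= 2 hops, so at concurrent
# rate r the flows consume >= 4 * 2r units of bandwidth in total,
# which cannot exceed the 6 units of edge capacity available.
commodities = 4
min_hops = 2
total_capacity = 6
max_rate = total_capacity / (commodities * min_hops)
assert max_rate == 0.75    # hence flow rate <= 3/4
```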
14
The trouble with information flow
  • If an edge codes multiple commodities, how to
    charge for consuming bandwidth?
  • We work around this obstacle and bound coding
    rate by 3/4.

At flow rate r, each commodity consumes at least
2r units of bandwidth in a graph with only 6
units of capacity.
15
Informational Dominance
  • Definition: A dominates e if, for every coding
    solution, the messages sent on edges of A
    uniquely determine the message sent on e.
  • Given A and e, how hard is it to determine
    whether A dominates e? Is it even decidable?
  • Theorem: There is an algorithm to compute whether
    A dominates e in time O(k²m).
  • Based on a combinatorial characterization of
    informational dominance

16
What can we prove?
  • Combine informational dominance with Shannon
    inequalities for entropy
  • Flow rate = coding rate for special bipartite
    graphs
  • Bipartite
  • Every source is 2 hops away from its sink
  • Dual of the flow LP is optimized by assigning
    length 1 to all edges
  • Next: show that proving the conjecture for all
    graphs is quite hard

17
k-pairs conjecture and I/O complexity
  • The I/O complexity model [AV88]:
  • A large, slow external memory consisting of pages,
    each containing p records
  • A fast internal memory that holds 2 pages
  • Basic I/O operation: read in two pages from
    external memory, write out one page

18
I/O Complexity of Matrix Transposition
  • Matrix transposition: given a p×p matrix of
    records in row-major order, write it out in
    column-major order.
  • Obvious algorithm requires O(p²) ops.
  • A better algorithm uses O(p log p) ops.
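The slides don't spell out the O(p log p) algorithm. One way to realize it, as a sketch (my own construction, assuming p is a power of two and a relaxed I/O op that reads two pages and writes two, rather than AV88's read-two/write-one): in each of log₂ p rounds, swap one bit of the page address with the corresponding bit of the in-page offset.

```python
def transpose_pages(matrix):
    """Transpose a p x p matrix (p a power of two) stored as p pages of
    p records each, using only whole-page reads and writes."""
    p = len(matrix)
    k = p.bit_length() - 1                  # k = log2(p) rounds
    assert 1 << k == p, "p must be a power of two"
    pages = [list(row) for row in matrix]   # page i initially holds row i
    io_ops = 0
    for b in range(k):                      # swap page-address bit b with offset bit b
        mask = 1 << b
        for lo in range(p):
            if lo & mask:
                continue                    # visit each page pair (lo, lo|mask) once
            hi = lo | mask
            a, c = pages[lo], pages[hi]     # "read two pages into fast memory"
            for j in range(p):
                if j & mask:                # records whose offset bit b disagrees
                    a[j], c[j ^ mask] = c[j ^ mask], a[j]
            io_ops += 4                     # two page reads + two page writes
    return pages, io_ops

p = 4
row_major = [[p * i + j for j in range(p)] for i in range(p)]
col_major, ops = transpose_pages(row_major)
assert col_major == [[row_major[i][j] for i in range(p)] for j in range(p)]
assert ops == 2 * p * (p.bit_length() - 1)  # 2p * log2(p) page I/Os
```

After all log₂ p rounds the page and offset bits are fully exchanged, so page j holds column j, at a cost of 2p·log₂ p page I/Os.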

19-22
I/O Complexity of Matrix Transposition
(Slides 19-22 repeat slide 18, animating the diagram
as sources s1-s4 and sinks t1-t4 are added.)
23
Matching Lower Bound
  • Theorem (Floyd '72, AV88): A matrix
    transposition algorithm using only read and write
    operations (no arithmetic on values) must perform
    Ω(p log p) I/O operations.

24
Ω(p log p) Lower Bound
  • Proof: Let Nij denote the number of ops in which
    record (i,j) is written. For all j,
  • Σi Nij ≥ p log p.
  • Hence
  • Σij Nij ≥ p² log p.
  • Each I/O writes only p records. QED.

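The division at the end of the proof, spelled out for a concrete p (my own sanity check; constants are suppressed the same way the slide does):

```python
import math

# Summing the per-column bound Sum_i N_ij >= p log p over all p columns
# gives Sum_ij N_ij >= p^2 log p total record-writes.  Since each I/O
# op writes at most p records, at least p log p ops are required.
p = 16
total_record_writes = p * p * math.log2(p)   # >= p^2 log p
min_ops = total_record_writes / p            # each op writes <= p records
assert min_ops == p * math.log2(p)           # = p log p = 64 for p = 16
```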
25
The k-pairs conjecture and I/O complexity
  • Definition: An oblivious algorithm is one whose
    pattern of read/write operations does not depend
    on the input.
  • Theorem: If there is an oblivious algorithm for
    matrix transposition using o(p log p) I/O ops,
    the undirected k-pairs conjecture is false.

26
The k-pairs conjecture and I/O complexity
  • Proof
  • Represent the algorithm with a diagram as before.
  • Assume WLOG that each node has only two outgoing
    edges.

27
The k-pairs conjecture and I/O complexity
  • Proof
  • Represent the algorithm with a diagram as before.
  • Assume WLOG that each node has only two outgoing
    edges.
  • Make all edges undirected, capacity p.
  • Create a commodity for each matrix entry.

28
The k-pairs conjecture and I/O complexity
  • Proof
  • The algorithm itself is a network code of rate 1.
  • Assuming the k-pairs conjecture, there is a flow
    of rate 1.
  • Σi,j d(si,tj) ≤ p·|E(G)|.
  • Arguing as before, the LHS is Ω(p² log p).
  • Hence |E(G)| = Ω(p log p).

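The same style of division, now bounding the number of edges (a concrete-p sanity check of my own, constants suppressed as on the slide):

```python
import math

# A rate-1 flow routes each of the p^2 commodities along a path of
# length d(s_i, t_j), and every edge has capacity p, so
#     Sum_{i,j} d(s_i, t_j) <= p * |E(G)|.
# The transposition counting makes the left side >= p^2 log p,
# which forces |E(G)| >= p log p.
p = 32
distance_sum_lower = p * p * math.log2(p)    # LHS >= p^2 log p
min_edges = distance_sum_lower / p           # divide by edge capacity p
assert min_edges == p * math.log2(p)         # |E(G)| >= p log p
```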
29
Other consequences for complexity
  • The undirected k-pairs conjecture implies:
  • An Ω(p log p) lower bound for matrix transposition
    in the cell-probe model.
  • Same proof.
  • An Ω(p² log p) lower bound for the running time of
    oblivious matrix transposition algorithms on a
    multi-tape Turing machine.
  • The I/O model can emulate multi-tape Turing
    machines with a factor-p speedup.

30
Distance arguments
  • A rate-1 flow solution implies Σi d(si,ti) ≤ |E|
  • By LP duality; directed or undirected
  • Does a rate-1 coding solution imply
    Σi d(si,ti) ≤ |E|?
  • Undirected graphs: this is essentially the
    k-pairs conjecture!
  • Directed graphs: this is completely false

31
Recursive construction
  • k commodities (si,ti)
  • Distance d(si,ti) = Ω(log k) ∀i
  • Only O(k) edges!

32
Recursive Construction
(Diagram: G(1), sources s1, s2 above, sinks t1, t2 below.)
2 commodities, 7 edges, distance 3
  • Equivalent to the drawing below with edge
    capacity 1

(Diagram: equivalent drawing of G(1), edge capacity 1.)
33
Recursive Construction
  • Start with two copies of G (1)

34
Recursive Construction
  • Replace middle edges with copy of G (1)

35
Recursive Construction
  • 4 commodities, 19 edges, Distance 5

36
Recursive Construction
  • Commodities: 2^n; |V| = O(2^n); |E| = O(2^n)
  • Distance: 2n+1

37
Summary
  • Directed instances:
  • Coding rate >> flow rate
  • Undirected instances:
  • Conjecture: Flow rate = Coding rate
  • Proof for special bipartite graphs
  • Tool: informational dominance
  • Proving the conjecture solves the Matrix
    Transposition Problem

38
Open Problems
  • Computing the network coding rate in DAGs
  • Recursively decidable?
  • How do you compute an o(n)-factor approximation?
  • Undirected k-pairs conjecture
  • Stronger complexity consequences?
  • Prove an Ω(log n) gap between sparsest cut and
    coding rate for some graphs
  • or, find a fast matrix transposition algorithm.

39
Backup Slides
40
Optimality
  • The graph G(n) proves: Thm [HKL05]: ∃ graphs
    G=(V,E) where NCR = Ω(flow rate · |E|)
  • G(n) is optimal: Thm [HKL05]: ∀ graphs
    G=(V,E), NCR / flow rate = O(min{|V|, |E|, k})

41
Informational Dominance
  • Def: A dominates B if the information in A
    determines the information in B in every network
    coding solution.

(Diagram: an example where A does not dominate B.)
42
Informational Dominance
  • Def: A dominates B if the information in A
    determines the information in B in every network
    coding solution.

(Diagram: an example where A dominates B.)
Sufficient condition: if there is no path from any
source to B, then A dominates B
43
Informational Dominance Example
  • Obviously flow rate = NCR = 1
  • How to prove it? Markovicity?
  • No two edges disconnect t1 and t2 from both
    sources!

44
Informational Dominance Example
(Diagram: cut A.)
Sufficient condition: if there is no path from any
source to B, then A dominates B
45
Informational Dominance Example
(Diagram: cut A.)
  • Our characterization implies that A dominates
    {t1, t2} ⇒ H(A) ≥ H(t1, t2)

46
Rate ¾ for Okamura-Seymour
(Slides 46-50: animation building the entropy-counting
argument on the Okamura-Seymour diagram.)
51
Rate ¾ for Okamura-Seymour
(Diagram: Okamura-Seymour graph.)
Rate ≤ 3/4:
3·H(source) + 6·H(undirected edge) ≥ 11·H(source)
⇒ 6·H(undirected edge) ≥ 8·H(source)
⇒ rate ≤ 6/8 = 3/4