1. Local Testability and Decodability of Sparse Linear Codes
Madhu Sudan (MIT)
Joint work with Tali Kaufman (IAS, MIT).
2. Local (Sublinear-time) Algorithmics
- Data is getting ever larger.
- We need algorithms that can infer global properties from local observations. This has led to:
  - Property testing, sublinear-time algorithms.
- Common themes:
  - Oracle access to the input, implicit output.
  - Answers of the form "the input is close to having the property".
3. Error-Correcting Codes
- Code
- Distance
  - between sequences
  - of the code
- Algorithmic problems:
  - Encode
  - Detect errors
  - Decode
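The formulas on this slide did not survive extraction; the following is a standard rendering of the definitions the bullets refer to (the symbols C, Σ, Δ are supplied here, not taken from the slide):

```latex
% A code of block length n over an alphabet \Sigma:
C \subseteq \Sigma^n
% (Hamming) distance between sequences x, y \in \Sigma^n:
\Delta(x, y) \;=\; |\{\, i : x_i \neq y_i \,\}|
% (Minimum) distance of the code:
\Delta(C) \;=\; \min_{x \neq y \in C} \Delta(x, y)
```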
4. Local Algorithmics in Coding
- Encoding cannot be performed locally:
  - A single-bit change in the input should alter a constant fraction of the output!
- Testing, decoding, and error-correcting can be performed locally. Furthermore:
  - They are very natural problems.
  - They have many applications in theory (PCPs, PIR, hardness amplification).
  - Lots of interesting effects are achievable.
5. Local Algorithmic Problems
- Common framework (standard definitions sketched below)
- Local Testing
- Local Self-Correction
- Local Decoding
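The formal definitions behind this framework were lost with the slide's formulas; a standard, hedged rendering follows (the query bound k, the distance parameter δ, and the 2/3 success probability are generic placeholders, not the slide's):

```latex
\begin{itemize}
  \item \emph{Local testing:} a $k$-query tester accepts every $w \in C$ and
        rejects any $w$ that is $\delta$-far from $C$ with probability $\Omega(\delta)$.
  \item \emph{Local self-correction:} given oracle access to a $w$ that is
        $\delta$-close to a codeword $c \in C$, and an index $i$, output $c_i$
        with probability at least $2/3$ using $k$ queries.
  \item \emph{Local decoding:} given oracle access to a $w$ that is $\delta$-close
        to an encoding $E(x)$, and an index $i$, output the message bit $x_i$
        with probability at least $2/3$ using $k$ queries.
\end{itemize}
```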
6. Example: Hadamard Codes
- Encoding
- Test
- Correction
- Decoding
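The slide's formulas for these four items are missing; below is a small Python sketch of the Hadamard code with the classical Blum-Luby-Rubinfeld-style local test, self-correction, and decoding. This is an illustration supplied here, not the slide's own material; the function names and trial counts are arbitrary.

```python
import random

def hadamard_encode(x):
    """Codeword of x in F_2^k: the inner product <x, a> for every a in F_2^k."""
    k = len(x)
    return [sum(xi & ((a >> i) & 1) for i, xi in enumerate(x)) % 2
            for a in range(2 ** k)]

def local_test(w, k, trials=30):
    """BLR linearity test: check w(a) + w(b) = w(a + b) for random a, b."""
    for _ in range(trials):
        a, b = random.randrange(2 ** k), random.randrange(2 ** k)
        if (w[a] + w[b]) % 2 != w[a ^ b]:      # a ^ b is addition in F_2^k
            return False
    return True

def self_correct(w, k, a, trials=15):
    """Recover position a of the nearest codeword by majority over random shifts."""
    votes = sum((w[a ^ b] + w[b]) % 2
                for b in (random.randrange(2 ** k) for _ in range(trials)))
    return int(votes > trials // 2)

def local_decode(w, k, i):
    """Message bit x_i is the codeword position indexed by the unit vector e_i."""
    return self_correct(w, k, 1 << i)

# Usage sketch: encode, corrupt one position, then test / decode locally.
x = [1, 0, 1]
w = hadamard_encode(x)
w[3] ^= 1
print(local_test(w, len(x)))               # may reject: w is no longer a codeword
print(local_decode(w, len(x), 0) == x[0])  # correct with high probability
```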
7. Brief History
- Local Decoding / Self-Correcting
  - Beaver-Feigenbaum, Lipton, Blum-Luby-Rubinfeld: instances of local decodability.
  - Katz-Trevisan: first definition.
- Locally Testable Codes
  - Blum-Luby-Rubinfeld, Babai-Fortnow-Lund: first instances.
  - Arora, Rubinfeld-Sudan, Spielman, Goldreich-Sudan: definitions.
8. Constructions of Locally X-able Codes
- Basic codes: algebraic in nature.
- Analysis:
  - Decoding is typically simple and uses the algebra.
  - Testing is more complex.
- Better codes: careful compositions of basic codes.
  - Exception: Meir '08 is not algebraic.
- Questions:
  - Do we need all this algebra and these careful constructions?
  - Can we derive local algorithms from classical parameters?
  - Can randomly chosen codes have local algorithms?
9. Our Results
- Theorem (informal): Every sparse linear code of large distance is locally testable and locally correctible.
- Linear?
- Sparse?
- Large distance?
10. Our Results (contd.)
- Linear?
- Sparse?
- Large distance?
- Balanced?
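The slide's answers to these four questions were lost along with the formulas. The following parameter choices are a hedged reconstruction of the standard conventions in this line of work; the constants t and γ are placeholders:

```latex
% Linear:  C is a subspace of \mathbb{F}_2^n, i.e., c, c' \in C \Rightarrow c + c' \in C.
% Sparse:  |C| \le n^{t} for a constant t (only polynomially many codewords).
% Large distance:  every nonzero c \in C has \mathrm{wt}(c) \ge (1/2 - n^{-\gamma})\, n.
% Balanced:  every nonzero c \in C has
%            (1/2 - n^{-\gamma})\, n \;\le\; \mathrm{wt}(c) \;\le\; (1/2 + n^{-\gamma})\, n.
```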
11. Corollaries
- Reproduce old results: Hadamard, dual-BCH.
- New codes:
  - Random sparse linear codes (decodable under any linear encoding).
  - dual-BCH variants.
- Nice closure properties (subcodes, addition of new coordinates, removal of a few coordinates).
12. Previously
- Kaufman-Litsyn: similar results and techniques. Main differences:
  - Required …
  - Worked only for balanced codes.
  - Only proved local testability, not correctibility.
13. Proof Techniques
- Modifying (simplifying? extending?) the proofs of Kaufman-Litsyn '05 (some ideas go back to Kiwi '95).
- Buzzwords: duality, MacWilliams identities, Krawtchouk polynomials, Johnson bounds.
14. Duality and Testing
- Dual of a code
- Canonical (only) test for membership in C
- Canonical self-corrector (both sketched below)
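A hedged Python sketch of the canonical test and self-corrector this slide refers to: both pick a random low-weight codeword of the dual and use it either to check orthogonality or to read one coordinate from the rest of its support. Representing the dual code as an explicit list `dual_words`, and the fixed weight `weight_k`, are simplifications made here.

```python
import random

def inner(u, v):
    """Inner product over F_2."""
    return sum(ui & vi for ui, vi in zip(u, v)) % 2

def canonical_test(w, dual_words, weight_k):
    """Accept iff w is orthogonal to a random weight-k dual codeword.
    Every codeword of C passes; the analysis must show that far words often fail."""
    candidates = [y for y in dual_words if sum(y) == weight_k]
    y = random.choice(candidates)          # assumes weight-k dual codewords exist
    return inner(w, y) == 0

def canonical_self_correct(w, dual_words, weight_k, i):
    """Guess coordinate i of the nearest codeword: pick a random weight-k dual
    codeword y with y_i = 1 and output the parity of w over the rest of y's support."""
    candidates = [y for y in dual_words if sum(y) == weight_k and y[i] == 1]
    y = random.choice(candidates)          # existence is what the slides establish
    return sum(w[j] for j in range(len(w)) if y[j] == 1 and j != i) % 2
```

For a codeword c and any dual codeword y with y_i = 1, orthogonality gives c_i as the parity of c over the other positions in y's support; so when w is close to c and the weight k is constant, the k - 1 queried positions are all uncorrupted with good probability.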
15. Questions
16. Path to Answers
- Need the weight distribution of some codes.
- Testing and correcting:
  - Testing: Kiwi, KL.
  - Correcting: new.
17. Dual Weight Distribution?
- MacWilliams identities: the weight distribution of the dual can be computed exactly from the weight distribution of the primal!
- We don't have the primal distribution exactly. Can coarse information suffice?
- Kiwi: manages to compute the primal information exactly.
- Kaufman-Litsyn: find out a lot about the primal distribution.
- Our hope: less precise information suffices.
18. MacWilliams Identities: Precise Form
- Krawtchouk polynomials
- Dual weight distribution
- Double summation! Many negative terms. Cancellations?
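The identity itself did not survive extraction. Its standard statement for a binary linear code C ⊆ F_2^n, writing A_i for the number of weight-i codewords of C and B_k for the number of weight-k codewords of the dual, is:

```latex
% Krawtchouk polynomial of degree k (binary case):
K_k(i) \;=\; \sum_{j=0}^{k} (-1)^{j} \binom{i}{j} \binom{n-i}{k-j}
% MacWilliams identity:
B_k \;=\; \frac{1}{|C|} \sum_{i=0}^{n} A_i \, K_k(i)
% Expanding K_k gives the double summation, with many negative terms, noted above.
```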
19. Primal Weight Distribution (Balanced)
20. Krawtchouk Polynomial (k odd)
21. Krawtchouk Polynomial (k odd)
22. Low-Weight Codewords in the Dual
- Can conclude that constant-weight dual codewords exist.
- Very tight bound.
- Leads to a self-corrector.
23. Analysis of Self-Corrector
- Need to understand
- New code
- Claim
- But
24. Analysis of Self-Corrector (contd.)
- Plugging in bounds
- Similar calculations with … yield …
- Conclude: the self-corrector computes … correctly w.p. … from an …-corrupted received word.
25. Analysis of Tester (Balanced Case)
- Need to analyze …
- Specifically, want …
- Easy fact (from the MacWilliams identities): …
- Suffices to analyze the second term. But what does the weight distribution of … look like, and how does … interact with it? (See the sketch below.)
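A hedged reconstruction of the quantity behind these bullets (notation supplied here): for a received word r, let C_r denote the code spanned by C and r. The canonical weight-k test rejects r exactly when it picks a y in the dual of C with ⟨r, y⟩ = 1, i.e. a y outside the dual of C_r, so

```latex
\Pr\bigl[\text{weight-}k\text{ test rejects } r\bigr]
  \;=\; 1 \;-\; \frac{B_k(C_r^{\perp})}{B_k(C^{\perp})}
```

where B_k(·) counts weight-k words. The MacWilliams identity turns B_k(C_r^⊥) into a statement about the weight distribution of C_r, which is why the next slides compare the weight distribution of C_r with that of C.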
26. Weight Distribution of Cr (vs. C)
27. Weight Distribution of Cr (vs. C)
28. Inner Product with Krawtchouks
29. Inner Product with Krawtchouks
- Helps! But by how much?
30. More Bounds
- Some weak Krawtchouk bounds
- Bound 2 is not sufficient to bound the "hurt", but it can be combined with the Johnson bound.
- Johnson bound (the helpful part): … (for i in our range; useful to limit the "hurt"). One standard form is sketched below.
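The slide's statement of the bound is missing. One standard binary Johnson-type bound of the kind used in such analyses, stated from memory with placeholder parameters ε and η (treat the exact constants as a sketch): if all pairwise distances in C are at least (1/2 - η)n and 2ε² > η, then for every word w,

```latex
\bigl|\{\, c \in C : \Delta(c, w) \le (1/2 - \varepsilon)\, n \,\}\bigr|
  \;\le\; \frac{1}{4\varepsilon^{2} - 2\eta}
```

i.e. only a bounded number of codewords can crowd near any single word, which is what limits the "hurt" term above.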
31. Putting All the Bounds Together
- Can conclude …
- Implies the test rejects an …-corrupted codeword with probability …
32. Unbalanced Codes?
- Many things break down
  - E.g., …
- Our approach:
  - Step 1
  - Step 2
33. Weakly Balanced Codes
- Can now prove …
- But can't get a precise bound on …
- Instead, we bound … directly:
  - Show that the contribution of any word to both terms is roughly the same (uses some properties of …).
  - Show that the contribution of the coset leader drops by an …-factor.
34. Reducing General Codes to Weakly Balanced Codes
- Write … where … is weakly balanced.
- Test if … such that …
- Yields a tester for all binary, linear, sparse, high-distance codes.
35. Conclusions / Questions
- Simpler proof for random codes by Shachar Lovett and Or Meir.
- Self-correct imbalanced codes?
- Are random sparse codes locally list-decodable?
- Is this just a logarithmic saving in locality?
- Are there other ways to pick broad classes of testable codes (at random)?