TY - GEN

T1 - Scrambling adversarial errors using few random bits, optimal information reconciliation, and better private codes

AU - Smith, Adam

PY - 2007/1/1

Y1 - 2007/1/1

N2 - Communicating over a noisy channel is typically much easier when errors are drawn from a fixed, known distribution than when they are chosen adversarially. This paper looks at how one can use schemes designed for random errors in an adversarial context, at the cost of a few additional random bits and without relying on unproven computational assumptions. The basic approach is to permute the positions of a bit string using a permutation drawn from a t-wise independent family, where t = o(n). This leads to several new results: • We show that concatenated codes can correct errors up to the Shannon capacity even when the errors are only slightly random: it is sufficient that they be t-wise independently distributed, for t roughly ω(log n). • We construct computationally efficient information reconciliation protocols correcting pn adversarial binary Hamming errors with optimal communication complexity and entropy loss n(h(p)+o(1)) bits, where n is the length of the strings and h(·) is the binary entropy function. Information reconciliation protocols allow cooperating parties to correct errors in a shared string. They are important tools in two applications: first, for dealing with noisy secrets in cryptography; second, for synchronizing remote copies of large files. Entropy loss measures how much information is leaked to an eavesdropper listening in on the protocol. • We improve the randomness complexity (key length) of efficiently decodable capacity-approaching private codes from Θ(n log n) to n + o(n). We also present a simplified proof of an existential result on private codes due to Langberg (FOCS '04).

UR - http://www.scopus.com/inward/record.url?scp=84969141630&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84969141630&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:84969141630

T3 - Proceedings of the Annual ACM-SIAM Symposium on Discrete Algorithms

SP - 395

EP - 404

BT - Proceedings of the 18th Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2007

PB - Association for Computing Machinery

T2 - 18th Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2007

Y2 - 7 January 2007 through 9 January 2007

ER -