
Extractors: applications and constructions


Presentation Transcript


  1. Randomness Extractors: applications and constructions Avi Wigderson IAS, Princeton

  2. Extractors: original motivation. Applications (cryptography, probabilistic algorithms, game theory) are analyzed assuming perfect randomness: unbiased, independent bits. Reality offers only sources of imperfect randomness (stock market fluctuations, radioactive decay, sun spots): biased, dependent bits. Extractor Theory bridges this gap.

  3. Applications of Extractors
• Using weak random sources in probabilistic algorithms [B84, SV84, V85, VV85, CG85, V87, CW89, Z90-91]
• Randomness-efficient error reduction of probabilistic algorithms [Sip88, GZ97, MV99, STV99]
• Derandomization of space-bounded algorithms [NZ93, INW94, RR99, GW02]
• Distributed Algorithms [WZ95, Zuc97, RZ98, Ind02]
• Hardness of Approximation [Zuc93, Uma99, MU01]
• Cryptography [CDHKS00, MW00, Lu02, Vad03]
• Data Structures [Ta02]

  4. Unifying Role of Extractors
Extractors are intimately related to:
• Hash Functions [ILL89, SZ94, GW94]
• Expander Graphs [NZ93, WZ93, GW94, RVW00, TUZ01, CRVW02]
• Samplers [G97, Z97]
• Pseudorandom Generators [Trevisan 99, …]
• Error-Correcting Codes [T99, TZ01, TZS01, SU01, U02]
⇒ They unify the theory of pseudorandomness.

  5. Definitions

  6. Weak random sources
Distributions X on {0,1}^n with some entropy:
• [vN] sources: n coins of unknown fixed bias
• [SV] sources: Pr[X_{i+1} = 1 | X_1 = b_1, …, X_i = b_i] ∈ (δ, 1−δ)
• Bit fixing: n coins, some good, some "sticky"
• …
• [Z] k-sources: H_∞(X) ≥ k, i.e. ∀x Pr[X = x] ≤ 2^{−k}; e.g. X uniform on a support of size 2^k inside {0,1}^n
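
To make the last definition concrete, here is a minimal Python sketch (not from the slides) that computes min-entropy and checks the k-source condition for a finite distribution given as a dictionary of probabilities:

```python
import math

def min_entropy(dist):
    """Min-entropy H_inf(X) = -log2(max_x Pr[X = x]) of a finite distribution.

    `dist` maps outcomes to probabilities (assumed to sum to 1).
    """
    p_max = max(dist.values())
    return -math.log2(p_max)

def is_k_source(dist, k):
    """X is a k-source iff every outcome has probability at most 2^{-k}."""
    return all(p <= 2 ** (-k) for p in dist.values())

# Example: X uniform on a support of size 2^k inside {0,1}^n is exactly a k-source.
n, k = 8, 3
support = [format(i, f"0{n}b") for i in range(2 ** k)]   # 2^k distinct n-bit strings
X = {x: 1 / len(support) for x in support}
print(min_entropy(X))      # 3.0
print(is_k_source(X, k))   # True
```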

  7. Randomness Extractors (1st attempt). Want a single function Ext : {0,1}^n → {0,1}^m that turns any "weak" random source, i.e. any k-source X of length n (k can be e.g. n/2, √n, log n, …), into m almost-uniform bits. Impossible even if k = n−1 and m = 1: the partition of {0,1}^n into the sets Ext = 0 and Ext = 1 already defeats any fixed Ext (see the argument below).
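
The impossibility claim follows from a one-line counting argument (a sketch consistent with the slide's Ext = 0 / Ext = 1 picture):

\[
\forall\, \mathrm{Ext}\colon\{0,1\}^n \to \{0,1\}\ \ \exists\, b\in\{0,1\}:\ \ |\mathrm{Ext}^{-1}(b)| \ \ge\ 2^{\,n-1},
\]

so the uniform distribution on Ext^{-1}(b) is an (n−1)-source on which Ext is constantly b, i.e. as far from uniform as possible.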

  8. Extractors [Nisan & Zuckerman '93]. Add a (short) "seed" of d truly random bits:
• Ext : {0,1}^n × {0,1}^d → {0,1}^m
• X has min-entropy k (X is a k-source of length n)
• output: m almost-uniform bits, m ≤ k + d

  9. Extractors [Nisan & Zuckerman '93], the guarantee. For every k-source X of length n, |Ext(X, U_d) − U_m|_1 < ε, i.e. the m output bits Ext(X, y) for a random seed y ∈ {0,1}^d are ε-close to uniform on {0,1}^m. Moreover, for all but an ε-fraction of the seeds y, |Ext(X, y) − U_m|_1 < ε.

  10. Extractors as graphs. View a (k, ε)-extractor Ext : {0,1}^n × {0,1}^d → {0,1}^m as a bipartite graph between {0,1}^n and {0,1}^m, where each x is connected to the 2^d vertices Ext(x, y), y ∈ {0,1}^d. Discrepancy: for every B ⊆ {0,1}^m, for all but 2^k of the x ∈ {0,1}^n, | |Γ(x) ∩ B| / 2^d − |B| / 2^m | < ε. This one object captures sampling, hashing, amplification, coding, expanders, …

  11. Probabilistic algorithms with weak random bits. Feed a sample from a k-source of length n, together with a d-bit seed, into EXT; the m output bits (uniform up to ε) drive the probabilistic algorithm, whose error probability on its input is < δ. But where does the seed come from, and is this efficient? Eliminate the seed: try all 2^d seeds, run the algorithm on each output, and take the majority vote. Want: efficient Ext, small d and ε, large m.
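
A minimal sketch of the "try all seeds and take a majority vote" recipe; `algorithm` and `ext` are hypothetical stand-ins for the probabilistic algorithm and an explicit extractor (neither is specified on the slide):

```python
from collections import Counter

def run_with_weak_source(algorithm, ext, x, d, inp):
    """Eliminate the seed: run `algorithm` on Ext(x, y) for every d-bit seed y
    and return the majority answer.

    Hypothetical interfaces (not from the slides):
      algorithm(inp, random_bits) -> a decision (e.g. True/False)
      ext(x, y) -> m bits extracted from the weak sample x with seed y
    """
    votes = Counter()
    for y in range(2 ** d):                 # all 2^d seeds; poly(n) when d = O(log n)
        r = ext(x, y)                       # close to uniform for most seeds y
        votes[algorithm(inp, r)] += 1
    return votes.most_common(1)[0][0]       # majority vote over all seeds
```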

  12. Extractors - Parameters. (Setting: Ext maps a k-source of length n plus a short d-bit seed to m bits ε-close to uniform.)
• Goals: minimize d and ε, maximize m.
• Non-constructive & optimal [Sip88, NZ93, RT97]:
• Seed length d = log(n−k) + 2 log(1/ε) + O(1).
• Output length m = k + d − 2 log(1/ε) − O(1).

  13. Extractors - Parameters, the case ε = 0.01 and k ≥ n/2.
• Goals: minimize d, maximize m.
• Non-constructive & optimal [Sip88, NZ93, RT97]:
• Seed length d = log n + O(1).
• Output length m = k + d − O(1).

  14. Explicit Constructions
Non-constructive & optimal [Sip88, NZ93, RT97]: seed length d = log n + O(1), output length m = k + d − O(1).
A long line of explicit constructions: [… B86, SV86, CG87, NZ93, WZ93, GW94, SZ94, SSZ95, Zuc96, Ta96, Ta98, Tre99, RRV99a, RRV99b, ISW00, RSW00, RVW00, TUZ01, TZS01, SU01, LRVW03, …]
New explicit constructions [GUV07, DW08]:
• Seed length d = O(log n) [even for ε = 1/n]
• Output length m = .99k + d

  15. Applications

  16. Probabilistic algorithms with weak random bits, revisited. With an explicit extractor this is efficient: given the sample x from the k-source of length n, try all 2^d = poly(n) seeds, feed each m-bit output (uniform up to ε) to the probabilistic algorithm, and take the majority vote; the error probability stays < δ. The point: the error set B ⊆ {0,1}^m of the algorithm is sampled accurately with high probability over x.

  17. Extractors as samplers. For an n-bit string x define S(x) = { Ext(x,1), Ext(x,2), …, Ext(x,n^c) } ⊆ {0,1}^m, where 2^d = n^c and, say, k = 2m. Efficient! For every B ⊆ {0,1}^m, all but 2^k of the x ∈ {0,1}^n satisfy | |S(x) ∩ B| / n^c − |B| / 2^m | < ε. Note: x is bad with probability < 2^k / 2^n, for arbitrary n.
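
The same enumeration phrased as a sampler, again with hypothetical `ext` and membership test `in_B` (not specified on the slide):

```python
def estimate_density(ext, x, d, in_B):
    """Estimate |B|/2^m from the single weak sample x:
    the fraction of seeds y whose extracted point Ext(x, y) lands in B.

    Hypothetical interfaces:
      ext(x, y)  -> a point of {0,1}^m
      in_B(p)    -> True iff the point p belongs to the test set B
    """
    hits = sum(1 for y in range(2 ** d) if in_B(ext(x, y)))
    return hits / 2 ** d   # within epsilon of |B|/2^m for all but 2^k "bad" x
```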

  18. Extractors as list-decodable error-correcting codes [TZ]. Use an extractor with a single output bit and seed length d = c log n, so D = 2^d = n^c, and encode C : {0,1}^n → {0,1}^D by C(x) = Ext(x,1) Ext(x,2) … Ext(x,D), one bit per seed. Polynomial rate! Efficient encoding!! Efficient decoding? For z ∈ {0,1}^D let B_z ⊆ {0,1}^{d+1} be the set {(i, z_i) : i ∈ [D]}. List decoding: for every z, at most D^2 of the x have C(x) inside the Hamming ball of radius (1/2 − ε)D around z.
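
Why such a code is list-decodable (a sketch; it uses the discrepancy property of the neighborhoods Γ(x) = {(i, Ext(x, i)) : i ∈ [D]} ⊆ {0,1}^{d+1}, and the slide's bound D^2 corresponds to choosing 2^k ≤ D^2):

\[
\mathrm{dist}\big(C(x), z\big) < \Big(\tfrac12 - \varepsilon\Big) D
\;\iff\;
\frac{|\Gamma(x) \cap B_z|}{D} > \tfrac12 + \varepsilon,
\qquad\text{while}\qquad
\frac{|B_z|}{2^{d+1}} = \tfrac12,
\]

so every such x violates the discrepancy bound for the set B_z, and by the extractor property there are at most 2^k of them.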

  19. Beating eigenvalue expansion. Task: construct a graph on [N] of minimal degree DEG such that every two sets of size K are connected by an edge.
Any such graph: DEG > N/K
Ramanujan graphs: DEG < (N/K)^2
Random graphs: DEG < (N/K)^{1+o(1)}
Extractors: DEG < (N/K)^{1+o(1)}; even K linear in N with constant DEG [RVW]. We'll see it for "moderate" K [WZ].
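
The lower bound DEG > N/K on the slide comes from a counting argument (a sketch, up to the additive constant):

\[
|S| = K \ \Rightarrow\ \big|\,[N] \setminus (S \cup \Gamma(S))\,\big| < K
\ \Rightarrow\ K\cdot \mathrm{DEG} \ \ge\ |\Gamma(S)| \ >\ N - 2K
\ \Rightarrow\ \mathrm{DEG} \ >\ \tfrac{N}{K} - 2 ,
\]

since otherwise some K-set lying outside S ∪ Γ(S) would have no edge to S.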

  20. Extractors as graphs (again). Take a (k, .01)-extractor Ext : {0,1}^n × {0,1}^d → {0,1}^m, i.e. Ext : [N] × [D] → [M], with 2^k = K = M^{1+o(1)} and 2^d = D < M^{o(1)}. For every K-set X ⊆ [N], |Γ(X)| > .99M. Take G = Ext^2 on [N] (connect x, x' whenever they share a neighbor in [M]): then DEG < (N/K)^{1+o(1)}, and there are many edges between any two K-sets X, X'.

  21. Constructions

  22. Expanders as extractors: amplification. Let B_x ⊆ {0,1}^m be the set of random strings on which Alg errs on input x, so Pr[error] < 1/3. Run Alg on r_1, r_2, …, r_t and output the majority vote.
Thm [Chernoff]: if r_1, r_2, …, r_t are independent (t·m random bits), then Pr[error] = Pr[ |{r_1, r_2, …, r_t} ∩ B_x| > t/2 ] < exp(−t).
Thm [AKS]: the same holds when r_1, r_2, …, r_t is a random path in G, an explicit expander of constant degree on {0,1}^m (only m + O(t) random bits).
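
A minimal simulation of the [AKS] recipe, assuming access to an explicit constant-degree expander through a hypothetical `neighbors` function (the explicit construction is not given on the slide):

```python
import random
from collections import Counter

def random_walk_majority(alg, inp, neighbors, start, t):
    """Amplification via an expander walk: run `alg` at every vertex of a
    random t-step path and return the majority answer.

    Hypothetical interfaces (not from the slides):
      alg(inp, r)   -> a decision, correct with probability > 2/3 over uniform r
      neighbors(r)  -> the constant-size neighbor list of vertex r in the
                       expander G on {0,1}^m (assumed explicit)
      start         -> a uniformly random vertex (m truly random bits)

    Randomness used: m bits for `start` plus O(1) bits per step, i.e. m + O(t),
    versus t*m bits for t independent runs.
    """
    votes = Counter()
    r = start
    for _ in range(t):
        votes[alg(inp, r)] += 1
        r = random.choice(neighbors(r))   # one step of the walk: O(1) random bits
    return votes.most_common(1)[0][0]
```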

  23. Expanders as extractors (k large). Let G be an expander graph of constant degree on {0,1}^m, B any subset with δ = |B|/2^m, and S = {r_1, r_2, …, r_t} a random G-path (described by n = m + O(t) bits).
Thm [G]: Pr[ | δ − |S ∩ B|/t | > ε ] < exp(−ε^2 t).
Thm [Z]: with t = cm = 2^d, the map Ext : {0,1}^n × {0,1}^d → {0,1}^m, Ext(r_1 r_2 … r_t ; i) = r_i, is a (k = .99n, ε)-extractor with seed length d = O(log n).
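
Spelling out the seed accounting behind [Z]'s statement (with t = cm and deg(G) = O(1), as on the slide):

\[
\mathrm{Ext}\colon \{0,1\}^n \times \{0,1\}^d \to \{0,1\}^m,
\qquad
\mathrm{Ext}(r_1 r_2 \cdots r_t \,;\, i) = r_i ,
\]

where the n = m + O(t) = O(m) input bits describe the walk r_1 r_2 … r_t in G and the d = log t = O(log n) seed bits pick one step of it; the sampling bound of [G] is what underlies the extractor property.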

  24. Condensers [RR99, RSW00, TUZ01]. Con takes a k-source X of length n plus a d-bit seed and outputs a .99k-source of length k. It is sufficient to construct such condensers: the output has entropy rate .99, so from here we can use the [Z] extractor.
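
In symbols, the object asked for on the slide (a paraphrase; "is a .99k-source" should be read as "is ε-close to a .99k-source"):

\[
\mathrm{Con}\colon \{0,1\}^n \times \{0,1\}^d \to \{0,1\}^{k},
\qquad
H_\infty(X) \ge k
\ \Longrightarrow\
\mathrm{Con}(X, U_d) \ \text{is } \varepsilon\text{-close to some } Y \text{ with } H_\infty(Y) \ge .99k .
\]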

  25. Mergers [T96]. The input X = X_1 X_2 … X_s consists of s blocks of k bits each (n = ks). Some block X_i is random; the other X_j are correlated arbitrarily with it. Given a d-bit seed, Mer outputs a high-entropy distribution: a .9k-source of length k.

  26. Mergers [T96]: constructions. View each block as X_i ∈ F_q^k with q ~ n^100; some X_i is random, and Mer should output a .9k-source.
[LRVW]: Mer = a_1 X_1 + a_2 X_2 + … + a_s X_s with random a_i ∈ F_q (seed d = s log q); Mer is a random element of the subspace spanned by the X_i's.
[D]: It works! (proof of the Wolf conjecture).
[DW]: Mer = a_1(y) X_1 + a_2(y) X_2 + … + a_s(y) X_s with a random y ∈ F_q (seed d = log q); Mer is a random element of the curve through the X_i's.
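
One standard way to realize "the curve through the X_i's" is Lagrange interpolation through fixed distinct points γ_1, …, γ_s ∈ F_q (an assumption, since the slide does not spell the coefficients a_i(y) out):

\[
a_i(y) \;=\; \prod_{j \ne i} \frac{y - \gamma_j}{\gamma_i - \gamma_j},
\qquad
\mathrm{Mer}(X; y) \;=\; \sum_{i=1}^{s} a_i(y)\, X_i \ \in\ \mathbb{F}_q^{\,k},
\]

a curve of degree s − 1 that passes through every block: Mer(X; γ_i) = X_i.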

  27. The proof (sketch). The curve C(x) through the blocks x_1, …, x_s has degree s − 1 in (F_q)^k, and Mer(x) is a random point on it. Assume towards contradiction that E[ |C(x) ∩ B| ] > 2ε and B is small. Then Pr_x[ |C(x) ∩ B| > ε ] > ε. Since B is small, there is a nonzero low-degree polynomial Q : (F_q)^k → F_q with Q(B) ≡ 0. Hence Pr_x[ Q vanishes on C(x) ] > ε, and since each curve C(x) passes through the random block x_i, Pr[ Q(x_i) = 0 ] > ε, which forces Q ≡ 0. Contradiction. #

  28. Open Problems
Find explicit extractors with
• Seed length d = log n + O(1).
• Output length m = k + d − O(1).
Find an explicit bipartite graph of constant degree between [N^3] and [N^2] such that every set X of size N satisfies |Γ(X)| ≥ N.

  29. Extractors as samplers (summary). Given any B ⊆ {0,1}^m, to estimate |B|/2^m: take a sample x from a k-source of length n, try all 2^d = poly(n) seeds, and count the fraction of the outputs Ext(x, ·) that fall in B. Efficient! With high probability over x the estimation error is < ε.
