
Church, Kolmogorov and von Neumann: Their Legacy Lives in Complexity



Presentation Transcript


  1. Church, Kolmogorov and von Neumann: Their Legacy Lives in Complexity Lance Fortnow, NEC Laboratories America

  2. 1903 – A Year to Remember

  3. 1903 – A Year to Remember

  4. 1903 – A Year to Remember Kolmogorov Church von Neumann

  5. Andrey Nikolaevich Kolmogorov • Born: April 25, 1903, Tambov, Russia • Died: Oct. 20, 1987

  6. Alonzo Church • Born: June 14, 1903, Washington, DC • Died: August 11, 1995

  7. John von Neumann • Born: Dec. 28, 1903, Budapest, Hungary • Died: Feb. 8, 1957

  8. Frank Plumpton Ramsey • Born: Feb. 22, 1903, Cambridge, England • Died: January 19, 1930 • Founder of Ramsey Theory

  9. Ramsey Theory

  10. Ramsey Theory

  11. Applications of Ramsey Theory • Logic • Concrete Complexity • Complexity Classes • Parallelism • Algorithms • Computational Geometry
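The flavor of Ramsey theory can be made concrete with an example that is not from the talk: the smallest nontrivial Ramsey number is R(3,3) = 6, meaning every 2-coloring of the edges of K6 contains a monochromatic triangle, while K5 admits a coloring with none. A brute-force check in Python (function names are my own):

```python
from itertools import combinations, product

def has_mono_triangle(n, coloring):
    """coloring maps each edge (i, j), i < j, to 0 or 1."""
    return any(
        coloring[(a, b)] == coloring[(b, c)] == coloring[(a, c)]
        for a, b, c in combinations(range(n), 3)
    )

def every_coloring_has_mono_triangle(n):
    """Check all 2-colorings of the edges of the complete graph K_n."""
    edges = list(combinations(range(n), 2))
    return all(
        has_mono_triangle(n, dict(zip(edges, colors)))
        for colors in product((0, 1), repeat=len(edges))
    )

# R(3,3) = 6: K5 admits a triangle-free 2-coloring, K6 does not.
print(every_coloring_has_mono_triangle(5))  # False
print(every_coloring_has_mono_triangle(6))  # True
```

The K6 check examines all 2^15 colorings, which is instantaneous; this exhaustive approach stops being feasible almost immediately, which is part of why larger Ramsey numbers are notoriously hard to determine.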

  12. John von Neumann • Quantum • Logic • Game Theory • Ergodic Theory • Hydrodynamics • Cellular Automata • Computers

  13. The Minimax Theorem (1928) • Every finite zero-sum two-person game has optimal mixed strategies. • Let A be the payoff matrix for a player.
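A minimal sketch of the theorem in the 2×2 case, derived from the standard indifference equations (the function name and closed form are my own; the formula assumes the game has no saddle point, so the denominator is nonzero):

```python
def solve_2x2_zero_sum(a, b, c, d):
    """Optimal mixed strategy for the row player of the zero-sum game
    with payoff matrix [[a, b], [c, d]], assuming no saddle point.
    Returns (p, value): play row 0 with probability p."""
    # p makes the column player indifferent: p*a + (1-p)*c = p*b + (1-p)*d
    denom = (a - b) + (d - c)
    p = (d - c) / denom
    value = (a * d - b * c) / denom
    return p, value

# Matching pennies: payoffs [[1, -1], [-1, 1]]
p, v = solve_2x2_zero_sum(1, -1, -1, 1)
print(p, v)  # 0.5 0.0
```

For matching pennies the unique optimal mixed strategy is the uniform one, and the value of the game is 0, as the minimax theorem guarantees for any finite zero-sum game.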

  14. The Yao Principle (Yao, 1977) • The worst-case expected running time of the best randomized algorithm equals the expected running time of the best deterministic algorithm against the worst-case distribution of inputs. • Invaluable for proving limitations of probabilistic algorithms.

  15. Making a Biased Coin Unbiased • Given a coin with an unknown bias p, how do we get an unbiased coin flip?

  16. Making a Biased Coin Unbiased • Given a coin with an unknown bias p, how do we get an unbiased coin flip? • Flip twice: HT gives HEADS, TH gives TAILS; on HH or TT, flip again.

  17. Making a Biased Coin Unbiased • Given a coin with an unknown bias p, how do we get an unbiased coin flip? • Flip twice: HT gives HEADS with probability p(1-p), TH gives TAILS with probability (1-p)p; on HH or TT, flip again.
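Von Neumann's trick is a few lines of Python (the simulation parameters below are my own illustration):

```python
import random

def von_neumann_bit(biased_coin):
    """Extract one unbiased bit from a coin of unknown bias p (0 < p < 1):
    flip twice; if the flips differ, output the first one, else try again.
    HT and TH each occur with probability p(1-p), so the output is fair."""
    while True:
        a, b = biased_coin(), biased_coin()
        if a != b:
            return a

rng = random.Random(0)
coin = lambda: 1 if rng.random() < 0.8 else 0   # a heavily biased coin
bits = [von_neumann_bit(coin) for _ in range(10_000)]
print(sum(bits) / len(bits))   # close to 0.5 despite the 0.8 bias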

  18. Weak Random Sources • Von Neumann’s coin flipping trick (1951) was the first to get true randomness from a weak random source. • Much research in TCS in 1980’s and 90’s to handle weaker dependent sources. • Led to development of extractors and connections to pseudorandom generators.

  19. Alonzo Church • Lambda Calculus • Church’s Theorem • No decision procedure for arithmetic. • Church-Turing Thesis • Everything that is computable is computable by the lambda calculus.

  20. The Lambda Calculus • Alonzo Church 1930’s • A simple way to define and manipulate functions. • Has full computational power. • Basis of functional programming languages like Lisp, Haskell, ML.

  21. Lambda Terms • x • xy • λx.xx • Function mapping x to xx • λxy.yx • Really λx.(λy.yx) • λxyz.yzx(λuv.vu)

  22. Basic Rules • α-conversion • λx.xx equivalent to λy.yy • β-reduction • λx.xx(z) equivalent to zz • Some rules for appropriate restrictions on name clashes • (λx.(λy.yx))y should not be the same as λy.yy

  23. Normal Forms • A λ-expression is in normal form if one cannot apply any β-reductions. • Church-Rosser Theorem (1936) • If a λ-expression M reduces to both A and B then there must be a C such that A reduces to C and B reduces to C. • If M reduces to A and B with A and B in normal form, then A = B.
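To make β-reduction and normal forms concrete, here is a minimal normal-order reducer sketched in Python. It is not from the talk, and it assumes all bound-variable names are distinct, so substitution can skip the capture-avoiding renaming that the name-clash rules above require in general.

```python
# Terms: ('var', name) | ('lam', name, body) | ('app', fun, arg)
# Sketch only: assumes all bound-variable names are distinct,
# so substitution never needs capture-avoiding renaming.

def subst(t, name, val):
    if t[0] == 'var':
        return val if t[1] == name else t
    if t[0] == 'app':
        return ('app', subst(t[1], name, val), subst(t[2], name, val))
    return t if t[1] == name else ('lam', t[1], subst(t[2], name, val))

def step(t):
    """One leftmost-outermost beta-reduction; None if t is in normal form."""
    if t[0] == 'app':
        f, a = t[1], t[2]
        if f[0] == 'lam':
            return subst(f[2], f[1], a)          # the beta step
        s = step(f)
        if s is not None:
            return ('app', s, a)
        s = step(a)
        return None if s is None else ('app', f, s)
    if t[0] == 'lam':
        b = step(t[2])
        return None if b is None else ('lam', t[1], b)
    return None

def normal_form(t, limit=10_000):
    """Reduce until no beta step applies (may not terminate in general,
    hence the step limit)."""
    while limit > 0 and (n := step(t)) is not None:
        t, limit = n, limit - 1
    return t

# (λx.λy.x) a b  reduces to  a
K = ('lam', 'x', ('lam', 'y', ('var', 'x')))
t = ('app', ('app', K, ('var', 'a')), ('var', 'b'))
print(normal_form(t))  # ('var', 'a')
```

The step limit reflects Church's result two slides below: whether a term has a normal form at all is undecidable, so a reducer cannot promise to halt.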

  24. Power of λ-Calculus • Church (1936) showed that it is impossible in the λ-calculus to decide whether a term M has a normal form. • Church's Thesis • Expressed as a definition • An effectively calculable function of the positive integers is a λ-definable function of the positive integers.

  25. Computational Power • Kleene-Church (1936) • Computing normal forms has power equivalent to the recursive functions and to Turing machines. • Church-Turing Thesis • Everything computable is computable by a Turing machine.

  26. Andrey Nikolaevich Kolmogorov • Measure Theory • Probability • Analysis • Intuitionistic Logic • Cohomology • Dynamical Systems • Hydrodynamics

  27. Kolmogorov Complexity • A way to measure the amount of information in a string by the size of the smallest program generating that string.

  28. Incompressibility Method • For all n there is an x, |x| = n, with K(x) ≥ n. • Such x are called random. • Used to prove lower bounds on various combinatorial and computational objects. • Assume no lower bound. • Choose random x. • Get contradiction by giving a short program for x.
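K(x) itself is uncomputable, but any compressor gives an upper bound on it, and the counting argument above guarantees that most strings are incompressible. A sketch using zlib as a stand-in compressor (this illustration is mine, not the talk's):

```python
import os
import zlib

def compressed_size(x: bytes) -> int:
    """An upper bound on K(x), up to the fixed cost of the decompressor."""
    return len(zlib.compress(x, 9))

structured = b'0' * 1000          # very low Kolmogorov complexity
random_str = os.urandom(1000)     # incompressible with high probability

print(compressed_size(structured))   # far below 1000
print(compressed_size(random_str))   # roughly 1000 or slightly more
```

No compressor can do better than the counting argument allows: fewer than 2^n programs are shorter than n bits, so most length-n strings cannot shrink under any fixed compressor.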

  29. Incompressibility Method • Ramsey Theory/Combinatorics • Oracles • Turing Machine Complexity • Number Theory • Circuit Complexity • Communication Complexity • Average-Case Lower Bounds

  30. Complexity Uses of K-Complexity • Li-Vitanyi ’92: For Universal Distributions Average Case = Worst Case • Instance Complexity • Universal Search • Time-Bounded Universal Distributions • Kolmogorov characterizations of computational complexity classes.

  31. Rest of This Talk • Measuring sizes of sets using Kolmogorov Complexity • Computational Depth to measure the amount of useful information in a string.

  32. Measuring Sizes of Sets • How can we use Kolmogorov complexity to measure the size of a set?

  33. Measuring Sizes of Sets • How can we use Kolmogorov complexity to measure the size of a set? (Figure: the set of all strings of length n.)

  34. Measuring Sizes of Sets • How can we use Kolmogorov complexity to measure the size of a set? (Figure: the subset An inside the set of all strings of length n.)

  35. Measuring Sizes of Sets • How can we use Kolmogorov complexity to measure the size of a set? • The string in An of highest Kolmogorov complexity tells us about |An|.

  36. Measuring Sizes of Sets • There must be a string x in An such that K(x) ≥ log |An|. • Simple counting argument: otherwise there are not enough programs for all the elements of An.

  37. Measuring Sizes of Sets • If A is computable, or even computably enumerable, then every string x in An has K(x) ≤ log |An| + O(log n). • Describe x by A, n, and the index of x in the enumeration of the strings of An.
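The index description on slide 37 can be sketched for a toy decidable set; binary palindromes are my own example here. Once A and n are known, an element of An is pinned down by an index of about log |An| bits:

```python
from itertools import product
from math import log2

def is_palindrome(s: str) -> bool:   # an example decidable set A
    return s == s[::-1]

n = 8
# An = the length-n strings in A, in lexicographic order.
An = [''.join(bits) for bits in product('01', repeat=n)
      if is_palindrome(''.join(bits))]

x = '01100110'
index = An.index(x)   # the "short description" of x, given A and n
print(len(An), log2(len(An)), index)  # 16 4.0 6
```

A length-8 palindrome is determined by its first 4 bits, so |An| = 16 and the index needs only 4 bits, far below the 8 bits a raw length-8 string would take.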

  38. Measuring Sizes of Sets • If A is computably enumerable, then combining the two bounds: log |An| ≤ max over x in An of K(x) ≤ log |An| + O(log n).

  39. Measuring Sizes of Sets in P • What if A is efficiently computable? • Do we have a clean way to characterize the size of A using time-bounded Kolmogorov complexity?

  40. Time-Bounded Complexity • Idea: A short description is only useful if we can reconstruct the string in a reasonable amount of time.

  41. Measuring Sizes of Sets in P • It is still the case that some element x in An has K^poly(x) ≥ log |An|. • Very possible that there are small A with x in A with K^poly(x) quite large.

  42. Measuring Sizes of Sets in P • It might be easier to recognize elements of A than to generate them.

  43. Distinguishing Complexity • Instead of generating the string, we just need to distinguish it from other strings.

  44. Measuring Sizes of Sets in P • Ideally we would like CD^poly(x) ≈ log |An| for every x in An. • True if P = NP. • Problem: need to distinguish all pairs of elements in An.

  45. Measuring Sizes of Sets in P • Intuitively we need about 2 log |An| bits to distinguish. • Buhrman-Laplante-Miltersen (2000) prove this lower bound in the black-box model.

  46. Measuring Sizes of Sets in P • Buhrman-Fortnow-Laplante (2002) show that for A in P, every x in An has CD^poly(x) ≤ 2 log |An| + O(log n). • We have a rough approximation of size.

  47. Measuring Sizes of Sets in P • Sipser 1983: Allowing randomness gives a cleaner connection. • Sipser used this and similar results to show how to simulate randomness by alternation.

  48. Useful Information • Simple strings convey a small amount of information. • 00000000000000000000000000000000 • Random strings have lots of information. • 00100011100010001010101011100010 • Random strings are not that useful because we can generate random strings easily.

  49. Logical Depth • Chaitin ’87/Bennett ’97 • Roughly the amount of time needed to produce a string x from a program p whose length is close to the length of the shortest program for x.

  50. Computational Depth • Antunes, Fortnow, Variyam and van Melkebeek 2001 • Use the difference of two Kolmogorov measures: Depth^t(x) = K^t(x) − K(x) • Closely related to the “randomness deficiency” notion of Levin (1984).
