
Information Complexity: an Overview


Presentation Transcript


  1. Information Complexity: an Overview. Rotem Oshman, Princeton CCI. Based on work by Braverman, Barak, Chen, Rao, and others. Charles River Science of Information Day 2014

  2. Classical Information Theory • Shannon ‘48, A Mathematical Theory of Communication: to transmit a message X drawn from a known distribution, roughly H(X) bits per message are necessary and sufficient in the limit of many messages.

  3. Motivation: Communication Complexity • Alice holds an input x and Bob holds an input y; how many bits must they exchange to compute f(x, y)? • Yao ‘79, “Some complexity questions related to distributive computing”

  4. Motivation: Communication Complexity • More generally: the two players want to solve some joint task on their inputs (x, y) • Yao ‘79, “Some complexity questions related to distributive computing”

  5. Motivation: Communication Complexity • Applications: • Circuit complexity • Streaming algorithms • Data structures • Distributed computing • Property testing • …

  6. Example: Streaming Lower Bounds • Streaming algorithm: reads the data in a single pass, keeping only a small state • How much space is required to approximate f(data)? • Lower bounds via reduction from communication complexity [AMS ’99]

  7. Example: Streaming Lower Bounds • Reduction from communication complexity [Alon, Matias, Szegedy ’99]: split the data stream between Alice and Bob; Alice runs the streaming algorithm on her half and sends the state of the algorithm to Bob, who finishes the run on his half • So a small-space algorithm yields a low-communication protocol, and a communication lower bound gives a space lower bound
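A minimal sketch of this reduction (my own illustration; the streaming algorithm, the names, and the disjointness-to-distinct-elements trick are chosen only to make the idea runnable, they are not from the slides):

```python
# Any s-bit streaming algorithm yields a one-way protocol with s bits of
# communication: Alice runs it on her half of the stream and sends its state.
from dataclasses import dataclass, field

@dataclass
class ExactDistinctCount:
    """Naive streaming algorithm: its state is the whole set of items seen,
    which is exactly why exact distinct counting needs a lot of space."""
    state: set = field(default_factory=set)

    def update(self, item):
        self.state.add(item)

    def query(self):
        return len(self.state)

def one_way_protocol(alice_items, bob_items):
    # Alice runs the streaming algorithm on her half of the stream...
    alg = ExactDistinctCount()
    for x in alice_items:
        alg.update(x)
    message = alg.state                        # ...and sends its state to Bob.
    # Bob resumes the algorithm from Alice's state on his half of the stream.
    alg2 = ExactDistinctCount(state=set(message))
    for y in bob_items:
        alg2.update(y)
    return alg2.query()

# Set disjointness reduces to distinct-element counting: the sets intersect
# iff the number of distinct items is strictly less than |A| + |B|.  So a
# communication lower bound for disjointness bounds the state size from below.
A, B = {1, 3, 5}, {2, 3, 6}
print(one_way_protocol(A, B) < len(A) + len(B))   # True: the sets intersect
```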

  8. Advances in Communication Complexity • Very successful in proving unconditional lower bounds, e.g., • for set disjointness [KS ’92, Razborov ‘92] • for Gap Hamming Distance [Chakrabarti, Regev ‘10] • But stuck on some hard questions • Multi-party communication complexity • Karchmer-Wigderson games • [Chakrabarti, Shi, Wirth, Yao ’01], [Bar-Yossef, Jayram, Kumar, Sivakumar ‘04]: use tools from information theory

  9. Extending Information Theory to Interactive Computation • One-way communication: • Task: send X across the channel • Cost: number of bits transmitted • Shannon: H(X) bits per instance, in the limit over many instances • Huffman: at most H(X)+1 bits for one instance • Interactive computation: • Task: e.g., compute f(X, Y) • Cost?
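A quick numeric check of the two one-way baselines on this slide (my own illustration; the distribution is arbitrary):

```python
# Huffman coding sends a single sample of X with expected length between
# H(X) and H(X)+1, while Shannon's H(X) bits per instance is reached only
# in the limit over many instances.
import heapq
from math import log2

def huffman_lengths(pmf):
    """Return {symbol: codeword length} for a Huffman code of the given pmf."""
    heap = [(p, i, {x: 0}) for i, (x, p) in enumerate(pmf.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)
        p2, _, d2 = heapq.heappop(heap)
        merged = {x: depth + 1 for x, depth in {**d1, **d2}.items()}
        tiebreak += 1
        heapq.heappush(heap, (p1 + p2, tiebreak, merged))
    return heap[0][2]

pmf = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
lengths = huffman_lengths(pmf)
expected = sum(p * lengths[x] for x, p in pmf.items())
H = -sum(p * log2(p) for p in pmf.values())
print(expected, H)          # 1.75 1.75 for this (dyadic) distribution
assert H <= expected < H + 1
```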

  10. Information Cost • Reminder: mutual information I(X;Y) = H(X) − H(X|Y) = H(Y) − H(Y|X) • Conditional mutual information: I(X;Y|Z) = H(X|Z) − H(X|YZ) • Basic properties: • I(X;Y) ≥ 0 • and I(X;Y) ≤ min(H(X), H(Y)) • Chain rule: I(XY;Z) = I(X;Z) + I(Y;Z|X)
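A small numerical sanity check of these identities (my own illustration; the joint distribution is just a random one over three bits):

```python
from itertools import product
from math import log2
import random

random.seed(0)
triples = list(product([0, 1], repeat=3))              # outcomes (x, y, z)
weights = [random.random() for _ in triples]
total = sum(weights)
p = {t: w / total for t, w in zip(triples, weights)}   # joint pmf of (X, Y, Z)

def H(idx):
    """Entropy (in bits) of the marginal on the given coordinate indices."""
    marginal = {}
    for outcome, pr in p.items():
        key = tuple(outcome[i] for i in idx)
        marginal[key] = marginal.get(key, 0.0) + pr
    return -sum(pr * log2(pr) for pr in marginal.values() if pr > 0)

def I(A, B, C=()):
    """Conditional mutual information I(A;B|C) via entropies."""
    return H(A + C) + H(B + C) - H(A + B + C) - H(C)

X, Y, Z = (0,), (1,), (2,)
assert I(X, Y) >= -1e-12 and I(X, Y, C=Z) >= -1e-12          # nonnegativity
assert abs(I(X + Y, Z) - (I(X, Z) + I(Y, Z, C=X))) < 1e-9    # chain rule
print("I(X;Y) =", round(I(X, Y), 4), "- chain rule verified")
```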

  11. Information Cost • Fix a protocol π • Notation abuse: let π also denote the transcript of the protocol • Two ways to measure information cost: • External information cost: IC_ext(π) = I(XY; π), what an outside observer learns about the inputs • Internal information cost: IC_int(π) = I(X; π | Y) + I(Y; π | X), what each player learns about the other's input • Cost of a task: infimum over all protocols that solve it • Which cost is “the right one”?

  12. Information Cost: Basic Properties External information: IC_ext(π) = I(XY; π). Internal information: IC_int(π) = I(X; π | Y) + I(Y; π | X). • Internal ≤ external • Can be much smaller, e.g.: • X = Y uniform over {0,1}^n • Alice sends X to Bob: external cost n, internal cost 0 • But the two are equal if X and Y are independent
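As a worked instance of the gap (my own calculation, using the definitions on slide 11, for the single-bit case n = 1, with Π denoting the transcript):

```latex
% X = Y is a uniform bit and the transcript is Alice's single message \Pi = X.
\begin{align*}
  \mathrm{IC}^{\mathrm{ext}}(\pi) &= I(XY;\Pi) = I(XY;X) = H(X) = 1, \\
  \mathrm{IC}^{\mathrm{int}}(\pi) &= I(X;\Pi \mid Y) + I(Y;\Pi \mid X)
      = I(X;X \mid Y) + I(Y;X \mid X) = 0 + 0 = 0,
\end{align*}
% since X is determined by Y (so Bob learns nothing new from the message) and
% Alice never learns anything from her own message.  With X = Y uniform over
% \{0,1\}^n the same protocol has external cost n and internal cost 0.
```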

  13. Information Cost: Basic Properties External information: IC_ext(π) = I(XY; π). Internal information: IC_int(π) = I(X; π | Y) + I(Y; π | X). • External information ≤ communication: I(XY; π) ≤ H(π) ≤ |π|

  14. Information Cost: Basic Properties • Internal information ≤ communication cost: I(X; π | Y) + I(Y; π | X) ≤ |π| • By induction on the number of rounds: let π_{≤r} denote the first r messages • I(X; π_{≤r} | Y) + I(Y; π_{≤r} | X): what we know after r rounds = what we knew after r−1 rounds (at most |π_{≤r−1}| by the I.H.) + what we learn in round r, given what we already know

  15. Information vs. Communication • Want: I(X; π_r | Y π_{<r}) + I(Y; π_r | X π_{<r}) ≤ |π_r| • Suppose π_r is sent by Alice. • What does Alice learn? π_r is a function of X and π_{<r}, so I(Y; π_r | X π_{<r}) = 0: nothing. • What does Bob learn? At most I(X; π_r | Y π_{<r}) ≤ H(π_r) ≤ |π_r| bits.
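Slides 14-15 combined into one display (a standard calculation written in my notation, for a deterministic protocol; private randomness is handled by conditioning on it, and Π_{≤r} denotes the first r messages, with message r sent by Alice):

```latex
\begin{align*}
  I(X;\Pi_{\le r}\mid Y) + I(Y;\Pi_{\le r}\mid X)
  &= \underbrace{I(X;\Pi_{\le r-1}\mid Y) + I(Y;\Pi_{\le r-1}\mid X)}_{\le\,|\Pi_{\le r-1}|\ \text{by the induction hypothesis}} \\
  &\quad + \underbrace{I(X;\Pi_r\mid Y\,\Pi_{\le r-1})}_{\text{Bob learns}\;\le\;H(\Pi_r)\;\le\;|\Pi_r|}
          + \underbrace{I(Y;\Pi_r\mid X\,\Pi_{\le r-1})}_{=\,0,\ \text{since }\Pi_r\text{ is a function of }(X,\Pi_{\le r-1})}
  \;\le\; |\Pi_{\le r}|.
\end{align*}
```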

  16. Information vs. Communication • We have: Internal information ≤ communication External information ≤ communication Internal information ≤ external information

  17. Information vs. Communication • “Information cost = communication cost”? • In the limit over many copies: yes, for internal information! [Braverman, Rao ‘10] • For one instance, up to logarithmic factors: external information! [Barak, Braverman, Chen, Rao ‘10] Big question: can protocols be compressed down to their internal information cost? • [Ganor, Kol, Raz ’14]: no! • There is a task with internal IC = O(k) but CC ≥ 2^{Ω(k)}. … but the question remains open for functions with small output.

  18. Information vs. Amortized Communication • Theorem [Braverman, Rao ‘10]: the amortized communication of f equals its internal information cost: lim_{n→∞} CC(f^n, ε)/n = IC_int(f, ε) • The “≤” direction: compression • The “≥” direction: direct sum • We know: IC_int(f^n) ≤ CC(f^n) • We can show: IC_int(f^n) ≥ n · IC_int(f)
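The “≥” direction then follows by chaining the two facts above (my rendering of the argument):

```latex
\[
  \mathrm{CC}(f^{\,n},\varepsilon)
    \;\ge\; \mathrm{IC}^{\mathrm{int}}(f^{\,n},\varepsilon)
    \;\ge\; n \cdot \mathrm{IC}^{\mathrm{int}}(f,\varepsilon)
  \qquad\Longrightarrow\qquad
  \frac{\mathrm{CC}(f^{\,n},\varepsilon)}{n} \;\ge\; \mathrm{IC}^{\mathrm{int}}(f,\varepsilon),
\]
% and the matching "\(\le\)" direction compresses an optimal protocol for one
% copy down to (roughly) its information cost and runs it on each of the n copies.
```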

  19. Direct Sum Theorem [BR‘10] • Let π be a protocol for f^n on n-copy inputs • Construct π' for one copy of f as follows: • Alice and Bob get inputs x, y • Choose a random coordinate i ∈ [n], set (x_i, y_i) = (x, y) • Bad idea: publicly sample all the other coordinates

  20. Direct Sum Theorem [BR‘10] • Let π be a protocol for f^n on n-copy inputs • Construct π' for one copy of f as follows: • Alice and Bob get inputs x, y • Choose a random coordinate i ∈ [n], set (x_i, y_i) = (x, y) • Bad idea: publicly sample all the other coordinates Suppose in π, Alice sends x_1 ⊕ ⋯ ⊕ x_n. In π, Bob learns one bit about the n-copy input, so in π' he should learn only 1/n of a bit about x_i. But if x_{−i} is public, Bob learns a full bit about x_i!

  21. Direct Sum Theorem [BR‘10] • Let π be a protocol for f^n on n-copy inputs • Construct π' for one copy of f as follows: • Alice and Bob get inputs x, y • Choose a random coordinate i ∈ [n], set (x_i, y_i) = (x, y) • For j < i: publicly sample x_j, and let Bob privately sample y_j • For j > i: let Alice privately sample x_j, and publicly sample y_j
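A toy sketch of this embedding (my own illustration, written as a single centralized function for readability; in the real protocol each party fills in only its own side, and the input distribution here is assumed to be i.i.d. uniform bits per coordinate):

```python
import random

def embed(x, y, n, shared_seed):
    """Embed a single instance (x, y) into n-copy inputs (X, Y) as on this slide."""
    pub = random.Random(shared_seed)   # public randomness, shared by both parties
    alice_priv = random.Random()       # Alice's private randomness
    bob_priv = random.Random()         # Bob's private randomness

    i = pub.randrange(n)               # public random coordinate
    X, Y = [None] * n, [None] * n
    X[i], Y[i] = x, y                  # the real instance goes into slot i
    for j in range(i):                 # j < i: X_j public, Y_j private to Bob
        X[j] = pub.getrandbits(1)
        Y[j] = bob_priv.getrandbits(1)
    for j in range(i + 1, n):          # j > i: Y_j public, X_j private to Alice
        Y[j] = pub.getrandbits(1)
        X[j] = alice_priv.getrandbits(1)
    return i, X, Y                     # run the n-copy protocol on (X, Y), read off coordinate i

print(embed(x=1, y=0, n=8, shared_seed=42))
```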

  22. Compression • What we know: a protocol with communication C, internal info I and external info I_ext can be compressed to • O(√(C·I) · log C) [BBCR’10] • O(I_ext · log C) [BBCR’10] • 2^{O(I)} [Braverman’10] • Major open question: can we compress to O(I)? [GKR, partial answer: no]

  23. Using Information Complexity to Prove Communication Lower Bounds • Internal/external info ≤ communication, so information lower bounds give communication lower bounds • Essentially the most powerful technique known: [Kerenidis, Laplante, Lerays, Roland, Xiao ’12] show that most lower bound techniques imply IC lower bounds • Disadvantage: hard to show incompressibility! • Must exhibit a problem with low IC, high CC • But proving high CC usually proves high IC…

  24. Extending IC to Multiple Players • Recent interest in multi-player number-in-hand communication complexity • Motivated by “big data”: • Streaming and sketching, e.g., [Woodruff, Zhang ‘11,’12,’13] • Distributed learning, e.g., [Awasthi, Balcan, Long ‘14]

  25. Extending IC to Multiple Players • Multi-player computation traditionally hard to analyze • [Braverman, Ellen, Oshman, Pitassi, Vaikuntanathan]: a tight Ω(nk) lower bound for Set Disjointness with n elements, k players, private channels, number-in-hand (NIH) input

  26. Information Complexity on Private Channels • First obstacle: secure multi-party computation • [Goldreich, Micali, Wigderson ’87]: any function can be computed so that, information-theoretically, no player learns anything beyond the output • So a naive per-player information cost can be made negligible • Solution: redefine information cost, measure both • Information a player learns, and • Information a player leaks to all the others.

  27. Extending IC to Multiple Players • Set disjointness: • Input: each player i holds a set S_i ⊆ [n] (number-in-hand) • Output: whether the sets have a common element • Open problem: can we extend to gap set disjointness? • First step: a “purely info-theoretic” 2-party analysis
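For concreteness, the function itself in the number-in-hand model (a reference computation only, not a communication protocol; the output convention below is mine):

```python
def multiparty_disjointness(sets):
    """Each of the k players holds one set over a universe [n];
    output 1 iff the sets share a common element."""
    return 1 if set.intersection(*map(set, sets)) else 0

print(multiparty_disjointness([{1, 2, 7}, {2, 5}, {2, 9}]))   # 1: element 2 is common
print(multiparty_disjointness([{1, 2}, {3, 4}, {5}]))         # 0: no common element
```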

  28. Extending IC to Multiple Players • In [Braverman, Ellen, Oshman, Pitassi, Vaikuntanathan] we show a direct sum theorem for the multi-party setting • Solving n instances costs n times solving one instance • Does a direct sum hold “across players”? • Is solving with k players as hard as solving with 2 players? • Not always • Does compression work for multi-party protocols?

  29. Conclusion • Information complexity extends classical information theory to the interactive setting • Picture is much less well-understood • Powerful tool for lower bounds • Fascinating open problems: • Compression • Information complexity for multi-player computation, quantum communication, …
