Information Complexity: an Overview
Rotem Oshman, Princeton CCI
Based on work by Braverman, Barak, Chen, Rao, and others
Charles River Science of Information Day 2014

Classical Information Theory
• Shannon '48, A Mathematical Theory of Communication

Motivation: Communication Complexity
• Alice holds X, Bob holds Y; how many bits must they exchange to compute f(X, Y)?
• More generally: solve some task T(X, Y).
• Yao '79, "Some complexity questions related to distributive computing"
• Applications:
  – Circuit complexity
  – Streaming algorithms
  – Data structures
  – Distributed computing
  – Property testing
  – …

Example: Streaming Lower Bounds
• Streaming algorithm: how much space is required to approximate f(data)?
• The state of the algorithm is everything it remembers about the data seen so far.
• Reduction from communication complexity [Alon, Matias, Szegedy '99]

Advances in Communication Complexity
• Very successful in proving unconditional lower bounds, e.g.,
  – Ω(n) for set disjointness [KS'92, Razborov '92]
  – Ω(n) for gap Hamming distance [Chakrabarti, Regev '10]
• But stuck on some hard questions:
  – Multi-party communication complexity
  – Karchmer–Wigderson games
• [Chakrabarti, Shi, Wirth, Yao '01], [Bar-Yossef, Jayram, Kumar, Sivakumar '04]: use tools from information theory

Extending Information Theory to Interactive Computation
• One-way communication:
  – Task: send X across the channel
  – Cost: H(X) bits
    • Shannon: in the limit over many instances
    • Huffman: H(X) + 1 bits for one instance
• Interactive computation:
  – Task: e.g., compute f(X, Y)
  – Cost?

Information Cost
• Reminder: mutual information
  I(X; Y) = H(X) − H(X | Y) = H(Y) − H(Y | X)
• Conditional mutual information:
  I(X; Y | Z) = H(X | Z) − H(X | Y, Z) = E_z[ I(X; Y | Z = z) ]
• Basic properties:
  – I(X; Y) ≥ 0
  – I(X; Y) ≤ H(X) and I(X; Y) ≤ H(Y)
  – Chain rule: I(XY; Z) = I(X; Z) + I(Y; Z | X)

Information Cost
• Fix a protocol Π.
• Notation abuse: let Π also denote the transcript of the protocol.
• Two ways to measure information cost:
  – External information cost: I(Π; XY)
  – Internal information cost: I(Π; X | Y) + I(Π; Y | X)
  – Cost of a task: infimum over all protocols
  – Which cost is "the right one"?

Information Cost: Basic Properties
External information: I(Π; XY)
Internal information: I(Π; X | Y) + I(Π; Y | X)
• Internal ≤ external.
• Internal can be much smaller, e.g.:
  – X = Y, uniform over {0,1}
  – Π: Alice sends X to Bob
  – Bob learns nothing he didn't already know, but an external observer learns X. (A small computational check of this example appears at the end of this section.)
• But the two are equal if X, Y are independent.
• External information ≤ communication: I(Π; XY) ≤ H(Π) ≤ |Π|.

Information Cost: Basic Properties
• Internal information ≤ communication cost:
  I(Π; X | Y) + I(Π; Y | X) ≤ |Π|.
• By induction: let Π = Π_1 ⋯ Π_t, where each Π_r is a single bit.
• Claim: for all r ≤ t, I(Π_{≤r}; Y | X) + I(Π_{≤r}; X | Y) ≤ r.
• By the chain rule,
  I(Π_{≤r}; Y | X) + I(Π_{≤r}; X | Y)                      [what we know after r rounds]
  = I(Π_{<r}; Y | X) + I(Π_{<r}; X | Y)                    [what we knew after r − 1 rounds]
    + I(Π_r; Y | X, Π_{<r}) + I(Π_r; X | Y, Π_{<r})        [what we learn in round r, given what we already know]
  ≤ (r − 1) + I(Π_r; Y | X, Π_{<r}) + I(Π_r; X | Y, Π_{<r})   [induction hypothesis]

Information vs. Communication
• Want: I(Π_r; Y | X, Π_{<r}) + I(Π_r; X | Y, Π_{<r}) ≤ 1.
• Suppose Π_r is sent by Alice.
• What does Alice learn?
  – Π_r is a function of Π_{<r} and X, so I(Π_r; Y | X, Π_{<r}) = 0.
• What does Bob learn?
  – I(Π_r; X | Y, Π_{<r}) ≤ |Π_r| = 1.
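Before moving on, here is a minimal Python sketch (not from the talk) that makes the two cost measures concrete: it brute-forces the external and internal information cost of the one-round protocol in which Alice sends her input X, including the X = Y example from the basic-properties slide. The helper functions and the dictionary encoding of distributions are illustration-only choices.

```python
from collections import defaultdict
from math import log2

def mutual_information(joint):
    """I(A; B) for a joint distribution given as {(a, b): prob}."""
    pa, pb = defaultdict(float), defaultdict(float)
    for (a, b), p in joint.items():
        pa[a] += p
        pb[b] += p
    return sum(p * log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

def conditional_mi(joint3):
    """I(A; B | C) for {(a, b, c): prob}, via E_c[ I(A; B | C = c) ]."""
    pc = defaultdict(float)
    for (_, _, c), p in joint3.items():
        pc[c] += p
    return sum(pc[c0] * mutual_information(
                   {(a, b): p / pc[c0]
                    for (a, b, c), p in joint3.items() if c == c0})
               for c0 in pc)

def info_costs(mu):
    """External and internal info cost of the one-round protocol Pi = X,
    where mu = {(x, y): prob} is the input distribution."""
    ext = mutual_information({(x, (x, y)): p for (x, y), p in mu.items()})
    internal = (conditional_mi({(x, x, y): p for (x, y), p in mu.items()})     # I(Pi; X | Y)
                + conditional_mi({(x, y, x): p for (x, y), p in mu.items()}))  # I(Pi; Y | X)
    return ext, internal

same  = {(0, 0): 0.5, (1, 1): 0.5}                      # X = Y, uniform over {0,1}
indep = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}  # X, Y independent

print(info_costs(same))   # (1.0, 0.0): internal far below external
print(info_costs(indep))  # (1.0, 1.0): equal when X, Y are independent
```

On the correlated distribution the run reproduces the slide's example (external cost 1 bit, internal cost 0), and on the independent one the two costs coincide, matching the basic properties above.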
Information vs. Communication
• We have:
  – Internal information ≤ communication
  – External information ≤ communication
  – Internal information ≤ external information
• "Information cost = communication cost"?
  – In the limit: internal information! [Braverman, Rao '10]
  – For one instance: external information! [Barak, Braverman, Chen, Rao '10]
• Big question: can protocols be compressed down to their internal information cost?
  – [Ganor, Kol, Raz '14]: no!
  – There is a task with internal IC = k but CC = 2^k.
  – … but: remains open for functions with small output.

Information vs. Amortized Communication
• Theorem [Braverman, Rao '10]:
  lim_{n→∞} CC(F^n, μ^n, ε) / n = IC(F, μ, ε).
• The "≤" direction: compression.
• The "≥" direction: direct sum.
  – We know: CC(F^n, μ^n, ε) ≥ IC(F^n, μ^n, ε).
  – We can show: IC(F^n, μ^n, ε) = n · IC(F, μ, ε).

Direct Sum Theorem [BR'10]
IC(F^n, μ^n, ε) = n · IC(F, μ, ε):
• Let Π be a protocol for F^n on n-copy inputs X = X_1 … X_n, Y = Y_1 … Y_n.
• Construct Π′ for F as follows:
  – Alice and Bob get inputs x, y.
  – Choose a random coordinate j ∈ [n], set X_j = x, Y_j = y.
  – Bad idea: publicly sample X_{−j}, Y_{−j}.
    • Suppose that in Π, Alice sends X_1 ⊕ ⋯ ⊕ X_n.
    • In Π, Bob learns one bit; in Π′ he should learn only 1/n of a bit.
    • But if X_{−j} is public, Bob learns a full bit about x!
  – Instead, split the remaining coordinates:
    • Publicly sample X_1, …, X_{j−1}; Bob privately samples Y_1, …, Y_{j−1}.
    • Publicly sample Y_{j+1}, …, Y_n; Alice privately samples X_{j+1}, …, X_n.
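The public/private split is the crux of the embedding, so here is a minimal Python sketch (an illustration, not code from the paper) of how the n-copy input is assembled around the embedded coordinate. It assumes X_i and Y_i are independent within each coordinate so they can be drawn by user-supplied samplers sample_x/sample_y (hypothetical names); the correlated case instead samples the private half conditioned on the public one. Returning both halves from one function is also a simplification: in Π′ each player only ever sees its own side plus the public coins.

```python
import random

def embed_input(x, y, n, sample_x, sample_y, public_seed):
    """[BR'10]-style embedding sketch: extend one instance (x, y) of F
    to an n-copy instance of F^n around a random coordinate j."""
    public = random.Random(public_seed)  # coins shared by Alice and Bob
    alice = random.Random()              # Alice's private coins
    bob = random.Random()                # Bob's private coins

    j = public.randrange(n)              # the embedded coordinate (public)
    X, Y = [None] * n, [None] * n
    X[j], Y[j] = x, y

    for i in range(j):                   # coordinates before j:
        X[i] = sample_x(public)          #   X_i is public,
        Y[i] = sample_y(bob)             #   Y_i is Bob's private sample
    for i in range(j + 1, n):            # coordinates after j:
        Y[i] = sample_y(public)          #   Y_i is public,
        X[i] = sample_x(alice)           #   X_i is Alice's private sample
    return j, X, Y

# Hypothetical usage: embed a single bit-pair into 8 coordinates of
# uniform bits; Pi is then run on (X, Y) and the answer for F is read
# off coordinate j.
bit = lambda rng: rng.randrange(2)
j, X, Y = embed_input(1, 0, 8, bit, bit, public_seed=42)
```

With this split, whatever Π reveals about coordinate j is, on average over the public choice of j, a 1/n fraction of its total internal information, which is exactly what the direct sum argument needs.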
Compression
• What we know: a protocol with communication C, internal info I, and external info I_ext can be compressed to
  – I_ext · polylog(C) [BBCR'10]
  – √(I · C) · polylog(C) [BBCR'10]
  – 2^{O(I)} [Braverman '10]
• Major open question: can we compress to I · polylog(C)? [GKR '14, partial answer: no]

Using Information Complexity to Prove Communication Lower Bounds
• Internal/external info ≤ communication.
• Essentially the most powerful technique known; [Kerenidis, Laplante, Lerays, Roland, Xiao '12]: most lower bound techniques imply IC lower bounds.
• Disadvantage: hard to show incompressibility!
  – Must exhibit a problem with low IC but high CC.
  – But proving high CC usually proves high IC…

Extending IC to Multiple Players
• Recent interest in multi-player number-in-hand communication complexity.
• Motivated by "big data":
  – Streaming and sketching, e.g., [Woodruff, Zhang '11, '12, '13]
  – Distributed learning, e.g., [Awasthi, Balcan, Long '14]
• Multi-player computation is traditionally hard to analyze.
• [Braverman, Ellen, O., Pitassi, Vaikuntanathan]: Ω(nk) for Set Disjointness with n elements and k players, private channels, number-in-hand inputs.

Information Complexity on Private Channels
• First obstacle: secure multi-party computation.
• [Goldreich, Micali, Wigderson '87]: any function can be computed with perfect information-theoretic security against < k/2 players.
• Solution: redefine information cost to measure both
  – the information a player learns, and
  – the information a player leaks to all the others.

Extending IC to Multiple Players
• Set disjointness:
  – Input: X_1, …, X_k ⊆ [n]
  – Output: is X_1 ∩ ⋯ ∩ X_k = ∅? (A trivial upper-bound protocol is sketched after this section.)
• Open problem: can we extend to gap set disjointness?
  – First step: a "purely info-theoretic" 2-party analysis.
• In [Braverman, Ellen, O., Pitassi, Vaikuntanathan] we show a direct sum for multi-party:
  – Solving n instances = n · (solving one instance).
• Does direct sum hold "across players"?
  – Solving with k players = Ω(k) · (solving with 2 players)?
  – Not always.
• Does compression work for multi-party?
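To fix intuition for the Ω(nk) bound, here is the trivial upper-bound protocol as a minimal Python sketch (an illustration; the designated-receiver setup and the function name are assumptions, not notation from the talk): every other player forwards its n-bit characteristic vector to one player, for (k − 1) · n bits in total, so the lower bound above is tight up to constant factors.

```python
def naive_disjointness(sets, n):
    """Trivial k-party number-in-hand protocol for set disjointness:
    players 2..k each send their n-bit characteristic vector to player 1,
    who intersects everything locally. Communication: (k - 1) * n bits,
    matching the Omega(nk) lower bound up to constants."""
    bits_sent = (len(sets) - 1) * n
    common = set(range(n))
    for s in sets:                 # player 1 computes X_1 ∩ ⋯ ∩ X_k
        common &= s
    return common == set(), bits_sent

# Three players over a universe of n = 8 elements; all three sets contain
# the element 3, so the answer is "not disjoint", after 2 * 8 = 16 bits.
print(naive_disjointness([{1, 3, 5}, {2, 3}, {3, 7}], n=8))  # (False, 16)
```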
Conclusion
• Information complexity extends classical information theory to the interactive setting.
• The picture is much less well-understood.
• A powerful tool for lower bounds.
• Fascinating open problems:
  – Compression
  – Information complexity for multi-player computation, quantum communication, …