Quantum Information Complexity
Dave Touchette (University of Waterloo, Perimeter Institute, Université de Montréal)
Beyond iid in information theory, BIRS, Banff, 2015
[email protected] — Quantum Information Complexity, Beyond iid in information theory, BIRS, Banff
Interactive Quantum Communication

Communication complexity setting:

[Figure: Alice (input x, register T_A) and Bob (input y, register T_B) share an entangled state |Ψ⟩, with inputs drawn from µ; they exchange messages m1, m2, m3, …, m_M, and both output f(x, y).]

Information-theoretic view: quantum information complexity
- How much quantum information is needed to compute f on µ?
- What is the information content of interactive quantum protocols?
Unidirectional Classical Communication

Separate into two prominent communication problems:
- Compress messages with "low information content"
- Transmit messages "noiselessly" over noisy channels
Information Theory

How to quantify information? Shannon's entropy!
- A source X with distribution p_X has entropy H(X) = −Σ_x p_X(x) log p_X(x) bits
- Operational significance: the optimal asymptotic rate of compression for i.i.d. copies x_1 x_2 … x_t of the source X
- One-shot, average length: Huffman encoding uses at most H(X) + 1 bits
- Derived quantities: conditional entropy H(X|Y), mutual information I(X : Y), conditional mutual information I(X : Y|Z), …
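These quantities are easy to check numerically; a minimal Python sketch (the example distribution and helper names are illustrative, not from the talk):

```python
import math

def shannon_entropy(p):
    """H(X) = -sum_x p(x) log2 p(x), in bits."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# A dyadic source: H(X) = 0.5*1 + 0.25*2 + 2*(0.125*3) = 1.75 bits.
p = [0.5, 0.25, 0.125, 0.125]
H = shannon_entropy(p)

# One-shot: Huffman coding achieves average length at most H(X) + 1.
# For this dyadic distribution the optimal codeword lengths are 1, 2, 3, 3,
# and the average length equals H(X) exactly.
lengths = [1, 2, 3, 3]
avg_len = sum(q * l for q, l in zip(p, lengths))
```

For dyadic distributions the Huffman code meets the entropy bound with equality; in general it sits strictly between H(X) and H(X) + 1.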
Interactive Classical Communication

Communication complexity of tasks, e.g. bipartite functions.

[Figure: Alice (input x, private randomness s_A, register R_A) and Bob (input y, private randomness s_B, register R_B) share public randomness R, with inputs drawn from µ; they exchange messages m1, m2, m3, …, m_M, and both output f(x, y).]

- m1 = f1(x, r, s_A), m2 = f2(y, m1, r, s_B), m3 = f3(x, m1, m2, r, s_A), ⋯
- Protocol transcript: Π(x, y, r, s) = m1 m2 ⋯ m_M
- Classical protocols: Π memorizes the whole history
- CC(Π) = |m1| + |m2| + ⋯ + |m_M|; CC(f, µ, ε) = min_Π CC(Π)
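As an illustrative toy instance of these definitions (the protocol below, computing AND with one message per player, is a sketch and not from the talk):

```python
def run_protocol(x, y):
    """Deterministic two-message protocol for f(x, y) = x AND y.
    No randomness, so r, s_A, s_B are empty here."""
    m1 = x                     # m1 = f1(x): Alice announces her input bit
    m2 = m1 & y                # m2 = f2(y, m1): Bob replies with x AND y
    transcript = (m1, m2)      # Pi(x, y) = m1 m2
    cc = len(transcript)       # CC(Pi) = |m1| + |m2| = 2 bits
    return transcript, cc, m2  # both parties output m2 = f(x, y)

# The transcript determines the correct output on every input pair.
for x in (0, 1):
    for y in (0, 1):
        _, cc, out = run_protocol(x, y)
        assert out == (x & y) and cc == 2
```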
Coding for Interactive Protocols

Protocol compression:
- Can we compress protocols that "do not convey much information"?
  - For many copies run in parallel?
  - For a single copy?
- What is the amount of information conveyed by a protocol?
  - The total amount of information at the end of the protocol?
  - The optimal asymptotic compression rate?
Protocol Compression: Classical Information Complexity

- Information complexity: IC(f, µ, ε) = inf_Π IC(Π, µ)
- Information cost: IC(Π, µ) = I(X : Π|Y) + I(Y : Π|X)
  - The amount of information each party learns about the other's input from the final transcript

Important properties:
- Additivity: IC(T1 ⊗ T2) = IC(T1) + IC(T2)
- Lower bounds communication: IC(T) ≤ CC(T)
- Operational interpretation: IC(T) = ACC(T) = lim_{n→∞} (1/n) CC(T^⊗n) [BR11]
- Direct sum on composite functions, e.g. DISJ_n from AND
- Convexity, concavity, continuity, etc.
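A numeric sketch of the information cost on a toy protocol (illustrative, not from the talk): for the protocol in which Alice sends m1 = x and Bob replies m2 = x AND y, under uniform inputs, IC(Π, µ) = 1.5 bits, strictly below CC(Π) = 2 bits, consistent with IC ≤ CC.

```python
import math

# Uniform mu; transcript Pi = (m1, m2) = (x, x AND y). Outcomes are tuples
# (x, y, m1, m2): coordinates 0, 1 are the inputs, 2, 3 the transcript.
joint = {(x, y, x, x & y): 0.25 for x in (0, 1) for y in (0, 1)}

def marginal(idx):
    out = {}
    for k, p in joint.items():
        key = tuple(k[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

def cmi(A, B, C):
    """I(A : B | C) = sum_abc p(abc) log2[p(abc) p(c) / (p(ac) p(bc))], bits."""
    pABC = marginal(A + B + C)
    pAC, pBC, pC = marginal(A + C), marginal(B + C), marginal(C)
    total = 0.0
    for k, p in pABC.items():
        a, b, c = k[:len(A)], k[len(A):len(A) + len(B)], k[len(A) + len(B):]
        total += p * math.log2(p * pC[c] / (pAC[a + c] * pBC[b + c]))
    return total

X, Y, Pi = (0,), (1,), (2, 3)
ic = cmi(X, Pi, Y) + cmi(Y, Pi, X)  # IC(Pi, mu) = I(X:Pi|Y) + I(Y:Pi|X)
# Here I(X:Pi|Y) = 1 and I(Y:Pi|X) = 0.5, so IC(Pi, mu) = 1.5 < CC(Pi) = 2.
```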
Applications of Classical IC I

- Direct sum: CC((f, ε)^⊗n) ≥ Ω(n · CC(f, ε))? [BBCR10, BR11, …]
  - Remember IC(f, ε) = lim_{n→∞} (1/n) CC((f, ε)^⊗n)
- Direct sum is related to one-shot compression down to IC

[Figure: n copies of the task T run in parallel, T^⊗n ≈ T, T, … (n times)]
- [BBCR10]: can compress to Õ(√(CC · IC))
  - On product distributions: compress down to Õ(IC)
  - Must compress multiple low-information rounds simultaneously
- [BR11]: can compress to O(IC + r) for r rounds
  - A one-shot, average-length version of Slepian–Wolf: H(X|Y) + lower-order terms
  - Interactive protocol: I(X : M|Y) + ⋯, for message M generated from X
Applications of Classical IC II

An exact communication complexity bound!! [BGPW13]
- E.g. CC(DISJ_n) = 0.4827… · n ± o(n)
  - IC_0(DISJ_n) = n · IC_0(AND)
  - IC_0^r(AND) = 0.4827… + Θ(1/r²)
  - IC_0(AND) = lim_{r→∞} IC_0^r(AND)
  - Infinitely many rounds are necessary to attain IC
  - The infimum over protocols is necessary

[Figure: DISJ_n = ¬OR_i(x_i AND y_i), decomposed as n parallel instances x_1 AND y_1, x_2 AND y_2, …, x_n AND y_n]
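The composition in the figure, sketched in code (illustrative):

```python
def disj(xs, ys):
    """DISJ_n(x, y) = NOT OR_i (x_i AND y_i):
    1 iff the sets with characteristic vectors x and y are disjoint."""
    return int(not any(xi & yi for xi, yi in zip(xs, ys)))

assert disj([1, 0, 1, 0], [0, 1, 0, 1]) == 1  # {0, 2} vs {1, 3}: disjoint
assert disj([1, 0], [1, 1]) == 0              # both sets contain element 0
```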
Quantum Information Complexity?

Can we define a sensible notion of quantum information complexity? Can we obtain similar applications for it?
Quantum Information Theory

- von Neumann's quantum entropy: H(A)_ρ = −Tr(ρ_A log ρ_A) = H({λ_i}) for ρ_A = Σ_i λ_i |i⟩⟨i|
- Characterizes the optimal rate of quantum source compression
- Derived quantities are defined in formal analogy with the classical quantities
- The conditional entropy can be negative!
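A numpy illustration of the last point (a sketch, not from the talk): for a Bell pair, H(A|B) = H(AB) − H(B) = 0 − 1 = −1.

```python
import numpy as np

def vn_entropy(rho):
    """H(A)_rho = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    return float(-sum(l * np.log2(l) for l in evals if l > 1e-12))

# Bell state |phi+> = (|00> + |11>)/sqrt(2) on registers A, B.
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_AB = np.outer(phi, phi)

# Marginal on B: reshape to indices (a, b, a', b') and trace out A.
rho_B = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=0, axis2=2)

H_AB = vn_entropy(rho_AB)   # 0: the joint state is pure
H_B = vn_entropy(rho_B)     # 1: the marginal is maximally mixed
H_A_given_B = H_AB - H_B    # -1: the conditional entropy is negative
```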
Quantum Communication Complexity

Two models for computing a classical f : X × Y → Z:

[Figure, Yao model: no pre-shared entanglement; Alice (input x) and Bob (input y) start with |0⟩ ancillas, alternate local unitaries U1, U2, U3, … and exchange quantum messages M1, M2, M3, …]

[Figure, Cleve–Buhrman model: Alice and Bob share an arbitrary entangled state |Ψ⟩ and exchange classical communication, applying local unitaries between messages.]

- Hybrid model: arbitrary pre-shared entanglement |ψ⟩, quantum messages m_i
- Exponential separations in communication complexity:
  - Classical vs. quantum
  - N rounds vs. N+1 rounds
Interactive Quantum Communication and QIC

[Figure: Alice (input x, register T_A) and Bob (input y, register T_B) share an entangled state |Ψ⟩, with inputs drawn from µ; they exchange quantum messages m1, m2, m3, …, m_M, and both output f(x, y).]

- Recall, classically: IC(Π, µ) = I(X : Π|Y) + I(Y : Π|X), with Π = m1 m2 ⋯ m_M
- Potential definition for the quantum information cost: QIC(Π, µ) = I(X : m1 m2 ⋯ m_M |Y) + I(Y : m1 m2 ⋯ m_M |X)? No!!
Problems

The bad definition QIC(Π, µ) = I(X : m1 m2 ⋯ m_M |Y) + I(Y : m1 ⋯ m_M |X) has many problems.

Yao model:
- No-cloning theorem: the m_i cannot be copied, so there is no transcript
- Information quantities can only be evaluated on registers defined at the same moment in time
- The expression is not even well-defined!

Cleve–Buhrman model:
- The m_i could be completely uncorrelated with the inputs
- E.g. teleportation at each time step
- The corresponding quantum information complexity is trivial
Potential Solutions

1) Keep as much information as possible, and measure final correlations, as in the classical information cost
- Problem: reversible protocols produce no garbage, so the only additional information is the function output
- The corresponding quantum information complexity is trivial

2) Measure correlations at each step [JRS03, JN14]:
  Σ_{i odd} I(X : m_i B_{i−1} |Y) + Σ_{i even} I(Y : m_i A_{i−1} |X)
- Problem: for M messages and total communication C, this can be Ω(M · C)
- We want QIC ≤ QCC, independent of M
  - i.e. a direct lower bound on communication
Approach: Reinterpret Classical Information Cost

[Figure: each message is viewed as a channel use. Alice (input X, randomness s_A) successively holds X R M1, then X R M1 M2 M3, …; Bob (input Y, randomness s_B) holds Y R M1 M2, …; R is the shared randomness and M1, M2, M3 the messages.]

- Shannon task: simulate a noiseless channel over a noisy channel
- Reverse Shannon task: simulate a noisy channel over a noiseless channel
Channel simulations

- A channel M|I, for input I, output/message M, and side information S
- Known asymptotic cost: lim_{n→∞} (1/n) log |C_n| = I(I : M|S)
- The sum of the asymptotic channel-simulation costs is a good operational measure of information
- Rewrite IC(Π, µ) = I(X R_A : M1 | Y R_B) + I(Y M1 R_B : M2 | X R_A M1) + I(X M1 M2 R_A : M3 | Y R_B M1 M2) + ⋯
- Provides a new proof of IC = ACC, and extends to IC^r = ACC^r
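This rewriting can be checked numerically on a toy protocol (an illustrative sketch, not from the talk): for the deterministic protocol m1 = x, m2 = x AND y under uniform inputs (so R_A and R_B are empty), the per-message simulation costs sum to the information cost.

```python
import math

# Outcomes are tuples (x, y, m1, m2) with m1 = x and m2 = x AND y.
joint = {(x, y, x, x & y): 0.25 for x in (0, 1) for y in (0, 1)}

def marginal(idx):
    out = {}
    for k, p in joint.items():
        key = tuple(k[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

def cmi(A, B, C):
    """I(A : B | C) over the joint distribution, in bits."""
    pABC = marginal(A + B + C)
    pAC, pBC, pC = marginal(A + C), marginal(B + C), marginal(C)
    total = 0.0
    for k, p in pABC.items():
        a, b, c = k[:len(A)], k[len(A):len(A) + len(B)], k[len(A) + len(B):]
        total += p * math.log2(p * pC[c] / (pAC[a + c] * pBC[b + c]))
    return total

X, Y, M1, M2 = (0,), (1,), (2,), (3,)
ic = cmi(X, M1 + M2, Y) + cmi(Y, M1 + M2, X)       # I(X:Pi|Y) + I(Y:Pi|X)
per_msg = cmi(X, M1, Y) + cmi(Y + M1, M2, X + M1)  # I(X:M1|Y) + I(YM1:M2|XM1)
# Both equal 1.5 bits: the per-message costs telescope to the information cost.
```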
Intuition for Quantum Information Complexity

- Take the channel-simulation view of a quantum protocol
- Purify everything
  - Applies to fully quantum, bipartite inputs and tasks

[Figure: purified protocol. A reference R holds the purification of the inputs x, y; Alice (T_A, with registers A_in, A1, A3, …) and Bob (T_B, with registers B_in, B2, …) start from |ϕ⟩ and pre-shared entanglement |Ψ⟩, alternate unitaries U1, U2, U3, … and exchange message registers C1, C2, C3, …]

- Quantum channel simulation with feedback and side information
- Equivalent to quantum state redistribution
Definition of Quantum Information Complexity

- The asymptotic communication cost is I(R : C|B), for R holding a purification of the input A, side information B, and output/message C
  - In QSR, the strong converse holds with free feedback [BCT14]
- QIC(Π, µ) = I(R : C1|B0) + I(R : C2|A1) + I(R : C3|B1) + ⋯
- QIC(T) = AQCC(T) = lim_{n→∞} (1/n) QCC(T^⊗n)
- Satisfies all the other desirable properties of an information complexity
- Single-shot protocol compression leads to the first general multi-round direct sum result for quantum communication complexity
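Each term in QIC is a conditional quantum mutual information evaluated on the protocol state at that step. A self-contained numpy sketch of evaluating one such term on a small stand-in state (the GHZ example and helper names are illustrative, not from the talk):

```python
import numpy as np

def vn(rho):
    """von Neumann entropy in bits."""
    evals = np.linalg.eigvalsh(rho)
    return float(-sum(l * np.log2(l) for l in evals if l > 1e-12))

def ptrace(rho, dims, keep):
    """Partial trace of rho (on subsystems of dimensions `dims`), keeping
    only the subsystems listed in `keep`."""
    n = len(dims)
    rho = rho.reshape(dims + dims)
    for ax in sorted(set(range(n)) - set(keep), reverse=True):
        rho = np.trace(rho, axis1=ax, axis2=ax + rho.ndim // 2)
    d = int(np.prod([dims[i] for i in keep]))
    return rho.reshape(d, d)

# GHZ state (|000> + |111>)/sqrt(2) on registers (R, C, B).
psi = np.zeros(8)
psi[0] = psi[7] = 1 / np.sqrt(2)
rho = np.outer(psi, psi)
dims, (R, C, B) = (2, 2, 2), (0, 1, 2)

# I(R : C | B) = H(RB) + H(CB) - H(RCB) - H(B)
cqmi = (vn(ptrace(rho, dims, [R, B])) + vn(ptrace(rho, dims, [C, B]))
        - vn(rho) - vn(ptrace(rho, dims, [B])))
# Here I(R : C | B) = 1 + 1 - 0 - 1 = 1 bit.
```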
Direct Sum for Quantum Communication I

- Direct sum: QCC((f, ε)^⊗n) ≥ Ω(n · QCC(f, ε))?
- QIC(f, ε) = lim_{n→∞} (1/n) QCC((f, ε)^⊗n): direct sum is related to compression down to QIC

[Figure: n copies of the task T run in parallel, T^⊗n ≈ T, T, … (n times)]
- We know: QCC^r(f, µ, ε) ≤ O(r² · QIC(f, µ, ε) + r)
  - Compare with the classical: CC^{7r}(f, µ, ε) ≤ O(IC^r(f, µ, ε) + r)
  - Can we improve on the quantum compression?
Direct Sum for Quantum Communication II

- Unbounded rounds? How to simultaneously compress many rounds of low-information quantum messages?
- Open question: QSR with no communication for I(C : R|B) ≤ ε?
QCC lower bound?

[Figure: DISJ_n = ¬OR_i(x_i AND y_i), decomposed as n parallel instances x_1 AND y_1, x_2 AND y_2, …, x_n AND y_n]

- DISJ_n: can we obtain the exact QCC?
  - QIC_0(DISJ_n) = n · QIC_0(AND) holds
  - But QCC(DISJ_n) = Θ(√n)!
    - The protocol achieving O(√n) is highly interactive
    - For a single message: Ω(n)
- Bounded rounds: QCC^r(DISJ_n) is O(n/r) [AA03] and Ω(n/r²) [JRS03]
- A first step with QIC: conjecture QCC^r(DISJ_n) ≥ Ω(n/r)
  - Conjecture: QIC_0^r(AND) ≥ Ω(1/r)
Bounded Round Disjointness

- Near-optimal bound Θ̃(n/r) for n bits and r-round protocols
- Joint work with Mark Braverman, Ankit Garg, Young Kun Ko and Jieming Mao
- Indirect approach to prove QIC_0^r(AND) ∈ Ω̃(1/r):
  - Reduce back to DISJ_n!!
  - Continuity in the input distribution: this dependence on r is not present for classical IC
- Possible direct approach: through a CQMI lower bound
  - New lower bounds: [FR14] and its generalization might be useful
  - Can we remove the polylog factor?
Conclusion: Summary

- Definition of QIC
- Operational interpretation
- Properties
  - Difference in continuity in the input
- Multi-round direct sum
- Bounded round disjointness
Research Directions

- Improved direct sum
- No-communication QSR sampling / simultaneous multi-round compression
- Concrete quantum communication complexity lower bounds
  - Tighter (exact?) disjointness
- etc.