QUANTUM INFORMATION: ASSIGNMENT 1

Hand in your work at the start of the Friday lecture on October 16th.
Write clearly and include your arguments and overall strategy for solving each exercise.
Problem 1 (8 marks)
(a) When visiting your grandfather in Scotland, you decide to play a game to prove to him that technology
is useful. Every evening you alone listen to the weather forecast on the radio, and then both you
and your grandfather try to guess whether it will rain the next morning. Having lived in Scotland since birth,
your grandfather knows that it rains on 80% of days. You had reached the same conclusion on
previous summer holidays. You also know the weather forecast is right 80% of the time and is always
correct when it predicts rain. What is the optimal strategy for your grandfather? And for you?
(b) Having not won the bet, you study information theory because you are convinced you had more
information than your grandfather. Assuming you follow the radio forecast, compute the mutual
information between your guess and the actual weather, and do the same for your grandfather, to
quantify your expectation.
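For checking the arithmetic in both parts, here is a minimal Python sketch (not part of the assignment): a generic routine that computes the mutual information of two variables from their joint distribution, via I(X;Y) = H(X) + H(Y) − H(X,Y). The joint table used below is an arbitrary placeholder; the distribution implied by the 80% statistics of the problem is left for you to derive.

    import numpy as np

    def entropy(p):
        # Shannon entropy in bits; 0 log 0 is taken to be 0
        p = np.asarray(p, dtype=float)
        nz = p > 0
        return -np.sum(p[nz] * np.log2(p[nz]))

    def mutual_information(joint):
        # I(X;Y) = H(X) + H(Y) - H(X,Y), with p(x, y) given as a matrix
        joint = np.asarray(joint, dtype=float)
        px, py = joint.sum(axis=1), joint.sum(axis=0)
        return entropy(px) + entropy(py) - entropy(joint.ravel())

    # placeholder joint distribution over (guess, weather); substitute the
    # one you derive from the problem's 80% statistics
    example = np.array([[0.5, 0.1],
                        [0.1, 0.3]])
    print(mutual_information(example))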
Problem 2 (4 marks) Let X1 and X2 be identically distributed but not necessarily independent random
variables. Let
ρ = 1 − H(X1|X2)/H(X1) .
• Show that ρ = I(X1;X2)/H(X1).
• Show that 0 ≤ ρ ≤ 1.
• What is the meaning of ρ = 0?
• What is the meaning of ρ = 1?
• Based on these observations, what does the mutual information measure?
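Before writing the proof of the first bullet, it can be reassuring to check it numerically. The sketch below is an illustration, not a substitute for the proof: it draws a symmetric joint distribution, so that X1 and X2 are identically distributed, and compares the two expressions for ρ, assuming the chain rule H(X1|X2) = H(X1, X2) − H(X2).

    import numpy as np

    def H(p):
        # Shannon entropy in bits of a flattened probability array
        p = np.asarray(p, dtype=float).ravel()
        nz = p > 0
        return -np.sum(p[nz] * np.log2(p[nz]))

    rng = np.random.default_rng(0)
    A = rng.random((3, 3))
    joint = A + A.T              # symmetric joint => identical marginals
    joint /= joint.sum()

    H1, H2, H12 = H(joint.sum(axis=1)), H(joint.sum(axis=0)), H(joint)
    H1_given_2 = H12 - H2        # chain rule H(X1|X2) = H(X1,X2) - H(X2)
    I = H1 + H2 - H12            # mutual information I(X1;X2)

    print(1 - H1_given_2 / H1, I / H1)   # the two numbers should agree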
Problem 3 (8 marks)
(a) Prove the identities
I(X, Y ; Z) ≥ I(X; Z) ,
H(X, Y, Z) − H(X, Y ) ≤ H(X, Z) − H(X) .
(b) A function ρ(x, y) is a metric if for all x, y, z
ρ(x, y) ≥ 0 ,
ρ(x, y) = ρ(y, x) ,
ρ(x, y) = 0 if and only if x = y ,
ρ(x, y) + ρ(y, z) ≥ ρ(x, z) .

Show that ρ(X, Y ) = H(X|Y ) + H(Y |X) satisfies the first, second and fourth properties above.
If we say that X = Y if there is a one-to-one function mapping X to Y , the third property is also
satisfied, and consequently, ρ(X, Y ) is a metric.
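The fourth property (the triangle inequality) is the only non-trivial one; a quick numerical probe can build confidence before attempting the proof. This sketch draws random joint distributions p(x, y, z) and checks ρ(X, Y ) + ρ(Y, Z) ≥ ρ(X, Z) on the pairwise marginals, using the identity H(X|Y ) + H(Y |X) = 2H(X, Y ) − H(X) − H(Y ). It is evidence, not a proof.

    import numpy as np

    def H(p):
        # Shannon entropy in bits of a flattened probability array
        p = np.asarray(p, dtype=float).ravel()
        nz = p > 0
        return -np.sum(p[nz] * np.log2(p[nz]))

    def rho(pxy):
        # H(X|Y) + H(Y|X) = 2 H(X,Y) - H(X) - H(Y)
        return 2 * H(pxy) - H(pxy.sum(axis=1)) - H(pxy.sum(axis=0))

    rng = np.random.default_rng(1)
    for _ in range(1000):
        pxyz = rng.random((2, 3, 2))
        pxyz /= pxyz.sum()                    # random joint p(x, y, z)
        d_xy = rho(pxyz.sum(axis=2))          # marginal p(x, y)
        d_yz = rho(pxyz.sum(axis=0))          # marginal p(y, z)
        d_xz = rho(pxyz.sum(axis=1))          # marginal p(x, z)
        assert d_xy + d_yz >= d_xz - 1e-9     # triangle inequality holds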
Problem 4 (10 marks)
(a) Compute the channel capacity for the channel described in Figure 1.
(b) Consider two discrete memoryless channels (X1 , p(y1 |x1 ), Y1 ) and (X2 , p(y2 |x2 ), Y2 ) with capacities
C1 and C2 respectively. A new channel (X1 × X2 , p(y1 |x1 ) · p(y2 |x2 ), Y1 × Y2 ) is formed in which
x1 ∈ X1 and x2 ∈ X2 are sent simultaneously, resulting in y1 and y2 , respectively. Find the capacity
of this channel.

Figure 1. Problem channel.
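Since the figure itself is not reproduced in this text, what follows is a generic sketch rather than a solution to part (a): the Blahut-Arimoto iteration computes the capacity of any discrete memoryless channel from its transition matrix W[x, y] = p(y|x). The binary symmetric channel at the end is a placeholder, not the channel of Figure 1.

    import numpy as np

    def capacity(W, iters=2000):
        # channel capacity in bits via the Blahut-Arimoto iteration
        W = np.asarray(W, dtype=float)              # W[x, y] = p(y | x)
        r = np.full(W.shape[0], 1.0 / W.shape[0])   # input distribution, start uniform
        for _ in range(iters):
            q = r[:, None] * W                      # unnormalized p(x | y)
            q /= q.sum(axis=0, keepdims=True)
            # update: r(x) proportional to exp( sum_y p(y|x) log q(x|y) )
            logq = np.log(q, where=q > 0, out=np.zeros_like(q))
            logr = np.sum(W * logq, axis=1)
            r = np.exp(logr - logr.max())
            r /= r.sum()
        # mutual information I(X;Y) in bits at the final input distribution
        py = r @ W
        ratio = np.where(W > 0, W / np.maximum(py, 1e-300), 1.0)
        return float(np.sum(r[:, None] * W * np.log2(ratio)))

    # placeholder: binary symmetric channel with crossover probability 0.1
    print(capacity([[0.9, 0.1], [0.1, 0.9]]))       # about 1 - H(0.1) = 0.531

For part (b), note that p(y1, y2|x1, x2) = p(y1|x1) p(y2|x2) means the product channel's transition matrix is the Kronecker product np.kron(W1, W2), so the same routine can be used to check your answer numerically.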

Joan Simón

Notes on classical information theory

Contents
1 How to quantify classical information
2 Entropy, relative entropy and mutual information
2.1 Joint entropy & Conditional entropy
2.2 Relative entropy & mutual information
2.2.1 Chain rules for entropy, relative entropy and mutual information
2.3 Jensen's inequality and its consequences
2.4 Log sum inequality and its applications
2.5 Data processing inequality
2.6 Fano's inequality
