1. Example of joint entropy. Let p(x,y) be given by
| p(x, y) | X = 0 | X = 1 |
| --- | --- | --- |
| Y = 0 | 1/3 | 1/3 |
| Y = 1 | 0 | 1/3 |
Find
(a) H(X),H(Y ).
(b) H(X | Y ),H(Y | X).
(c) H(X,Y ).
(d) H(Y ) − H(Y | X).
(e) I(X;Y ).
(f) Draw a Venn diagram for the quantities in (a) through (e).
Round numerical answers to three decimal places.
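As a quick sanity check for Problem 1, here is a minimal Python sketch that computes the quantities in (a)–(e) directly from a joint table, assuming the p(x, y) values shown above; the chain rule H(X | Y) = H(X, Y) − H(Y) supplies the conditional entropies.

```python
import numpy as np

# Joint distribution p(x, y): rows indexed by y, columns by x,
# using the table values assumed above.
p = np.array([[1/3, 1/3],
              [0.0, 1/3]])

def entropy(probs):
    """Entropy in bits, skipping zero-probability entries."""
    probs = probs[probs > 0]
    return float(-np.sum(probs * np.log2(probs)))

p_x = p.sum(axis=0)                    # marginal of X (sum over y)
p_y = p.sum(axis=1)                    # marginal of Y (sum over x)

H_X = entropy(p_x)                     # (a)
H_Y = entropy(p_y)
H_XY = entropy(p.flatten())            # (c) joint entropy H(X, Y)
H_X_given_Y = H_XY - H_Y               # (b) chain rule
H_Y_given_X = H_XY - H_X
I_XY = H_Y - H_Y_given_X               # (d) = (e) mutual information

for name, val in [("H(X)", H_X), ("H(Y)", H_Y), ("H(X|Y)", H_X_given_Y),
                  ("H(Y|X)", H_Y_given_X), ("H(X,Y)", H_XY), ("I(X;Y)", I_XY)]:
    print(name, round(val, 3))
```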
2. Entropy of Hamming Code.
The Hamming code is a simple error-correcting code that can correct up to one error in a sequence of bits. Now consider information bits X1, X2, X3, X4 ∈ {0,1} chosen uniformly at random, together with check bits X5, X6, X7 chosen to make the parity within each circle of the Venn diagram even.
(e.g., X1 + X2 + X4 + X7 = 0 mod 2; one consistent choice of all three parity rules is sketched in code after this problem)
For example, the information bits 1011 become the codeword 1011010.
(a) What is the entropy H(X1,X2,…,X7) of X := (X1,…,X7)?
Now we make an error (or not) in one of the bits (or none). Let Y = X ⊕ e, where e is equally likely to be (1,0,…,0),(0,1,0,…,0),…,(0,0,…,0,1), or (0,0,…,0), and e is independent of X.
(b) Show that one can recover the message X perfectly from Y. (A justification suffices; a detailed proof is not required.)
(c) What is H(X | Y)?
(d) What is I(X;Y)?
(e) What is the entropy of Y?
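To make the setup in Problem 2 concrete, here is a small Python sketch of one choice of the three parity constraints. Only X1 + X2 + X4 + X7 = 0 (mod 2) is stated above, so the other two circles are an assumption that happens to reproduce 1011 → 1011010. The sketch also checks that every single-bit error, and the no-error case, produces a distinct syndrome, which is the idea behind part (b).

```python
# Assumed parity constraints (mod 2), consistent with the example above:
#   X1 + X2 + X3 + X5 = 0,  X1 + X3 + X4 + X6 = 0,  X1 + X2 + X4 + X7 = 0.
# Only the last one is given in the problem; the first two are guesses.

def encode(bits4):
    x1, x2, x3, x4 = bits4
    x5 = (x1 + x2 + x3) % 2
    x6 = (x1 + x3 + x4) % 2
    x7 = (x1 + x2 + x4) % 2
    return (x1, x2, x3, x4, x5, x6, x7)

def syndrome(y):
    """Which of the three parity checks fail; (0, 0, 0) means none fails."""
    return ((y[0] + y[1] + y[2] + y[4]) % 2,
            (y[0] + y[2] + y[3] + y[5]) % 2,
            (y[0] + y[1] + y[3] + y[6]) % 2)

assert encode((1, 0, 1, 1)) == (1, 0, 1, 1, 0, 1, 0)   # 1011 -> 1011010

# The 7 single-bit errors and the no-error case give 8 distinct syndromes,
# so the error pattern e (and hence X) can be read off from Y = X xor e.
codeword = encode((1, 0, 1, 1))
syndromes = set()
for flip in range(8):                  # flip == 7 means "no error"
    y = list(codeword)
    if flip < 7:
        y[flip] ^= 1
    syndromes.add(syndrome(tuple(y)))
print(len(syndromes))                  # prints 8
```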
3. Entropy of functions of a random variable. Let X be a discrete random variable. Show that the entropy of a function of X is less than or equal to the entropy of X by justifying the following steps:
H(X, g(X)) = H(X) + H(g(X) | X)     (a)
           = H(X);                  (b)
H(X, g(X)) = H(g(X)) + H(X | g(X))  (c)
           ≥ H(g(X)).               (d)
Thus H(g(X)) ≤ H(X).
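A tiny numerical illustration of Problem 3's conclusion in Python, using a made-up distribution on {1, 2, 3, 4} and a non-injective g; both the distribution and g are arbitrary choices, not part of the problem.

```python
from math import log2

# Arbitrary example: X on {1, 2, 3, 4} and a g that merges two symbols.
p = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}
g = {1: 'a', 2: 'a', 3: 'b', 4: 'c'}

def entropy(dist):
    return -sum(q * log2(q) for q in dist.values() if q > 0)

# Distribution of g(X): probabilities of symbols mapping to the same value add up.
pg = {}
for x, q in p.items():
    pg[g[x]] = pg.get(g[x], 0.0) + q

print(round(entropy(p), 3), round(entropy(pg), 3))   # about 1.846 vs 1.571 bits
```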
4. Coin flips. A fair coin is flipped until the first head occurs. Let X denote the number of flips required.
(a) Find the entropy H(X) in bits. The following identities may be useful:
r + r² + r³ + ⋯ = r/(1 − r)  and  r + 2r² + 3r³ + ⋯ = r/(1 − r)²  for |r| < 1.
(b) A random variable X is drawn according to this distribution. Construct an “efficient” sequence of yes-no questions of the form, “Is X contained in the set S?” that determine the value of X. Compare H(X) to the expected number of questions required to determine X.
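For part (a) of Problem 4, a short numerical check in Python: with a fair coin, P(X = n) = (1/2)^n, so each term of the entropy sum is −P(X = n) log2 P(X = n) = n/2^n, and summing far enough out that the tail is negligible approximates H(X).

```python
# P(X = n) = (1/2)^n for n = 1, 2, ...  (first head on flip n).
# Each term of the entropy sum is -p_n * log2(p_n) = n / 2**n.
H = sum(n / 2 ** n for n in range(1, 200))
print(round(H, 6))   # truncated series; the tail beyond n = 200 is negligible
```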
5. Minimum entropy. In the following, we use H(p1, …, pn) ≡ H(p) to denote the entropy H(X) of a random variable X with alphabet {1, …, n}, i.e.,
H(p1, …, pn) = −(p1 log p1 + ⋯ + pn log pn), where pi = P(X = i).
What is the minimum value of H(p1, …, pn) = H(p) as p ranges over the set of n-dimensional probability vectors? Find all p's that achieve this minimum.
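For concreteness, a small Python helper that evaluates H(p) for a few 4-dimensional probability vectors; the example vectors are arbitrary and only illustrate how H behaves over the probability simplex.

```python
from math import log2

def entropy(p):
    """H(p1, ..., pn) in bits for a probability vector p."""
    return -sum(q * log2(q) for q in p if q > 0)

# Arbitrary example vectors, from uniform to fully concentrated.
for p in [(0.25, 0.25, 0.25, 0.25), (0.7, 0.1, 0.1, 0.1), (1.0, 0.0, 0.0, 0.0)]:
    print(p, round(entropy(p), 3))
```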
6. Mixing increases entropy. Let pi > 0 for i = 1, 2, …, m. Show that the entropy of a random variable distributed according to (p1, …, pi, …, pj, …, pm) is less than the entropy of a random variable distributed according to (p1, …, (pi + pj)/2, …, (pi + pj)/2, …, pm), i.e., the distribution obtained by replacing pi and pj with their average.
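A quick numerical check of Problem 6's claim in Python, with an arbitrary distribution and an arbitrary pair (i, j); averaging the two chosen components should raise the entropy.

```python
from math import log2

def entropy(p):
    return -sum(q * log2(q) for q in p if q > 0)

# Arbitrary example: average the 1st and 3rd components (i = 1, j = 3).
p = [0.5, 0.2, 0.1, 0.2]
i, j = 0, 2                       # 0-based indices
mixed = list(p)
mixed[i] = mixed[j] = (p[i] + p[j]) / 2

print(round(entropy(p), 3), round(entropy(mixed), 3))   # about 1.761 vs 1.971 bits
```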
7. Infinite entropy. [Bonus]
This problem shows that the entropy of a discrete random variable can be infinite. (In this question you can take log as the natural logarithm for simplicity.)
(a) Let A = ∑_{n=2}^∞ 1/(n log² n), where log² n denotes (log n)². Show that A is finite by bounding the infinite sum by the integral of 1/(x log² x).
(b) Show that the integer-valued random variable X distributed as
P(X = n) = 1/(A n log² n) for n = 2, 3, …
has entropy given by
H(X) = ∑_{n=2}^∞ log(A n log² n) / (A n log² n).
(c) Show that the entropy H(X) = +∞ (by showing that the sum diverges).
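To see part (c) numerically, a Python sketch that approximates A by truncating the sum from part (a) and then prints partial sums of the entropy series; the cut-off points are arbitrary, and the partial sums keep growing (very slowly, on the order of log log N), which is the divergence part (c) asks you to establish.

```python
from math import log

# Approximate A = sum over n >= 2 of 1/(n log^2 n) by truncating the series.
A = sum(1.0 / (n * log(n) ** 2) for n in range(2, 1_000_000))

def partial_entropy(N):
    """Partial sum of -P(X = n) log P(X = n) for n = 2, ..., N-1 (natural log)."""
    total = 0.0
    for n in range(2, N):
        p = 1.0 / (A * n * log(n) ** 2)
        total += -p * log(p)
    return total

for N in (10 ** 2, 10 ** 4, 10 ** 6):
    print(N, round(partial_entropy(N), 3))   # keeps increasing with N
```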
