
[SOLVED] EE276 homework #2 p0

$25

File Name: Ee276_homework__2_p0.zip
File Size: 188.4 KB


1. Data Processing Inequality.
The random variables X, Y and Z form a Markov triplet (X − Y − Z) if p(z|y) = p(z|y,x) or, equivalently, if p(x|y) = p(x|y,z). If X, Y, Z form a Markov triplet (X − Y − Z), show that:
(a) H(X|Y) = H(X|Y,Z) and H(Z|Y) = H(Z|X,Y)
(b) H(X|Y) ≤ H(X|Z)
(c) I(X;Y) ≥ I(X;Z) and I(Y;Z) ≥ I(X;Z)
(d) I(X;Z|Y) = 0
where the conditional mutual information of random variables X and Y given Z is defined by
I(X;Y|Z) = H(X|Z) − H(X|Y,Z) = Σ_{x,y,z} p(x,y,z) log [p(x,y|z) / (p(x|z) p(y|z))].
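As a quick numerical sanity check (not part of the assignment), the following Python sketch builds a Markov triplet X − Y − Z from arbitrary, made-up kernels p(y|x) and p(z|y), then confirms parts (c) and (d) numerically; all three distributions below are hypothetical choices.

    import numpy as np

    def mi(pab):
        # mutual information (in bits) of a 2-D joint pmf array
        pa = pab.sum(axis=1, keepdims=True)
        pb = pab.sum(axis=0, keepdims=True)
        nz = pab > 0
        return float((pab[nz] * np.log2(pab[nz] / (pa * pb)[nz])).sum())

    px   = np.array([0.3, 0.7])                # hypothetical p(x)
    py_x = np.array([[0.9, 0.1], [0.2, 0.8]])  # hypothetical p(y|x)
    pz_y = np.array([[0.6, 0.4], [0.3, 0.7]])  # hypothetical p(z|y)
    # Markov construction: p(x,y,z) = p(x) p(y|x) p(z|y)
    pxyz = px[:, None, None] * py_x[:, :, None] * pz_y[None, :, :]

    ixy = mi(pxyz.sum(axis=2))                 # I(X;Y)
    ixz = mi(pxyz.sum(axis=1))                 # I(X;Z)
    py = pxyz.sum(axis=(0, 2))
    # I(X;Z|Y) = sum_y p(y) I(X;Z | Y=y)
    ixz_y = sum(py[y] * mi(pxyz[:, y, :] / py[y]) for y in range(2))
    print(ixy >= ixz, abs(ixz_y) < 1e-12)      # data processing holds, I(X;Z|Y)=0
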
2. Two looks.
Let X, Y1, and Y2 be binary random variables. Assume that I(X;Y1) = 0 and I(X;Y2) = 0.
(a) Does it follow that I(X;Y1,Y2) = 0? Prove or provide a counterexample.
(b) Does it follow that I(Y1;Y2) = 0? Prove or provide a counterexample.
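A small helper like the sketch below (illustrative, not part of the problem) makes candidate joint pmfs easy to test; the demo distribution, X = Y1 XOR Y2 with Y1, Y2 independent fair bits, is the classical candidate worth trying here.

    import numpy as np

    def mi(pab):
        # mutual information (in bits) of a 2-D joint pmf array
        pa = pab.sum(axis=1, keepdims=True)
        pb = pab.sum(axis=0, keepdims=True)
        nz = pab > 0
        return float((pab[nz] * np.log2(pab[nz] / (pa * pb)[nz])).sum())

    p = np.zeros((2, 2, 2))                  # joint pmf p[x, y1, y2]
    for y1 in (0, 1):
        for y2 in (0, 1):
            p[y1 ^ y2, y1, y2] = 0.25        # X = Y1 XOR Y2, Y1, Y2 fair and independent

    print(mi(p.sum(axis=2)))                 # I(X;Y1)
    print(mi(p.sum(axis=1)))                 # I(X;Y2)
    print(mi(p.reshape(2, 4)))               # I(X;(Y1,Y2)), grouping (Y1,Y2) into one variable
    print(mi(p.sum(axis=0)))                 # I(Y1;Y2)
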
3. Prefix and uniquely decodable codes. Consider the following code:
u Codeword

(a) Is this a prefix code?
(b) Argue that this code is uniquely decodable, by describing a decoding algorithm.
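Both questions are mechanical for any finite code, so here is a generic Python sketch: a prefix-condition check for (a) and the Sardinas–Patterson test, a standard algorithm for deciding unique decodability, relevant to (b). The demo codeword list is a placeholder, not the table above.

    def is_prefix_free(code):
        # no codeword may be a proper prefix of another
        return not any(c != d and d.startswith(c) for c in code for d in code)

    def is_uniquely_decodable(code):
        # Sardinas-Patterson test
        code = set(code)
        # S1: dangling suffixes from overlapping pairs of distinct codewords
        s = {d[len(c):] for c in code for d in code if c != d and d.startswith(c)}
        seen = set()
        while s:
            if s & code:        # a dangling suffix equals a codeword: ambiguous
                return False
            if s <= seen:       # no new suffixes can appear: unambiguous
                return True
            seen |= s
            s = {w[len(v):] for a in s for c in code
                 for v, w in ((a, c), (c, a)) if v != w and w.startswith(v)}
        return True

    demo = ["1", "10", "100"]   # placeholder codewords, not the homework's table
    print(is_prefix_free(demo), is_uniquely_decodable(demo))   # False True
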
4. Relative entropy and the cost of miscoding. Let the random variable X be defined on {1,2,3,4,5,6} according to the pmf p. Let p and another pmf q be given by
Symbol p(x) q(x) C1(x) C2(x)
1 1/2 1/2 0 0
2 1/8 1/4 100 10
3 1/8 1/16 101 1100
4 1/8 1/16 110 1101
5 1/16 1/16 1110 1110
6 1/16 1/16 1111 1111
(a) Calculate H(X), D(p||q) and D(q||p).
(b) The last two columns above represent codes for the random variable. Verify that codes C1 and C2 are optimal under the respective distributions p and q.
(c) Now assume that we use C2 to code X. What is the average length of the codewords? By how much does it exceed the entropy H(X), i.e., what is the redundancy of the code?
(d) What is the redundancy if we use code C1 for a random variable Y with pmf q?
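The quantities in (a)-(d) can be cross-checked numerically; the sketch below simply transcribes the table above into NumPy and prints the entropies, divergences, and expected codeword lengths.

    import numpy as np

    p    = np.array([1/2, 1/8, 1/8, 1/8, 1/16, 1/16])
    q    = np.array([1/2, 1/4, 1/16, 1/16, 1/16, 1/16])
    len1 = np.array([1, 3, 3, 3, 4, 4])       # lengths of C1 codewords
    len2 = np.array([1, 2, 4, 4, 4, 4])       # lengths of C2 codewords

    H_p = -(p * np.log2(p)).sum()             # H(X) under p
    H_q = -(q * np.log2(q)).sum()             # entropy under q
    Dpq = (p * np.log2(p / q)).sum()          # D(p||q)
    Dqp = (q * np.log2(q / p)).sum()          # D(q||p)
    print(H_p, H_q, Dpq, Dqp)
    print((p * len1).sum(), (p * len2).sum())  # E[length] under p: C1 vs C2
    print((q * len2).sum(), (q * len1).sum())  # E[length] under q: C2 vs C1

Since C2 matches the lengths −log q(x), the gap E_p[length of C2] − H(X) should reproduce D(p||q), which is exactly the redundancy asked about in (c); part (d) is the symmetric statement with C1 and q.
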
5. (Strong) LLN and AEP. Let X1, X2, … be independent, identically distributed random variables drawn according to the probability mass function p(x), x ∈ {1,2,…,m}.
Thus p(x1,x2,…,xn) = ∏_{i=1}^{n} p(xi). We know that −(1/n) log p(X1,X2,…,Xn) → H(p) in probability. Let q(x1,x2,…,xn) = ∏_{i=1}^{n} q(xi), where q is another probability mass function on {1,2,…,m}.
(a) Evaluate lim_{n→∞} −(1/n) log q(X1,X2,…,Xn), where X1,X2,… are i.i.d. ∼ p(x).
(b) Now evaluate the limit of the log likelihood ratio (1/n) log [q(X1,X2,…,Xn) / p(X1,X2,…,Xn)] when X1,X2,… are i.i.d. ∼ p(x). Thus the odds favoring q are exponentially small when p is true.
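A Monte Carlo sketch makes both limits visible; the pmfs p and q below are arbitrary hypothetical choices, and the empirical averages should approach H(p) + D(p||q) for (a) and −D(p||q) for (b).

    import numpy as np

    rng = np.random.default_rng(0)           # arbitrary seed
    p = np.array([0.5, 0.3, 0.2])            # hypothetical true pmf
    q = np.array([0.2, 0.3, 0.5])            # hypothetical alternative pmf
    n = 200_000
    x = rng.choice(len(p), size=n, p=p)      # i.i.d. draws from p

    print(-np.log2(q[x]).mean(),             # -(1/n) log q(X1,...,Xn)
          -(p * np.log2(q)).sum())           # limit: H(p) + D(p||q)
    print(np.log2(q[x] / p[x]).mean(),       # (1/n) log [q(X^n)/p(X^n)]
          -(p * np.log2(p / q)).sum())       # limit: -D(p||q)
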
6. AEP. Let Xi for i ∈ {1,…,n} be an i.i.d. sequence from the p.m.f. p(x) with alphabet X = {1,2,…,m}. Denote the expectation and entropy of X by µ := E[X] and H := −Σ_x p(x) log p(x), respectively. For ϵ > 0, recall the definition of the typical set
A_ϵ^(n) := {x^n ∈ X^n : |−(1/n) log p(x^n) − H| ≤ ϵ}
and define the following set
B_ϵ^(n) := {x^n ∈ X^n : |(1/n) Σ_{i=1}^{n} xi − µ| ≤ ϵ}.
In what follows, ϵ > 0 is fixed.
(a) Does P(X^n ∈ A_ϵ^(n)) → 1?
(b) Does P(X^n ∈ A_ϵ^(n) ∩ B_ϵ^(n)) → 1?
(c) Show that for all n, |A_ϵ^(n) ∩ B_ϵ^(n)| ≤ 2^{n(H+ϵ)}.
(d) Show that for all n sufficiently large, |A_ϵ^(n) ∩ B_ϵ^(n)| ≥ (1/2) 2^{n(H−ϵ)}.
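As a simulation sketch for (a) and (b), the code below (arbitrary hypothetical pmf, fixed ϵ) estimates the probability that X^n lands in both sets; the estimate should approach 1 as n grows, since the sample mean and the empirical log-likelihood both concentrate.

    import numpy as np

    rng = np.random.default_rng(1)           # arbitrary seed
    vals = np.array([1, 2, 3])               # alphabet {1,2,3}
    p = np.array([0.6, 0.3, 0.1])            # hypothetical pmf
    H = -(p * np.log2(p)).sum()
    mu = (p * vals).sum()
    eps, trials = 0.1, 400

    for n in (100, 1000, 10_000):
        hits = 0
        for _ in range(trials):
            idx = rng.choice(3, size=n, p=p)
            in_A = abs(-np.log2(p[idx]).mean() - H) <= eps   # typical-set condition
            in_B = abs(vals[idx].mean() - mu) <= eps         # sample-mean condition
            hits += in_A and in_B
        print(n, hits / trials)              # estimate of P(X^n in A and B); -> 1
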
7. An AEP-like limit and the AEP (Bonus)
(a) Let X1,X2,… be i.i.d. drawn according to probability mass function p(x). Find the limit in probability as n → ∞ of
(p(X1,X2,…,Xn))^{1/n}.
(b) Let X1,X2,… be an i.i.d. sequence of discrete random variables with entropy H(X). Let
Cn(t) = {x^n ∈ X^n : p(x^n) ≥ 2^{−nt}}
denote the subset of n-length sequences with probabilities ≥ 2^{−nt}.
i. Show that |Cn(t)| ≤ 2^{nt}.
ii. What is lim_{n→∞} P(X^n ∈ Cn(t)) when t < H(X)? And when t > H(X)?
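Both parts can be previewed numerically. In the sketch below (hypothetical pmf, arbitrary seed), p(X^n)^{1/n} should settle near 2^{−H(X)}, and the estimated probability of Cn(t) should head toward 0 or 1 according to whether t is below or above H(X).

    import numpy as np

    rng = np.random.default_rng(2)           # arbitrary seed
    p = np.array([0.7, 0.2, 0.1])            # hypothetical pmf
    H = -(p * np.log2(p)).sum()

    for n in (10, 1000, 100_000):
        x = rng.choice(3, size=n, p=p)
        # p(X^n)^{1/n} = 2^{(1/n) log2 p(X^n)} vs the limit 2^{-H}
        print(n, np.exp2(np.log2(p[x]).mean()), 2.0 ** -H)

    n, trials = 2000, 300
    for t in (H - 0.2, H + 0.2):
        # X^n is in Cn(t) iff log2 p(X^n) >= -n t
        hits = sum(np.log2(p[rng.choice(3, size=n, p=p)]).sum() >= -n * t
                   for _ in range(trials))
        print(round(float(t), 3), hits / trials)   # -> 0 for t < H, -> 1 for t > H
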
