[SOLVED] Ee276 homework #6 p0
1. Time-varying channels.
Consider a time-varying discrete memoryless channel. Let Y1, Y2, …, Yn be conditionally independent given X1, X2, …, Xn, with conditional distribution given by p(y|x) = ∏_{i=1}^n p_i(y_i|x_i) (where p_i(y_i|x_i) is a BSC(δ_i), as shown in the figure). Let X = (X1, X2, …, Xn), Y = (Y1, Y2, …, Yn).
[Figure: BSC(δ_i) — inputs 0 and 1 are received correctly with probability 1 − δ_i and flipped with probability δ_i.]
In this problem, we show that

max_{p(x)} I(X;Y) = Σ_{i=1}^n (1 − H(δ_i)).

(a) Show that I(X;Y) ≤ Σ_{i=1}^n (1 − H(δ_i)) for any p_X.
Hint: Use a chain of inequalities similar to the channel coding converse proof.
(b) Find a distribution over X for which I(X;Y) = Σ_{i=1}^n (1 − H(δ_i)).
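As a quick numerical sketch of the right-hand side of the claimed capacity formula (the δ_i values below are made up for illustration, not taken from the problem):

```python
import math

def h2(p):
    """Binary entropy in bits; H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Illustrative per-use crossover probabilities delta_i (hypothetical values).
deltas = [0.0, 0.1, 0.25, 0.5]

# Sum of per-use BSC capacities: a noiseless use contributes 1 bit,
# a delta = 0.5 use contributes 0 bits.
capacity = sum(1 - h2(d) for d in deltas)
print(capacity)
```

Note that the δ = 0 term contributes a full bit and the δ = 0.5 term contributes nothing, matching the intuition that each channel use contributes its own BSC capacity.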
2. Jointly typical sequences. We will calculate the jointly typical set for a pair of random variables connected by a binary symmetric channel, and the probability of error for jointly typical decoding for such a channel.
[Figure: BSC with crossover probability 0.1 — 0→0 and 1→1 with probability 0.9; 0→1 and 1→0 with probability 0.1.]
We will consider a binary symmetric channel with crossover probability 0.1. The input distribution that achieves capacity is the uniform distribution, i.e., p(x) = (1/2,1/2), which yields the joint distribution p(x,y) for this channel given by
        Y = 0   Y = 1
X = 0    0.45    0.05
X = 1    0.05    0.45
The marginal distribution of Y is also (1/2,1/2).
(a) Calculate H(X), H(Y), H(X,Y) and I(X;Y) for the joint distribution above.
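The quantities in part (a) can be computed directly from the joint table; a minimal sketch:

```python
import math

# Joint distribution p(x, y) for the BSC(0.1) with uniform input.
p_xy = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

def H(dist):
    """Entropy in bits of an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

p_x = [0.5, 0.5]                 # marginal of X
p_y = [0.5, 0.5]                 # marginal of Y (also uniform)
H_X = H(p_x)                     # 1 bit
H_Y = H(p_y)                     # 1 bit
H_XY = H(p_xy.values())          # 1 + H(0.1) bits
I_XY = H_X + H_Y - H_XY          # 1 - H(0.1) bits
print(H_X, H_Y, H_XY, I_XY)
```

The mutual information equals the BSC capacity 1 − H(0.1), as expected for the capacity-achieving input.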
(b) Let X1, X2, …, Xn be drawn i.i.d. according to the Bernoulli(1/2) distribution. Of the 2^n possible input sequences of length n, which of them are typical, i.e., members of A_ϵ^(n)(X), for ϵ = 0.2? Which are the typical sequences in A_ϵ^(n)(Y)?
(c) The jointly typical set A_ϵ^(n)(X,Y) is defined as the set of n-sequence pairs (x^n, y^n) that satisfy the following equations:

|−(1/n) log p(x^n) − H(X)| < ϵ,
|−(1/n) log p(y^n) − H(Y)| < ϵ,
|−(1/n) log p(x^n, y^n) − H(X,Y)| < ϵ.

The first two equations correspond to the conditions that x^n and y^n are in A_ϵ^(n)(X) and A_ϵ^(n)(Y) respectively. Consider the last condition, which can be rewritten to state that H(X,Y) − ϵ < −(1/n) log p(x^n, y^n) < H(X,Y) + ϵ. Let k be the number of places in which the sequence x^n differs from y^n (k is a function of the two sequences). Then we can write

p(x^n, y^n) = p(x^n) p(y^n|x^n)                                   (1)
            = 2^−n p^k (1 − p)^(n−k)                              (2)
−(1/n) log p(x^n, y^n) = 1 − (k/n) log p − ((n−k)/n) log(1 − p)   (3)
An alternative way of looking at this probability is to view the binary symmetric channel as an additive channel Y = X ⊕ Z, where Z is a binary random variable that is equal to 1 with probability p, and is independent of X. In this case,

p(x^n, y^n) = p(x^n) p(z^n)                                       (4)
            = 2^−n p^k (1 − p)^(n−k)                              (5)
−(1/n) log p(x^n, y^n) = −(1/n) log p(x^n) − (1/n) log p(z^n)     (6)
                       = 1 − (k/n) log p − ((n−k)/n) log(1 − p)   (7)

Show that the condition that (x^n, y^n) is jointly typical is equivalent to the condition that x^n is typical and z^n = y^n ⊕ x^n is typical.
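The decomposition above can be checked numerically: for any pair (x^n, y^n) with k disagreements, −(1/n) log p(x^n, y^n) equals 1 (the contribution of the uniform x^n) plus −(1/n) log p(z^n). A sketch using one randomly generated pair (the specific sequences are illustrative):

```python
import math
import random

n, p = 25, 0.1

random.seed(0)
x = [random.randint(0, 1) for _ in range(n)]             # uniform input sequence
z = [1 if random.random() < p else 0 for _ in range(n)]  # Bern(p) noise
y = [xi ^ zi for xi, zi in zip(x, z)]                    # Y = X xor Z

k = sum(z)  # number of places where x and y differ

# -(1/n) log2 p(x^n, y^n) = -(1/n) log2( 2^-n p^k (1-p)^(n-k) )
lhs = -(1 / n) * math.log2(2 ** -n * p ** k * (1 - p) ** (n - k))

# 1 + ( -(1/n) log2 p(z^n) ): the x-term plus the z-term
rhs = 1.0 - (1 / n) * math.log2(p ** k * (1 - p) ** (n - k))

print(k, lhs, rhs)
```

The two values agree for every pair, which is the identity behind equations (6)–(7).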
(d) We now calculate the size of A_ϵ^(n)(X,Y) for n = 25 and ϵ = 0.2. Here is a table of the numbers of sequences with k ones, their probabilities, and the corresponding value of −(1/25) log2 p(z^n):

 k   (25 choose k)   C(25,k) 0.1^k 0.9^(25−k)   −(1/25) log2 p(z^n)
 0             1        0.071790                     0.152003
 1            25        0.199416                     0.278800
 2           300        0.265888                     0.405597
 3          2300        0.226497                     0.532394
 4         12650        0.138415                     0.659191
 5         53130        0.064594                     0.785988
 6        177100        0.023924                     0.912785
 7        480700        0.007215                     1.039582
 8       1081575        0.001804                     1.166379
 9       2042975        0.000379                     1.293176
10       3268760        0.000067                     1.419973
11       4457400        0.000010                     1.546770
12       5200300        0.000001                     1.673567

(Sequences with more than 12 ones are omitted, since their total probability is negligible and they are not in the typical set.) What is the size of the set A_ϵ^(n)(X,Y)?
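A short script can reproduce the last column of the table and identify which values of k lie within ϵ = 0.2 of H(Z) = H(0.1) ≈ 0.469, giving the count of typical noise sequences (under the uniform input every x^n is typical, so the jointly typical pairs combine any x^n with a typical z^n):

```python
import math

n, p, eps = 25, 0.1, 0.2

def h2(q):
    """Binary entropy in bits."""
    return 0.0 if q in (0.0, 1.0) else -q * math.log2(q) - (1 - q) * math.log2(1 - q)

H_Z = h2(p)  # H(0.1), about 0.469 bits

typical_k = []
size = 0
for k in range(n + 1):
    # -(1/n) log2 p(z^n) for a noise sequence with k ones
    nll = -(k / n) * math.log2(p) - ((n - k) / n) * math.log2(1 - p)
    if abs(nll - H_Z) < eps:
        typical_k.append(k)
        size += math.comb(n, k)

print(typical_k, size)
```

The script also shows why k = 0 fails the typicality condition even though the all-correct noise sequence is the single most likely one.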
(f) Now consider a particular sequence y^n = 000000…0, say. Assume that we choose a sequence X^n at random, uniformly distributed among all the 2^n possible binary n-sequences. What is the probability that the chosen sequence is jointly typical with this y^n? (Hint: this is the probability of all sequences x^n such that y^n ⊕ x^n ∈ A_ϵ^(n)(Z).)
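Taking the typical noise weights to be k = 1, …, 4 (a value computed from the table above, not stated in the text), the probability in part (f) is just that count divided by 2^25:

```python
import math

n = 25
# Typical noise sequences: k = 1..4 ones (last table column within 0.2 of H(0.1))
n_typical_z = sum(math.comb(n, k) for k in (1, 2, 3, 4))

# X^n uniform over all 2^n sequences, y^n fixed => each x^n has probability 2^-n
prob = n_typical_z / 2 ** n
print(n_typical_z, prob)
```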

(g) Now consider a code with 2^9 = 512 codewords chosen at random, uniformly distributed among all the 2^n sequences of length n = 25. One of these codewords, say the one corresponding to i = 1, is chosen and sent over the channel. As calculated in part (e), the received sequence is, with high probability, jointly typical with the codeword that was sent. What is the probability that one or more of the other codewords (which were chosen at random, independently of the sent codeword) is jointly typical with the received sequence? (Hint: You could use the union bound, but you could also calculate this probability exactly, using the result of part (f) and the independence of the codewords.)
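With q the single-codeword probability from part (f) (the count 15275 is carried over as an assumption from the table), the 511 other codewords are drawn independently, so the exact probability that at least one of them is jointly typical is 1 − (1 − q)^511, while the union bound gives 511q:

```python
n = 25
q = 15275 / 2 ** n          # P(one random codeword is jointly typical with y^n)

union_bound = 511 * q       # union bound over the 511 other codewords
exact = 1 - (1 - q) ** 511  # exact, using independence of the codewords
print(union_bound, exact)
```

Since 511q is small, the union bound is close to, but slightly above, the exact value.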
(h) Given that a particular codeword was sent, the probability of error (averaged over the probability distribution of the channel and over the random choice of other codewords) can be written as
Pr(Error | x^n(1) sent) = Σ_{y^n : y^n causes error} p(y^n | x^n(1))    (8)
There are two kinds of error: the first occurs if the received sequence yn is not jointly typical with the transmitted codeword, and the second occurs if there is another codeword jointly typical with the received sequence. Using the result of the previous parts, calculate this probability of error. (Hint: You could use the union bound but you could calculate it more accurately using the fact that the probability of error does not depend on which codeword was sent, by the symmetry of the random coding argument.)
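The pieces above can be combined as a sketch of the suggested calculation (the numbers reuse the k = 1..4 typical range and the table's binomial probabilities, both assumptions from the earlier parts): decoding succeeds exactly when the received sequence is jointly typical with the sent codeword and no other codeword is jointly typical with it, and the second event, depending only on the independently drawn other codewords, is independent of the first.

```python
# P((X^n, Y^n) jointly typical) = P(1 <= k <= 4), summing the table's
# binomial probabilities for k = 1..4
p_typical = 0.199416 + 0.265888 + 0.226497 + 0.138415

q = 15275 / 2 ** 25              # per-codeword confusion probability, part (f)
p_no_confusion = (1 - q) ** 511  # none of the 511 other codewords is typical

# Decoding succeeds iff both events occur; they are independent here, so:
p_error = 1 - p_typical * p_no_confusion
print(p_error)
```

At this short blocklength the atypicality event dominates the error probability; the confusion event contributes the smaller share.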
3. BSC with feedback. Suppose that feedback is used on a binary symmetric channel with crossover probability p. Each time a channel output is received, it becomes the next transmission: X1 is Bern(1/2), X2 = Y1, X3 = Y2, …, Xn = Yn−1. Find lim_{n→∞} (1/n) I(X^n; Y^n). How does it compare to the capacity of this channel?
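The limit can be guessed from a brute-force check for small n (illustrative code: it enumerates the joint distribution of (X^n, Y^n) exactly by iterating over X1 and the noise bits Z^n):

```python
import math
from itertools import product
from collections import defaultdict

def mutual_info(n, p):
    """I(X^n; Y^n) in bits for the scheme X1 ~ Bern(1/2), X_{i+1} = Y_i,
    with Y_i = X_i xor Z_i and Z_i i.i.d. Bern(p)."""
    joint = defaultdict(float)
    for x1 in (0, 1):
        for z in product((0, 1), repeat=n):
            pr = 0.5 * math.prod(p if zi else 1 - p for zi in z)
            x, y = [x1], []
            for i in range(n):
                y.append(x[i] ^ z[i])   # channel: Y_i = X_i xor Z_i
                if i < n - 1:
                    x.append(y[i])      # feedback: X_{i+1} = Y_i
            joint[(tuple(x), tuple(y))] += pr
    px, py = defaultdict(float), defaultdict(float)
    for (xs, ys), pr in joint.items():
        px[xs] += pr
        py[ys] += pr
    H = lambda d: -sum(v * math.log2(v) for v in d.values() if v > 0)
    return H(px) + H(py) - H(joint)

p = 0.1
for n in (2, 3, 4, 5):
    print(n, mutual_info(n, p) / n)
```

The printed values decrease toward H(p) ≈ 0.469 rather than toward the capacity 1 − H(p), which is the contrast the problem asks about.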
4. Fano’s inequality. Let Pr(X = i) = p_i, i = 1, 2, …, m, and let p1 ≥ p2 ≥ p3 ≥ ··· ≥ pm. The minimal-probability-of-error predictor of X is X̂ = 1, with resulting probability of error Pe = 1 − p1. Maximize H(X) subject to the constraint 1 − p1 = Pe to find a bound on Pe in terms of H(X). This is Fano’s inequality in the absence of conditioning.
Hint: Consider PMF (p2/Pe,p3/Pe,…,pm/Pe).
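The maximization yields the unconditioned Fano bound H(X) ≤ H(Pe) + Pe log2(m − 1); a numerical sanity check on random PMFs (m = 5 is an arbitrary choice):

```python
import math
import random

def H(dist):
    """Entropy in bits."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def h2(q):
    return 0.0 if q in (0.0, 1.0) else -q * math.log2(q) - (1 - q) * math.log2(1 - q)

random.seed(1)
m = 5
worst_slack = float("inf")
for _ in range(1000):
    w = sorted((random.random() for _ in range(m)), reverse=True)
    total = sum(w)
    p = [v / total for v in w]       # random PMF with p1 >= p2 >= ... >= pm
    Pe = 1 - p[0]                    # error probability of guessing X_hat = 1
    bound = h2(Pe) + Pe * math.log2(m - 1)
    worst_slack = min(worst_slack, bound - H(p))
print(worst_slack)  # smallest observed slack of the bound
```

Equality holds when the remaining mass Pe is spread uniformly over the other m − 1 symbols, which is exactly the PMF in the hint.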
