
[SOLVED] EE276 homework #5 p0


1. Minimizing Channel Probability of Error.
[Block diagram: message $J$ → Encoder → $X^n$ → Memoryless channel $P_{Y|X}$ → $Y^n$ → Decoder → $\hat{J}$ (estimate of the message)]
A message $J$, uniformly distributed on $\{1, 2, \ldots, M\}$, is passed into the system. The encoder maps message $J$ onto its corresponding $n$-length codeword $X^n$ from the codebook $\mathcal{C}_n = \{X^n(1), X^n(2), \ldots, X^n(M)\}$. The encoded message is sent through a memoryless channel characterized by $P_{Y|X}$, and we receive $Y^n$ as output.
The decoder is responsible for estimating $J$ from $Y^n$; it is a function $\hat{J}$ that maps $Y^n$ to one of the symbols in $\{1, 2, \ldots, M, \text{error}\}$. We define the probability of error $P_e = P(\hat{J}(Y^n) \neq J)$. Show that $P_e$, for a fixed codebook $\mathcal{C}_n$, is minimized by
$\hat{J}(y^n) = \arg\max_{1 \le j \le M} P(J = j \mid Y^n = y^n).$
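A quick numerical sanity check of this claim (illustrative only, not a proof) is sketched below in Python: it assumes a hypothetical binary symmetric channel with crossover probability 0.2 and a two-codeword repetition codebook of length 3, and compares the exact error probability of the MAP rule against an arbitrary suboptimal rule.

# Illustrative check only: for a fixed toy codebook over a BSC, the MAP rule
# achieves an error probability no larger than a suboptimal decoder's.
import itertools

p = 0.2                                   # assumed BSC crossover probability
codebook = {1: (0, 0, 0), 2: (1, 1, 1)}   # hypothetical repetition codebook, M = 2
M = len(codebook)

def channel_prob(yn, xn):
    # P(Y^n = yn | X^n = xn) for the memoryless BSC(p).
    out = 1.0
    for y, x in zip(yn, xn):
        out *= p if y != x else (1 - p)
    return out

def prob_error(decoder):
    # Exact Pe = P(decoder(Y^n) != J) for J uniform on {1, ..., M}.
    err = 0.0
    for j, xn in codebook.items():
        for yn in itertools.product((0, 1), repeat=len(xn)):
            if decoder(yn) != j:
                err += channel_prob(yn, xn) / M
    return err

def map_decoder(yn):
    # With J uniform, argmax_j P(J=j | Y^n=yn) reduces to argmax_j P(yn | X^n(j)).
    return max(codebook, key=lambda j: channel_prob(yn, codebook[j]))

def first_bit_decoder(yn):
    # An arbitrary suboptimal rule: decide from the first received bit only.
    return 2 if yn[0] == 1 else 1

print("Pe(MAP)       =", prob_error(map_decoder))        # 0.104 for p = 0.2
print("Pe(first bit) =", prob_error(first_bit_decoder))  # 0.2 for p = 0.2
assert prob_error(map_decoder) <= prob_error(first_bit_decoder)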
2. The two-look Gaussian channel.
[Diagram: $X$ → channel → $(Y_1, Y_2)$]
Consider the ordinary Gaussian channel with two correlated looks at $X$, i.e., $Y = (Y_1, Y_2)$, where
$Y_1 = X + Z_1$ (1)
$Y_2 = X + Z_2$ (2)
with a power constraint $P$ on $X$, and $(Z_1, Z_2) \sim \mathcal{N}_2(0, K)$, where
$K = \begin{pmatrix} N & N\rho \\ N\rho & N \end{pmatrix}$. (3)
Find the capacity C for
(a) ρ = 1
(b) ρ = 0
(c) ρ = -1
Hint: The formula for the differential entropy of a Gaussian random vector can be found in the book (refer to Theorem 8.2). It can also be derived based on the formulas for differential entropy and the Gaussian PDF.
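As a small illustration of the hinted formula for the differential entropy of a Gaussian vector, $h(Z_1, \ldots, Z_n) = \frac{1}{2}\log\left((2\pi e)^n |K|\right)$, the Python sketch below evaluates the entropy of the noise pair for a few values of $\rho$, assuming the covariance in (3) with an illustrative noise power $N = 1$; the degenerate behavior as $\rho \to \pm 1$ is what makes parts (a) and (c) special.

# Illustration of the Gaussian differential-entropy formula
#   h(Z1, Z2) = 0.5 * log2((2*pi*e)^2 * det(K))  [bits],
# for (Z1, Z2) ~ N(0, K) with K = [[N, N*rho], [N*rho, N]] (N = 1 assumed).
import numpy as np

N = 1.0
for rho in (0.0, 0.5, 0.9, 0.99):
    K = np.array([[N, rho * N], [rho * N, N]])
    h = 0.5 * np.log2((2 * np.pi * np.e) ** 2 * np.linalg.det(K))
    print(f"rho = {rho:4.2f}: h(Z1, Z2) = {h:7.3f} bits")
# det(K) = N^2 (1 - rho^2) -> 0 as rho -> +-1, so h -> -infinity: the second
# look then adds no new noise randomness (the regimes of parts (a) and (c)).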
3. Output power constraint. Consider an additive white Gaussian noise channel with an expected output power constraint $P$. Thus $Y = X + Z$, $Z \sim \mathcal{N}(0, \sigma^2)$, $Z$ is independent of $X$, and $\mathbb{E}Y^2 \le P$. Find the channel capacity.
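One identity that is easy to check numerically (a sketch with arbitrary illustrative values, not a solution): because $Z$ is zero-mean and independent of $X$, $\mathbb{E}Y^2 = \mathbb{E}X^2 + \sigma^2$, so the output power constraint caps the usable input power.

# Monte Carlo check of E[Y^2] = E[X^2] + sigma^2 for Y = X + Z with
# Z ~ N(0, sigma^2) independent of X (all values below are illustrative).
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 1.0
n = 1_000_000
X = rng.uniform(-2.0, 2.0, size=n)   # any input law independent of Z works here
Z = rng.normal(0.0, np.sqrt(sigma2), size=n)
Y = X + Z

print("E[Y^2] (empirical)  :", np.mean(Y ** 2))
print("E[X^2] + sigma^2    :", np.mean(X ** 2) + sigma2)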
4. Gaussian mutual information
Suppose that $(X, Y, Z)$ are jointly Gaussian and that $X \to Y \to Z$ forms a Markov chain. Let $X$ and $Y$ have correlation coefficient $\rho_1$ and let $Y$ and $Z$ have correlation coefficient $\rho_2$. Find $I(X; Z)$.
Hint: Refer to Theorem 8.2 in the textbook. In the case of a bivariate normal distribution, the covariance matrix is $K = \begin{pmatrix} \sigma_X^2 & \rho_{X,Y}\,\sigma_X\sigma_Y \\ \rho_{X,Y}\,\sigma_X\sigma_Y & \sigma_Y^2 \end{pmatrix}$, where $\rho_{X,Y}$ is the correlation coefficient and $\sigma_A$ is the standard deviation of random variable $A$.
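The Python sketch below (illustrative, with assumed values $\rho_1 = 0.8$ and $\rho_2 = 0.6$) builds a jointly Gaussian Markov chain $X \to Y \to Z$ with unit variances and checks empirically that the correlation between $X$ and $Z$ comes out as $\rho_1 \rho_2$.

# Sketch: jointly Gaussian Markov chain X -> Y -> Z with unit variances,
# built so that X and Z are conditionally independent given Y.
import numpy as np

rng = np.random.default_rng(1)
rho1, rho2 = 0.8, 0.6          # assumed correlation coefficients
n = 1_000_000

Y = rng.normal(size=n)
X = rho1 * Y + np.sqrt(1 - rho1 ** 2) * rng.normal(size=n)  # corr(X, Y) = rho1
Z = rho2 * Y + np.sqrt(1 - rho2 ** 2) * rng.normal(size=n)  # corr(Y, Z) = rho2

print("corr(X, Z) empirical:", np.corrcoef(X, Z)[0, 1])
print("rho1 * rho2         :", rho1 * rho2)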
5. Bottleneck channel
Suppose a signal $X \in \mathcal{X} = \{1, 2, \ldots, m\}$ goes through an intervening transition $X \to V \to Y$:
[Diagram: $X \xrightarrow{p(v|x)} V \xrightarrow{p(y|v)} Y$]
where $\mathcal{X} = \{1, 2, \ldots, m\}$, $\mathcal{Y} = \{1, 2, \ldots, m\}$, and $\mathcal{V} = \{1, 2, \ldots, k\}$. Here $p(v|x)$ and $p(y|v)$ are arbitrary, and the channel has transition probability $p(y|x) = \sum_{v} p(v|x)\, p(y|v)$.
Show that $C \le \log k$.
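A numerical spot-check of the claim (not a proof) is sketched below: it draws a random cascade with assumed sizes $m = 6$, $k = 3$, computes $p(y|x) = \sum_v p(v|x)p(y|v)$, and confirms that $I(X;Y)$ stays below $\log k$ bits for a uniform input (a single input distribution, whereas $C$ maximizes over all of them).

# Spot-check: for a random cascade X -> V -> Y with |V| = k, the mutual
# information I(X;Y) under a uniform input stays below log2(k) bits.
import numpy as np

rng = np.random.default_rng(2)
m, k = 6, 3                                       # assumed alphabet sizes
p_x = np.full(m, 1.0 / m)                         # one particular input law
p_v_given_x = rng.dirichlet(np.ones(k), size=m)   # random rows p(v|x)
p_y_given_v = rng.dirichlet(np.ones(m), size=k)   # random rows p(y|v)

p_y_given_x = p_v_given_x @ p_y_given_v           # p(y|x) = sum_v p(v|x) p(y|v)
p_xy = p_x[:, None] * p_y_given_x                 # joint p(x, y)
p_y = p_xy.sum(axis=0)

I_xy = np.sum(p_xy * np.log2(p_xy / (p_x[:, None] * p_y[None, :])))
print(f"I(X;Y) = {I_xy:.4f} bits  <=  log2(k) = {np.log2(k):.4f} bits")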
6. Joint typicality.
Let $(X_i, Y_i, Z_i)$ be i.i.d. according to $p(x, y, z)$. With finite alphabets, we will say that $(x^n, y^n, z^n)$ is jointly typical (written $(x^n, y^n, z^n) \in A_\epsilon^{(n)}$) if
• $p(x^n) \in 2^{-n(H(X) \pm \epsilon)}$
• $p(y^n) \in 2^{-n(H(Y) \pm \epsilon)}$
• $p(z^n) \in 2^{-n(H(Z) \pm \epsilon)}$
• $p(x^n, y^n) \in 2^{-n(H(X,Y) \pm \epsilon)}$
• $p(x^n, z^n) \in 2^{-n(H(X,Z) \pm \epsilon)}$
• $p(y^n, z^n) \in 2^{-n(H(Y,Z) \pm \epsilon)}$
• $p(x^n, y^n, z^n) \in 2^{-n(H(X,Y,Z) \pm \epsilon)}$
Note that $p(a) \in 2^{-n(k \pm \epsilon)}$ means that $2^{-n(k+\epsilon)} \le p(a) \le 2^{-n(k-\epsilon)}$.
Now suppose $(\tilde{X}^n, \tilde{Y}^n, \tilde{Z}^n)$ is drawn according to $p(x^n)\,p(y^n)\,p(z^n)$; thus $(\tilde{X}^n, \tilde{Y}^n, \tilde{Z}^n)$ has the same marginals as $p(x^n, y^n, z^n)$ but the three sequences are independent. Find (bounds on) $P\big((\tilde{X}^n, \tilde{Y}^n, \tilde{Z}^n) \in A_\epsilon^{(n)}\big)$ in terms of the entropies $H(X)$, $H(Y)$, $H(Z)$, $H(X,Y)$, $H(X,Z)$, $H(Y,Z)$ and $H(X,Y,Z)$.
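To build intuition for what is being asked (illustrative only; the pmf, $n$, and $\epsilon$ below are assumptions), the Python sketch implements the seven-condition typicality test above for a small alphabet and compares, by Monte Carlo, how often a jointly drawn triple versus an independently drawn triple lands in the typical set.

# Monte Carlo sketch of the typicality test above for a small p(x, y, z);
# all numerical choices are illustrative.
import numpy as np

rng = np.random.default_rng(3)
# Assumed joint pmf on {0,1}^3, indexed p_xyz[x, y, z].
p_xyz = np.array([[[0.20, 0.05], [0.05, 0.10]],
                  [[0.10, 0.05], [0.05, 0.40]]])
n, eps, trials = 200, 0.1, 5_000

def H(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_x, p_y, p_z = p_xyz.sum((1, 2)), p_xyz.sum((0, 2)), p_xyz.sum((0, 1))
p_xy, p_xz, p_yz = p_xyz.sum(2), p_xyz.sum(1), p_xyz.sum(0)

def typical(xs, ys, zs):
    # Check each empirical value of -(1/n) log2 p(.) against its entropy +- eps.
    checks = [(p_x[xs], H(p_x)), (p_y[ys], H(p_y)), (p_z[zs], H(p_z)),
              (p_xy[xs, ys], H(p_xy)), (p_xz[xs, zs], H(p_xz)),
              (p_yz[ys, zs], H(p_yz)), (p_xyz[xs, ys, zs], H(p_xyz))]
    return all(abs(-np.log2(probs).sum() / n - h) <= eps for probs, h in checks)

flat = p_xyz.ravel()
hits_joint = hits_indep = 0
for _ in range(trials):
    # (X^n, Y^n, Z^n) drawn jointly from p(x, y, z) ...
    draw = rng.choice(8, size=n, p=flat)
    hits_joint += typical(draw // 4, (draw // 2) % 2, draw % 2)
    # ... versus independent sequences with the same marginals.
    hits_indep += typical(rng.choice(2, size=n, p=p_x),
                          rng.choice(2, size=n, p=p_y),
                          rng.choice(2, size=n, p=p_z))

print("P(typical), joint draw       ~", hits_joint / trials)
print("P(typical), independent draw ~", hits_indep / trials)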
