[Solved] CS596 Homework # 3

$25

File Name: CS596_Homework_#_3.zip
File Size: 169.56 KB


Problem 1: Let x1, x2 be random variables which 99% of the time are independent and Normally (Gaussian) distributed, both with mean 0 and variance 1, and 1% of the time are independent and Normally distributed, both with mean 0 and variance σ² ≠ 1. a) Compute the joint pdf of the two random variables.

b) Examine whether the two random variables are independent. c) Give an example of two random variables that are uncorrelated but not independent.
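A quick numerical illustration of part c): if X is standard normal and Y = X², the pair is uncorrelated (since Cov(X, X²) = E[X³] = 0 by symmetry) yet clearly dependent. A minimal Monte Carlo sketch (the variable names and sample size are my own choices):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)
y = x ** 2               # a deterministic function of x, hence dependent on x

corr = np.corrcoef(x, y)[0, 1]   # near 0: Cov(X, X^2) = E[X^3] = 0 by symmetry

# Dependence shows up in higher moments: E[X^2 * Y] = E[X^4] = 3 for a
# standard normal, while E[X^2] * E[Y] = 1 * 1 = 1, so factorization fails.
lhs = np.mean(x**2 * y)
rhs = np.mean(x**2) * np.mean(y)
```

The near-zero correlation together with lhs ≈ 3 versus rhs ≈ 1 makes the "uncorrelated but not independent" point concrete.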

Problem 2: Let X, Z be random variables that are related through the equality

Z = |X + s|.

a) If the pdf of X is f(x), compute the pdf of Z when s is a deterministic quantity. b) Repeat the previous question when s is a random variable independent of X that takes only the two values 0 and 1 with probabilities 0.2 and 0.8, respectively. c) Under the assumptions of question b), compute the posterior probability P(s = 0 | Z = z). Hint: For the computation of the pdf of a random variable, the simplest way is to start with the computation of the cdf and then take the derivative. For b), use total probability.
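The cdf route in the hint can be sanity-checked on a concrete case. With X standard normal and s deterministic, F_Z(z) = P(−z − s ≤ X ≤ z − s), so differentiating gives f_Z(z) = f(z − s) + f(−z − s) for z ≥ 0. A hedged numerical check of this derived density (the choice s = 1 and the helper names are mine):

```python
import numpy as np

def f(x):
    # pdf of X ~ N(0, 1)
    return np.exp(-x * x / 2) / np.sqrt(2 * np.pi)

def f_z(z, s):
    # pdf of Z = |X + s| for z >= 0, obtained by differentiating the cdf
    return f(z - s) + f(-z - s)

rng = np.random.default_rng(1)
s = 1.0
z = np.abs(rng.standard_normal(500_000) + s)

# Compare empirical P(Z <= 1) with the trapezoid integral of f_z over [0, 1].
grid = np.linspace(0.0, 1.0, 2001)
vals = f_z(grid, s)
analytic = np.sum(0.5 * (vals[1:] + vals[:-1])) * (grid[1] - grid[0])
empirical = np.mean(z <= 1.0)
```

Both numbers land near Φ(0) − Φ(−2) ≈ 0.477, which is what P(|X + 1| ≤ 1) should be.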

Problem 3: You are given an unfair coin where every time you throw it the probability of observing heads is p ∈ (0, 1) (and tails 1 − p), where p is unknown. Assume that you throw the coin N times and record the sequence of heads (H) and tails (T). If Nh is the number of heads you observed and N − Nh the number of tails, then: a) Find the probability of the specific sequence you observed as a function of p, N, Nh. b) Compute the maximum likelihood estimator (MLE) of p given such a sequence. Is the result familiar/expected? c) Compute the average of your estimate and the mean square error from the true unknown value p. What do you conclude about your estimate as N → ∞?
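For parts b) and c), the MLE works out to the empirical frequency p̂ = Nh / N, which is unbiased with mean square error p(1 − p)/N. A small simulation sketch consistent with those claims (the parameter values p = 0.3, N = 1000 are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(2)
p, N, trials = 0.3, 1000, 20_000

# Each trial: throw the coin N times, record Nh, and form the MLE Nh / N.
heads = rng.binomial(N, p, size=trials)
p_hat = heads / N

mean_est = p_hat.mean()           # should approximate p (unbiased estimator)
mse = np.mean((p_hat - p) ** 2)   # should approximate p * (1 - p) / N
```

Since the MSE is p(1 − p)/N, it vanishes as N → ∞, i.e. the estimator is consistent.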

Problem 4: Consider the random data {x1, …, xN} and assume that they are Markov related as follows

x_n = a x_{n-1} + w_n,  n = 2, …, N,

where |a| < 1, {w_n} are independent and identically distributed Gaussian random variables with mean 0 and variance 1, independent of x1. Random variable x1 is also Gaussian with mean 0 and variance σ₁². a) Using the fact that a linear combination of Gaussians is also Gaussian, show that all x_n are Gaussian.

b) Find the joint probability density function of {x1, …, xN} assuming a is given. c) Find the maximum likelihood estimator (MLE) of the parameter a. Hint: For b), use the fact that {x1, x2 − a x1, …, xN − a x_{N-1}} are independent Gaussians.
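Assuming the answer to c) is the least-squares form â = (Σ_{n=2}^N x_n x_{n−1}) / (Σ_{n=2}^N x_{n−1}²) — which is what the hinted factorization yields when x1's density does not depend on a — a quick simulation sketch can check consistency (a = 0.7, N, and the choice x1 ~ N(0, 1) are mine):

```python
import numpy as np

rng = np.random.default_rng(3)
a, N = 0.7, 100_000

# Generate the Markov chain x_n = a * x_{n-1} + w_n with w_n ~ N(0, 1).
w = rng.standard_normal(N)
x = np.empty(N)
x[0] = rng.standard_normal()      # x1 ~ N(0, 1), an arbitrary demo choice
for n in range(1, N):
    x[n] = a * x[n - 1] + w[n]

# Candidate MLE: ratio of lag-1 cross products to lagged energy.
a_hat = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
```

With N this large, â lands close to the true a, as the MLE's consistency suggests.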

Problem 5: Assume that you have two hypotheses H0, H1. Under hypothesis H0 your observation vector X = [x1, …, xN]^T is Gaussian with mean vector m0 and covariance matrix C0, while under H1 it is again Gaussian with mean m1 and covariance matrix C1. Assume that the two hypotheses are equiprobable (P(H0) = P(H1) = 0.5). a) Find the likelihood ratio test (LRT) and its equivalent form obtained by taking the logarithm on both sides of the inequality. b) To what does the latter form reduce when the two covariance matrices are equal (C0 = C1)? c) What if, additionally, the two means are equal? Does this make sense?
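For part b): expanding the quadratic forms in the log-LRT with C0 = C1 = C cancels the xᵀC⁻¹x terms, leaving a statistic linear in x, (m1 − m0)ᵀC⁻¹x, compared against ½(m1 − m0)ᵀC⁻¹(m0 + m1) for equiprobable hypotheses. A hedged numeric verification of that algebra (the dimension, means, and covariance are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
d = 3
m0, m1 = np.zeros(d), np.array([1.0, -0.5, 2.0])
A = rng.standard_normal((d, d))
C = A @ A.T + d * np.eye(d)       # an arbitrary symmetric positive-definite C
Ci = np.linalg.inv(C)

def log_gauss(x, m, C, Ci):
    # log pdf of a multivariate Gaussian N(m, C) evaluated at x
    _, logdet = np.linalg.slogdet(C)
    r = x - m
    return -0.5 * (len(x) * np.log(2 * np.pi) + logdet + r @ Ci @ r)

x = rng.standard_normal(d)
llr = log_gauss(x, m1, C, Ci) - log_gauss(x, m0, C, Ci)

# The claimed equal-covariance reduction: linear in x, quadratic terms gone.
linear = (m1 - m0) @ Ci @ x - 0.5 * (m1 - m0) @ Ci @ (m0 + m1)
```

The two quantities agree to machine precision, confirming that with equal covariances the log-LRT collapses to a linear test; and if additionally m0 = m1, the statistic is identically 0, so no observation can separate the hypotheses — they are the same distribution.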
