CSE 802 Pattern Recognition and Analysis - Homework 2

  1. Consider a set of 1-dimensional feature values (i.e., points) pertaining to a class that is available here.
    • [3 points] Plot the histogram of these points using a bin size of 2, i.e., each bin should have a range of 2.
    • [4 points] Compute and report the mean and the biased variance of these points.
    • [3 points] Assuming that the given points are generated by an underlying Gaussian distribution, plot the corresponding Gaussian pdf on the same graph as (a). (A plotting sketch follows this list.)
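A minimal Python sketch for this problem; the filename points.txt is an assumption, since the actual data file comes from the assignment's download link:

```python
# Sketch for Problem 1. ASSUMPTION: the 1-D points are stored in
# "points.txt", one value per line (substitute the real filename).
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm

x = np.loadtxt("points.txt")

mean = x.mean()                 # (b) sample mean
var = x.var(ddof=0)             # (b) biased variance (divides by n)
print(f"mean = {mean:.4f}, biased variance = {var:.4f}")

# (a) Histogram with bin width 2; density=True puts it on the same
# scale as the pdf overlay.
edges = np.arange(np.floor(x.min()), np.ceil(x.max()) + 2, 2)
plt.hist(x, bins=edges, density=True, alpha=0.5, label="histogram (bin width 2)")

# (c) Gaussian pdf with the estimated parameters.
grid = np.linspace(x.min() - 2, x.max() + 2, 400)
plt.plot(grid, norm.pdf(grid, mean, np.sqrt(var)), label="fitted Gaussian pdf")
plt.legend()
plt.show()
```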
  2. Consider the three-dimensional normal distribution p(x) ~ N(μ, Σ), where μ = (1, 1, 1)^t and

Σ = | 1 0 0 |
    | 0 5 2 |
    | 0 2 5 |.
  • [2 points] Compute and report the determinant of the covariance matrix, i.e., |Σ|.
  • [2 points] Compute and report the inverse of the covariance matrix, i.e., Σ⁻¹.
  • [3 points] Compute and report the eigenvectors and eigenvalues of the covariance matrix.
  • [3 points] Compute and report the probability density at (0, 0, 0)^t and at (5, 5, 5)^t.
  • [2 points] Compute the Euclidean distance between μ and the point (5, 5, 5)^t.
  • [3 points] Compute the Mahalanobis distance between μ and the point (5, 5, 5)^t. (A computational sketch follows this list.)
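All six quantities can be checked with a few lines of NumPy/SciPy; a sketch:

```python
# Sketch for Problem 2 using the mean and covariance given above.
import numpy as np
from scipy.stats import multivariate_normal

mu = np.array([1.0, 1.0, 1.0])
Sigma = np.array([[1.0, 0.0, 0.0],
                  [0.0, 5.0, 2.0],
                  [0.0, 2.0, 5.0]])

print("det(Sigma) =", np.linalg.det(Sigma))        # (a) determinant
print("inv(Sigma) =\n", np.linalg.inv(Sigma))      # (b) inverse
vals, vecs = np.linalg.eigh(Sigma)                 # (c) eigh: Sigma is symmetric
print("eigenvalues =", vals)
print("eigenvectors (columns) =\n", vecs)

rv = multivariate_normal(mean=mu, cov=Sigma)
for p in ([0, 0, 0], [5, 5, 5]):                   # (d) density values
    print("pdf at", p, "=", rv.pdf(p))

d = np.array([5.0, 5.0, 5.0]) - mu
print("Euclidean distance   =", np.linalg.norm(d))                       # (e)
print("Mahalanobis distance =", np.sqrt(d @ np.linalg.inv(Sigma) @ d))   # (f)
```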
  3. [10 points] Suppose we have two normal distributions, corresponding to two classes (ω1 and ω2), with the same covariance but different means: N(μ1, Σ) and N(μ2, Σ). In terms of their prior probabilities P(ω1) and P(ω2), state the condition under which the Bayes decision boundary does not pass between the two means.
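One way to build intuition is the 1-D special case, where the boundary can be written in closed form and slid around by changing the priors. The means, variance, and priors below are illustrative assumptions, not values from the problem:

```python
# 1-D exploration for Problem 3: the Bayes boundary between
# P1*N(mu1, s2) and P2*N(mu2, s2) sits where the posteriors are equal:
#   x* = (mu1 + mu2)/2 + s2/(mu1 - mu2) * ln(P2/P1).
import numpy as np

mu1, mu2, s2 = 0.0, 2.0, 1.0              # illustrative values
for P1 in (0.5, 0.7, 0.9, 0.99):
    P2 = 1.0 - P1
    xstar = 0.5 * (mu1 + mu2) + s2 / (mu1 - mu2) * np.log(P2 / P1)
    print(f"P(w1)={P1:.2f}: x* = {xstar:+.3f}, "
          f"between the means: {mu1 < xstar < mu2}")
```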
  4. [10 points] Consider a two-class problem with the following class-conditional probability density functions (pdfs):

p(x | ω1) ~ N(0, σ²) and p(x | ω2) ~ N(1, σ²).

Show that the threshold, θ, corresponding to the Bayes decision boundary is

θ = 1/2 + σ² ln( λ21 P(ω1) / (λ12 P(ω2)) ),

where we have assumed that λ11 = λ22 = 0.
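The claimed threshold can be sanity-checked numerically by locating the point where the two conditional risks cross; the values of σ, the priors, and the losses below are illustrative assumptions:

```python
# Numerical check of the Problem 4 threshold.
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

sigma, P1, P2 = 1.0, 0.6, 0.4             # illustrative values
l12, l21 = 1.0, 2.0                       # l11 = l22 = 0, per the problem

def risk_diff(x):
    # R(alpha1|x) - R(alpha2|x), up to the positive factor p(x)
    return l12 * P2 * norm.pdf(x, 1, sigma) - l21 * P1 * norm.pdf(x, 0, sigma)

theta_formula = 0.5 + sigma**2 * np.log((l21 * P1) / (l12 * P2))
theta_numeric = brentq(risk_diff, -10, 10)     # root of the risk difference
print(f"formula: {theta_formula:.6f}, numeric: {theta_numeric:.6f}")
```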

  5. [15 points] Consider a two-category classification problem involving one feature, x. Let both class-conditional densities conform to a Cauchy distribution as follows:

p(x | ωi) = (1/(πb)) · 1 / (1 + ((x − ai)/b)²),  i = 1, 2.

Assume a 0-1 loss function and equal priors.

  • Compute the Bayes decision boundary.
  • Show that the probability of misclassification according to the Bayes decision rule is

P(error) = 1/2 − (1/π) tan⁻¹( |a2 − a1| / (2b) ).

  • Plot P(error) as a function of |a2 − a1|/b. (A plotting sketch follows this list.)
  • What is the maximum value of P(error)? Under what conditions will it occur?
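A short plotting sketch for part (c), writing r = |a2 − a1|/b and using the expression from part (b):

```python
# P(error) = 1/2 - (1/pi) * arctan(r / 2), with r = |a2 - a1| / b.
import numpy as np
import matplotlib.pyplot as plt

r = np.linspace(0, 20, 400)
p_err = 0.5 - np.arctan(r / 2.0) / np.pi

plt.plot(r, p_err)
plt.xlabel("|a2 - a1| / b")
plt.ylabel("P(error)")
plt.title("Bayes error for two Cauchy class-conditional densities")
plt.show()
# Note how the curve approaches 1/2 as r -> 0 and falls toward 0 as
# the two densities separate.
```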
  6. [10 points] Consider the following class-conditional densities for a three-class problem involving two-dimensional features:

p(x | ω1) ~ N((1, 1)^t, I);  p(x | ω2) ~ N((−1, −1)^t, I);
p(x | ω3) = (1/2) N((0.5, 0.5)^t, I) + (1/2) N(…, I).

(Here, class ω3 conforms to a Gaussian Mixture Model (GMM) with two components: one component is N((0.5, 0.5)^t, I) and the other is N(…, I), with equal weights (i.e., 1/2 each).)

  • In a 2D graph, mark the means of ω1, ω2, and of the two components of ω3. In the same graph, mark the point x = (0.1, 0.1)^t.
  • Assuming a 0-1 loss function and equal priors, determine the class to which you will assign the two-dimensional point x = (0.1, 0.1)^t based on the Bayes decision rule. (A computational sketch follows this list.)
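A computational sketch for part (b). Two loud assumptions: the mean (−1, −1)^t for ω2 is reconstructed from context, and the mean of ω3's second mixture component is missing from this copy, so (−0.5, −0.5)^t below is only a symmetric placeholder; substitute the values from the original handout:

```python
# Bayes rule at x = (0.1, 0.1)^t with equal priors and 0-1 loss:
# pick the class with the largest class-conditional density.
import numpy as np
from scipy.stats import multivariate_normal

x = np.array([0.1, 0.1])
I = np.eye(2)

p1 = multivariate_normal([1.0, 1.0], I).pdf(x)
p2 = multivariate_normal([-1.0, -1.0], I).pdf(x)        # reconstructed mean
p3 = 0.5 * multivariate_normal([0.5, 0.5], I).pdf(x) \
   + 0.5 * multivariate_normal([-0.5, -0.5], I).pdf(x)  # ASSUMED component mean

likelihoods = [p1, p2, p3]
print("p(x|w1), p(x|w2), p(x|w3):", likelihoods)
print("assign to class:", 1 + int(np.argmax(likelihoods)))
```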
  7. Consider the following bivariate density function for the random variable x:

p_original(x) ~ N( (0, 0)^t, Σ ),  where

Σ = | 20 10 |
    | 10 30 |.

  • [5 points] What is the whitening transform, Aw, of x? (You can use MATLAB or any other programming language to compute the eigenvectors/eigenvalues.)
  • [3 points] When the whitening transformation is applied to x, what is the density function, p_transform, of the resulting random variable?
  • [5 points] Generate 10,000 bivariate random patterns from p_original (if you are using MATLAB, the mvnrnd function can be used to generate these patterns). Plot these patterns in a graph.
  • [5 points] Apply the whitening transform, Aw, to the 10,000 bivariate patterns generated above. Plot the transformed patterns in a separate graph.
  • [2 points] Compare the patterns in 7(c) and 7(d). What do you observe? (A whitening sketch follows this list.)
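A sketch covering all four parts, using the standard eigendecomposition form of the whitening transform, Aw = Φ Λ^(-1/2):

```python
# Whitening sketch for Problem 7: y = Aw^t x maps N(0, Sigma) to N(0, I).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
Sigma = np.array([[20.0, 10.0],
                  [10.0, 30.0]])

vals, Phi = np.linalg.eigh(Sigma)        # Sigma = Phi diag(vals) Phi^t
Aw = Phi @ np.diag(vals ** -0.5)         # (a) whitening transform
print("Aw =\n", Aw)

X = rng.multivariate_normal([0, 0], Sigma, size=10_000)   # (c) samples
Y = X @ Aw                               # (d) rows are (Aw^t x)^t
print("sample covariance after whitening:\n", np.round(np.cov(Y.T), 3))

fig, axes = plt.subplots(1, 2, figsize=(10, 5))
axes[0].scatter(X[:, 0], X[:, 1], s=2); axes[0].set_title("original")
axes[1].scatter(Y[:, 0], Y[:, 1], s=2); axes[1].set_title("whitened")
for ax in axes:
    ax.set_aspect("equal")
plt.show()
```

The printed sample covariance should be close to the identity, which is one way to phrase the comparison asked for in the last part.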
  8. Consider a two-category (ω1 and ω2) classification problem with equal priors. Each feature is a two-dimensional vector x = (x1, x2)^t. The class-conditional densities are:

p(x | ω1) ~ N(μ1 = (0, 0)^t, Σ1 = 2I),  p(x | ω2) ~ N(μ2 = (2, 2)^t, Σ2 = I).

  • [7 points] Compute the Bayes decision rule and the Bayes decision boundary.
  • [3 points] Generate 10,000 bivariate random patterns from each of the two densities (if you are using MATLAB, the mvnrnd function can be used to generate these patterns). Plot these patterns in a graph using different markers to distinguish the two classes. On the same graph, plot the Bayes decision boundary.
  • [5 points] Compute and report the empirical error rate and the confusion matrix when classifying these patterns using the Bayes decision rule. (A simulation sketch follows this list.)
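A simulation sketch for parts (b) and (c); since Σ1 = 2I and Σ2 = I differ, the Bayes discriminant is quadratic, and the code below compares the two log class-conditional densities directly rather than plotting the boundary:

```python
# Problem 8 simulation: equal priors, 0-1 loss.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)
n = 10_000
mu1, S1 = np.array([0.0, 0.0]), 2 * np.eye(2)
mu2, S2 = np.array([2.0, 2.0]), np.eye(2)

X = np.vstack([rng.multivariate_normal(mu1, S1, n),
               rng.multivariate_normal(mu2, S2, n)])
y = np.repeat([0, 1], n)                          # true labels

g1 = multivariate_normal(mu1, S1).logpdf(X)       # log p(x|w1)
g2 = multivariate_normal(mu2, S2).logpdf(X)       # log p(x|w2)
yhat = (g2 > g1).astype(int)                      # Bayes rule, equal priors

conf = np.array([[np.sum((y == i) & (yhat == j)) for j in (0, 1)]
                 for i in (0, 1)])
print("confusion matrix (rows = true, cols = predicted):\n", conf)
print("empirical error rate:", (yhat != y).mean())
```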
  9. Consider a two-category classification problem involving two-dimensional feature vectors of the form x = (x1, x2)^t. The two categories are ω1 and ω2, and

p(x | ω1) ~ N((1, 1)^t, I),  p(x | ω2) ~ N((−1, −1)^t, I),

P(ω1) = P(ω2) = 1/2.

  • [10 points] Calculate the Bayes decision boundary and write down the Bayes decision rule assuming a 0-1 loss function. The Bayes decision rule has to be written in terms of the Bayes decision boundary.
  • [5 points] What are the Bhattacharyya and Chernoff theoretical bounds on the probability of misclassification, P(error)?
  • [5 points] Generate n = 25 test patterns from each of the two class-conditional densities and plot them in a two-dimensional feature space using different markers for the two categories (if you are using MATLAB, the mvnrnd function can be used to generate these patterns). Draw the Bayes decision boundary on this plot for visualization purposes.
  • [5 points] What is the confusion matrix and empirical error rate when classifying the generated patterns using the Bayes decision rule?
  • [5 points] Can the empirical error rate exceed the theoretical bounds on the probability of misclassification? Why or why not? (Hint: compute the empirical error rate multiple times by repeatedly generating n = 25 test patterns from each of the two class-conditional densities. A simulation sketch follows this list.)
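A simulation sketch under the reconstructed means (1, 1)^t and (−1, −1)^t and the equal priors assumed above; with equal identity covariances the Bayes rule reduces to assigning each point to the nearer mean:

```python
# Problem 9: Bhattacharyya bound plus repeated empirical error rates.
import numpy as np

rng = np.random.default_rng(2)
mu1, mu2 = np.array([1.0, 1.0]), np.array([-1.0, -1.0])
P1 = 0.5

# With Sigma1 = Sigma2 = I, the Bhattacharyya exponent reduces to
# k(1/2) = (1/8) * ||mu2 - mu1||^2.
k = 0.125 * np.sum((mu2 - mu1) ** 2)
print("Bhattacharyya bound on P(error):", np.sqrt(P1 * (1 - P1)) * np.exp(-k))

for trial in range(5):                     # repeat to see the variability
    X = np.vstack([rng.multivariate_normal(mu1, np.eye(2), 25),
                   rng.multivariate_normal(mu2, np.eye(2), 25)])
    y = np.repeat([0, 1], 25)
    yhat = (np.linalg.norm(X - mu2, axis=1)
            < np.linalg.norm(X - mu1, axis=1)).astype(int)
    print(f"trial {trial}: empirical error = {(yhat != y).mean():.3f}")
```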
  10. [15 points] Consider a 1-dimensional classification problem involving two categories ω1 and ω2 such that P(ω1) = 2/3 and P(ω2) = 1/3. Assume that the classification process can result in one of three actions:

α1: choose ω1;  α2: choose ω2;  α3: do not classify.

Consider the following loss function, λ:

λ(α1|ω1) = λ(α2|ω2) = 0;  λ(α2|ω1) = λ(α1|ω2) = 1;  λ(α3|ω1) = λ(α3|ω2) = 1/4.

For a given feature value x, assume that p(x|ω1) = (2 − x)/2 and p(x|ω2) = 1/2. Here, 0 ≤ x ≤ 2.

Based on the Bayes minimum risk rule, what action will be undertaken when encountering the value x = 0.5?
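A worked check of the minimum-risk computation at x = 0.5, using the densities and losses as reconstructed above:

```python
# Problem 10: compare R(alpha_i|x) = sum_j lambda(alpha_i|w_j) P(w_j|x).
import numpy as np

x = 0.5
P = np.array([2/3, 1/3])                 # priors P(w1), P(w2)
px_w = np.array([(2 - x) / 2, 0.5])      # p(x|w1), p(x|w2)
post = P * px_w / np.sum(P * px_w)       # posteriors P(w_j|x)

lam = np.array([[0.00, 1.00],            # rows: actions a1, a2, a3
                [1.00, 0.00],            # cols: true classes w1, w2
                [0.25, 0.25]])
risks = lam @ post
print("posteriors:", post)
print("conditional risks R(a1), R(a2), R(a3):", risks)
# np.argmin breaks any tie toward the lower-indexed action.
print("minimum-risk action: a" + str(1 + int(np.argmin(risks))))
```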
