1 Objective Question
Consider a K-class classification problem. The total number of binary classifiers trained in a one-vs-all setting is ______, and in a one-vs-one setting is ______.
2 Objective Question
Which of the following methods do we use to best fit the data in Logistic Regression?
- A. Least Square Error
- B. Maximum Likelihood
- C. Jaccard Distance
- D. Both A and B
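For intuition, here is a minimal NumPy sketch of what maximum-likelihood fitting of a logistic regression model looks like (batch gradient ascent on the average log-likelihood; the function names, learning rate, and iteration count are illustrative and not part of the assignment):

```python
# A minimal sketch (not the course solution): fit logistic regression by
# maximising the log-likelihood with batch gradient ascent, using NumPy only.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic_mle(X, y, lr=0.1, n_iters=1000):
    """Maximum-likelihood fit: theta maximises sum y*log(h) + (1-y)*log(1-h)."""
    X = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend a bias column
    theta = np.zeros(X.shape[1])
    for _ in range(n_iters):
        h = sigmoid(X @ theta)
        grad = X.T @ (y - h) / len(y)   # gradient of the average log-likelihood
        theta += lr * grad              # ascend the likelihood
    return theta
```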
3 Programming Question
In the tutorial, we saw how to build a multi-class classifier in a one-vs-all setting. In this problem you are required to construct a one-vs-one classifier for the same problem. The starter code is provided in Logistic Regression Excercise 1.ipynb.
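A minimal sketch of the one-vs-one construction is given below, assuming scikit-learn's LogisticRegression as the binary learner; the starter notebook may instead expect the binary classifier built in the tutorial, so treat this only as an outline of the pairwise-training-plus-voting idea:

```python
# A minimal sketch of one-vs-one classification, independent of the starter
# notebook. Assumes scikit-learn is available as the binary learner.
from itertools import combinations
import numpy as np
from sklearn.linear_model import LogisticRegression

def fit_one_vs_one(X, y):
    """Train one binary classifier per pair of classes: K(K-1)/2 models."""
    models = {}
    for a, b in combinations(np.unique(y), 2):
        mask = (y == a) | (y == b)          # keep only the two classes
        clf = LogisticRegression(max_iter=1000)
        clf.fit(X[mask], y[mask])
        models[(a, b)] = clf
    return models

def predict_one_vs_one(models, X):
    """Each pairwise model votes; the class with the most votes wins."""
    classes = sorted({c for pair in models for c in pair})
    index = {c: i for i, c in enumerate(classes)}
    votes = np.zeros((X.shape[0], len(classes)), dtype=int)
    for (a, b), clf in models.items():
        pred = clf.predict(X)
        for c in (a, b):
            votes[pred == c, index[c]] += 1
    return np.array(classes)[votes.argmax(axis=1)]
```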
4 Subjective Question
Suppose you train a logistic regression classifier and your hypothesis function $h$ is
$$h(x) = g(\theta_0 + \theta_1 x_1 + \theta_2 x_2),$$
where
$$\theta_0 = 6, \qquad \theta_1 = 0, \qquad \theta_2 = 1.$$
Draw the decision boundary for the given classifier (a rough sketch is sufficient). What would happen to the decision boundary if you swap the coefficients of $x_1$ and $x_2$? Draw the decision boundary for the second case as well.
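As a reminder of where the boundary comes from (with $g$ the sigmoid): the classifier predicts $y = 1$ exactly when the sigmoid's argument is non-negative, so the decision boundary is the line on which that argument vanishes:
$$g(z) = \frac{1}{1 + e^{-z}}, \qquad h(x) \ge \tfrac{1}{2} \iff \theta_0 + \theta_1 x_1 + \theta_2 x_2 \ge 0.$$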
5 Programming Question
You have been given a dataset of handwritten digits, the MNIST dataset. The dataset consists of 28×28 pixel images, each showing one of the 10 handwritten digits (0, 1, …, 9). Using logistic regression in the one-vs-one and one-vs-all settings, construct a 10-class classifier. Report the test accuracy for the one-vs-one and one-vs-all classifiers. The starter code is provided in Logistic Regression Excercise 2.ipynb.
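A minimal sketch of the evaluation loop, assuming scikit-learn and its OpenML copy of MNIST ("mnist_784"); the starter notebook may load the data differently and may require your own logistic-regression implementation rather than the built-in wrappers:

```python
# A minimal sketch, assuming scikit-learn and the OpenML copy of MNIST
# ("mnist_784"); the starter notebook may supply the data its own way.
from sklearn.datasets import fetch_openml
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier

X, y = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)
X = X / 255.0                                   # scale pixels to [0, 1]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

base = LogisticRegression(max_iter=200)
for name, wrapper in [("one-vs-all", OneVsRestClassifier(base)),
                      ("one-vs-one", OneVsOneClassifier(base))]:
    wrapper.fit(X_train, y_train)               # each wrapper clones `base`
    print(f"{name} test accuracy: {wrapper.score(X_test, y_test):.4f}")
```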