CS 260: Machine Learning - Homework 5 - Spring 2020

The following questions are from Understanding Machine Learning: From Theory to Algorithms by Shai Shalev-Shwartz and Shai Ben-David. The book is available at http://www.cs.huji.ac.il/~shais/UnderstandingMachineLearning/ courtesy of the authors.

In the following two problems, you will implement linear and kernel support vector machines using stochastic gradient descent. Please report your answers to the following questions on Gradescope, and submit your source code to CCLE. Your answers will NOT be graded if we do not see your source code submission on CCLE. Note that you are allowed to use other programming languages; if so, you need to write a CSV data loader and read the data from ./data/*.csv, as in the sketch below.
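
If you do write your own loader, a minimal sketch is shown below. It assumes each CSV file stores one sample per row with the label in the last column, and the file names train.csv and test.csv are hypothetical; adapt both to the actual files under ./data/.

```python
import numpy as np

def load_csv(path):
    # Read a comma-separated file into features X and labels y,
    # assuming every column except the last is a feature and the
    # last column is the label (this layout is an assumption).
    data = np.loadtxt(path, delimiter=",")
    return data[:, :-1], data[:, -1]

# Hypothetical usage; the actual file names under ./data/ may differ:
# X_train, y_train = load_csv("./data/train.csv")
# X_test,  y_test  = load_csv("./data/test.csv")
```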

  1. (15 Points) Implement the SGD algorithm for solving the optimization problem of Soft-SVM with the objective function

$$J(\mathbf{w}) = \frac{\lambda}{2}\|\mathbf{w}\|^2 + \frac{1}{m}\sum_{i=1}^{m}\max\{0,\ 1 - y_i\langle\mathbf{w},\mathbf{x}_i\rangle\} \qquad (1)$$

  • Run the skeleton code and report the training and testing error obtained by the Linear SVM model implemented in scikit-learn.
  • Implement the SGD algorithm for solving Soft-SVM. Replace lines 119-121 of the skeleton code with your implementation (a sketch of such a loop is given after this problem).

The objective function (Eq. (1)) has been implemented in the function LinearSVM.objective. In this problem, we will keep a running average of the updated weights, namely

$$\bar{\mathbf{w}}^{(t)} = \frac{1}{t}\sum_{s=1}^{t}\mathbf{w}^{(s)}.$$

Plot the objective value $J(\bar{\mathbf{w}}^{(t)})$ with respect to the number of iterations t during training. Note that the optimal values of the hyperparameter $\lambda$ (in Eq. (1)) and of T (the maximum number of SGD iterations) may vary with the implementation, so you may need to tune them to reach good performance.

  • Report the training error and testing error on the dataset with the hyperparameters $\lambda$ and T you used.
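
The following is a minimal sketch of the SGD loop requested above, not the skeleton's implementation: it assumes the objective in Eq. (1), a step size of $1/(\lambda t)$, and hypothetical names such as soft_svm_sgd, lambda_, and w_avg; the interface of the skeleton's LinearSVM class will differ.

```python
import numpy as np

def soft_svm_sgd(X, y, lambda_=1e-3, T=10000, seed=0):
    # SGD on Eq. (1): at each step, sample one example and take a
    # subgradient step; keep a running average of the iterates.
    rng = np.random.default_rng(seed)
    m, d = X.shape
    w = np.zeros(d)
    w_avg = np.zeros(d)              # running average of the weights
    objectives = []
    for t in range(1, T + 1):
        i = rng.integers(m)          # pick one sample uniformly at random
        eta = 1.0 / (lambda_ * t)    # assumed step size schedule
        margin = y[i] * (X[i] @ w)
        grad = lambda_ * w           # subgradient of the regularizer
        if margin < 1:               # hinge loss is active on sample i
            grad -= y[i] * X[i]
        w = w - eta * grad
        w_avg = ((t - 1) * w_avg + w) / t
        # objective value J(w_avg) from Eq. (1), recorded for the plot
        hinge = np.maximum(0.0, 1.0 - y * (X @ w_avg)).mean()
        objectives.append(0.5 * lambda_ * np.dot(w_avg, w_avg) + hinge)
    return w_avg, objectives
```
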
  2. (15 Points) Implement the SGD algorithm for solving the Soft-SVM optimization problem with an RBF kernel. In this problem, we focus on the objective function of the dual Soft-SVM problem,

$$\max_{\boldsymbol{\alpha}}\ \tilde{J}(\boldsymbol{\alpha}) = \sum_{i=1}^{m}\alpha_i - \frac{1}{2}\sum_{i=1}^{m}\sum_{j=1}^{m}\alpha_i\alpha_j y_i y_j K(\mathbf{x}_i,\mathbf{x}_j) \qquad (2)$$

where the RBF kernel is defined as

$$K(\mathbf{x}, \mathbf{x}') = \exp\left(-\frac{\|\mathbf{x} - \mathbf{x}'\|^2}{2\sigma^2}\right).$$

  • Run the skeleton code and report the training and testing error obtained by the Soft-SVM model with an RBF kernel implemented in scikit-learn.
  • Implement the RBF kernel. That is, given two sets of samples $X \in \mathbb{R}^{m_1 \times d}$ and $Y \in \mathbb{R}^{m_2 \times d}$, where $d$ is the number of dimensions and $m_1$, $m_2$ are the numbers of samples in $X$ and $Y$ respectively, the function RBF.__call__ returns a matrix $K \in \mathbb{R}^{m_1 \times m_2}$ with each element $K_{ij} = K(\mathbf{x}_i, \mathbf{y}_j)$. Replace lines 49-51 in the skeleton code with your implementation (see the kernel sketch after this problem).

  • Implement the SGD algorithm for solving the kernelized Soft-SVM. Replace lines 230-232 of the skeleton code with your implementation (see the dual SGD sketch after this problem).

The objective function of the dual SVM (Eq. (2)) has been implemented in the function RBFSVM.dual_objective. In this problem, we will keep a running average of the updated alphas,

$$\bar{\boldsymbol{\alpha}}^{(t)} = \frac{1}{t}\sum_{s=1}^{t}\boldsymbol{\alpha}^{(s)}.$$

Plot the dual objective value $\tilde{J}(\bar{\boldsymbol{\alpha}}^{(t)})$ with respect to the number of iterations t during training. Note that the optimal values of the hyperparameters $\sigma$ (for the RBF kernel), T (the maximum number of SGD iterations), and $\lambda$ (in Eq. (1)) may vary with the implementation, so you may need to tune them to reach good performance.

  • Report the training and testing error on the dataset with the hyperparameters $\sigma$, $\lambda$, and T you used.
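
For the kernel bullet above, a minimal sketch of the kernel-matrix computation is given below, assuming the definition $K(\mathbf{x}, \mathbf{x}') = \exp(-\|\mathbf{x} - \mathbf{x}'\|^2 / (2\sigma^2))$; the function name rbf_kernel and the sigma argument are assumptions, and the skeleton's RBF.__call__ may use a different parameterization (e.g. a gamma value).

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # X: (m1, d), Y: (m2, d); returns K of shape (m1, m2) with
    # K[i, j] = exp(-||x_i - y_j||^2 / (2 * sigma^2)).
    # Uses ||x - y||^2 = ||x||^2 + ||y||^2 - 2<x, y> to vectorize.
    sq_dists = (
        np.sum(X ** 2, axis=1)[:, None]
        + np.sum(Y ** 2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-sq_dists / (2.0 * sigma ** 2))
```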
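
And for the dual SGD bullet, the sketch below shows one possible update: projected stochastic gradient ascent on Eq. (2), one randomly chosen coordinate of alpha at a time, with a generic box constraint [0, C]. The constant C, the fixed step size eta, and the name dual_svm_sgd are assumptions; the skeleton's RBFSVM (lines 230-232) may tie the constraint to $\lambda$ from Eq. (1) and use a different update rule.

```python
import numpy as np

def dual_svm_sgd(K, y, C=1.0, T=10000, eta=0.01, seed=0):
    # K: precomputed (m, m) kernel matrix on the training set,
    # y: labels in {-1, +1}. Maximizes Eq. (2) by projected SGD.
    rng = np.random.default_rng(seed)
    m = K.shape[0]
    alpha = np.zeros(m)
    alpha_avg = np.zeros(m)          # running average of the dual iterates
    objectives = []
    for t in range(1, T + 1):
        i = rng.integers(m)          # pick one dual coordinate at random
        # partial derivative of Eq. (2) with respect to alpha_i
        grad_i = 1.0 - y[i] * np.sum(alpha * y * K[i])
        # gradient ascent step, projected back onto the box [0, C]
        alpha[i] = np.clip(alpha[i] + eta * grad_i, 0.0, C)
        alpha_avg = ((t - 1) * alpha_avg + alpha) / t
        # dual objective value of the averaged iterate, recorded for the plot
        obj = alpha_avg.sum() - 0.5 * (alpha_avg * y) @ K @ (alpha_avg * y)
        objectives.append(obj)
    return alpha_avg, objectives
```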
