IE343 Final Project - Preliminary and Binary Kernel Logistic Regression


1 Preliminary and Binary Kernel Logistic Regression

In the midterm project, we implemented logistic regression for binary and multi-label classification. In this final project, we extend logistic regression to kernel logistic regression.

Let $N$ and $d$ denote the total number of training data points and the dimension of each data input, respectively, and let $\{(X^{(i)}, y^{(i)})\}_{i=1,2,\dots,N}$ be the training data set. In binary logistic regression, we train the parameter $\theta = (\theta_0, \theta_1, \dots, \theta_d)$, where the model is

$$P(y = 1 \mid X) = \frac{1}{1 + \exp(-\langle \theta, \mathrm{concatenate}(1, X) \rangle)}. \tag{2.1}$$

Here, $\mathrm{concatenate}(1, X) := (1, X_1, X_2, \dots, X_d)$ and $\langle \cdot, \cdot \rangle$ denotes the inner product.
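As a minimal sketch (assuming NumPy arrays and labels $y \in \{0, 1\}$; this is not the project's required class layout), model (2.1) can be evaluated as:

```python
import numpy as np

def predict_proba(theta, X):
    """Evaluate P(y=1 | X) from (2.1); theta has length d+1, X has shape (N, d)."""
    X_aug = np.hstack([np.ones((X.shape[0], 1)), X])   # concatenate(1, X) row-wise
    z = np.clip(X_aug @ theta, -500, 500)              # avoid overflow in exp
    return 1.0 / (1.0 + np.exp(-z))
```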

Let us review the SVM. To escape the non-linearly separable situation, a mapping function $\phi$ was defined which increases the dimension of the data's input space. For the hyperplane $\theta_0 + \langle \tilde{\theta}, \phi(X) \rangle$, the optimal weight is $\tilde{\theta} = \sum_{j=1}^{N} \alpha_j y^{(j)} \phi(X^{(j)})$, where the optimal dual variables $\{\alpha_j\}_{j=1,2,\dots,N}$ are obtained by solving a quadratic program.

And from the mapping function $\phi$, the kernel function $K$ was defined by

$$K(X^{(i)}, X^{(j)}) = \langle \phi(X^{(i)}), \phi(X^{(j)}) \rangle. \tag{2.2}$$
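The provided ./App/rbf.py is not reproduced here, but a common RBF kernel of the form $K(x, x') = \exp(-\gamma \lVert x - x' \rVert^2)$ can be sketched as follows (the function name and the single `gamma` hyperparameter are my assumptions, not necessarily the provided file's interface):

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """RBF kernel matrix with K[i, j] = exp(-gamma * ||X1[i] - X2[j]||^2)."""
    sq_dists = (np.sum(X1**2, axis=1)[:, None]
                + np.sum(X2**2, axis=1)[None, :]
                - 2.0 * X1 @ X2.T)
    return np.exp(-gamma * np.maximum(sq_dists, 0.0))  # clamp tiny negatives from rounding
```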

Now, let us construct kernel logistic regression from this background. For binary kernel logistic regression, the model can be built as

$$P(y = 1 \mid X = (X_1, X_2, \dots, X_d)) = \frac{1}{1 + \exp(-\langle (\theta_0, \tilde{\theta}), \mathrm{concatenate}(1, \phi(X)) \rangle)} \tag{2.3}$$

$$= \frac{1}{1 + \exp(-(\theta_0 + \langle \tilde{\theta}, \phi(X) \rangle))} \tag{2.4}$$

$$= \frac{1}{1 + \exp\!\Big(-\big(\theta_0 + \textstyle\sum_{j=1}^{N} \alpha_j y^{(j)} \langle \phi(X^{(j)}), \phi(X) \rangle\big)\Big)} \tag{2.5}$$

$$= \frac{1}{1 + \exp\!\Big(-\big(\alpha_0 + \textstyle\sum_{j=1}^{N} \alpha_j y^{(j)} K(X^{(j)}, X)\big)\Big)}. \tag{2.6}$$

Here, (2.5) was derived by substituting $\tilde{\theta} = \sum_{j=1}^{N} \alpha_j y^{(j)} \phi(X^{(j)})$, and at (2.6) we redefine $\theta_0$ as $\alpha_0$ and apply the kernel definition (2.2).

In short, you need to implement the binary kernel logistic regression model

$$P(y = 1 \mid X) = \frac{1}{1 + \exp\!\Big(-\big(\alpha_0 + \textstyle\sum_{j=1}^{N} \alpha_j y^{(j)} K(X^{(j)}, X)\big)\Big)} \tag{2.7}$$

by training the parameter $\alpha = (\alpha_0, \alpha_1, \dots, \alpha_N)$.
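A sketch of evaluating (2.7) at one query point, assuming the dual coefficients are stored as NumPy arrays and `y_train` uses the SVM-style labels $\{-1, +1\}$ from the expansion above (all names here are illustrative):

```python
import numpy as np

def klr_predict_proba(alpha0, alpha, y_train, k_col):
    """Evaluate (2.7); k_col[j] = K(X^(j), X) for the query point X."""
    z = alpha0 + np.sum(alpha * y_train * k_col)
    return 1.0 / (1.0 + np.exp(-z))
```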

2 TODO for Code Implementation

There are two tasks for the code implementation: task1 and task2 are for binary logistic regression and binary kernel logistic regression, respectively. The task1 and task2 files are attached, and they are almost the same as the midterm project's mid_task2_titanic folder. The difference is that I provide the additional files ./App/Pre_processing/kernel.py and ./App/rbf.py. These come from assignment 4 and will be helpful for the final project. Unlike the midterm project, I have already filled in the getData() method in ./main.py. You need to complete the following for the code implementation.

  • (task1)

It is similar to the midterm project. You need to fill out ./App/logistic_regressor.py. The code should be implemented with the minus-log-likelihood loss function and the settings epoch_num=5000 and lr=0.00005. main.py needs to print 50 train accuracy results (print the train accuracy every 100 epochs) and one test accuracy. Lastly, you need to use the bias term $\theta_0$ of (2.1).

  • (task2)

You need to fill out ./App/logistic_regressor.py and ./main.py. The code should be implemented with the minus-log-likelihood loss function, RBF kernels, and the settings epoch_num=5000 and lr=0.005. The RBF kernels should use the hyperparameter (1,1,1,1,1). main.py needs to print 50 train accuracy results (print the train accuracy every 100 epochs) and one test accuracy. Lastly, you need to use the bias term $\alpha_0$ of (2.7).
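The required interface of ./App/logistic_regressor.py is not shown here, so the task1 training loop is sketched below as one self-contained function: gradient descent on the minus-log-likelihood with labels in {0, 1}, printing the train accuracy every 100 epochs (the function name and the per-sample averaging of the gradient are my own choices, not the project's specification):

```python
import numpy as np

def train_logistic_regression(X, y, epoch_num=5000, lr=0.00005):
    """Gradient descent on the minus-log-likelihood; y is a {0, 1} array."""
    X_aug = np.hstack([np.ones((X.shape[0], 1)), X])  # bias column, as in (2.1)
    theta = np.zeros(X_aug.shape[1])
    for epoch in range(1, epoch_num + 1):
        p = 1.0 / (1.0 + np.exp(-(X_aug @ theta)))
        grad = X_aug.T @ (p - y) / len(y)   # gradient of the averaged minus-log-likelihood
        theta -= lr * grad
        if epoch % 100 == 0:                # 5000 / 100 = 50 printed accuracies
            acc = np.mean((p >= 0.5) == y)
            print(f"epoch {epoch}: train accuracy {acc:.3f}")
    return theta
```

For task2 the same loop applies with the kernel feature vector $(1, y^{(1)}K(X^{(1)}, X), \dots, y^{(N)}K(X^{(N)}, X))$ in place of $\mathrm{concatenate}(1, X)$, training $\alpha$ instead of $\theta$.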


3 TODO for Report

  • Write the kernel logistic regression model and the logistic regression model for binary-label data. For those models, which parameters do we have to train? State the parameters' dimensions precisely.
  • What is the need for kernel logistic regression? That is, which issue that the original logistic regression cannot solve can kernel logistic regression resolve? State the reason precisely.
  • State the advantages and disadvantages of kernel logistic regression and the original logistic regression.
  • State how the weight update should happen for the original logistic regression and the kernel logistic regression in the binary version. You also need to write precisely how the gradient is derived.
  • Construct the models for the original logistic regression and the kernel logistic regression for multiple labels ($K$: number of labels $> 2$). You need to clarify the parameters.
  • For the model you constructed in (5) and the minus-log-likelihood loss function, write pseudocode for each weight-update situation. You also need to state the reason and the input values of each method precisely.
  • Describe the methods of your logistic regression model with the code of task1, line by line.
  • Describe the methods of your kernel logistic regression model and main.py with the code of task2, line by line.
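As a starting point for the weight-update item above, one standard derivation (assuming labels $y^{(i)} \in \{0, 1\}$ and $p_i := P(y = 1 \mid X^{(i)})$ from (2.1)) is:

```latex
L(\theta) = -\sum_{i=1}^{N}\Big[y^{(i)}\log p_i + (1-y^{(i)})\log(1-p_i)\Big],
\qquad
\nabla_\theta L(\theta) = \sum_{i=1}^{N}\big(p_i - y^{(i)}\big)\,\mathrm{concatenate}(1, X^{(i)}),
```

with the update $\theta \leftarrow \theta - \mathrm{lr}\cdot\nabla_\theta L(\theta)$. The kernel version takes the same form with $\alpha$ in place of $\theta$ and the feature vector replaced by $(1,\, y^{(1)}K(X^{(1)}, X^{(i)}),\, \dots,\, y^{(N)}K(X^{(N)}, X^{(i)}))$.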
