Assign 1: Due Date: Sat 21st Sep, 11:59:59PM (i.e., just before midnight)
Download the Airfoil Self-Noise dataset from the UCI Machine Learning Repository. The dataset has 6 real-valued attributes, all of which we will treat equally for this assignment. The assignment consists of two parts: Part I must be done by students in both sections, namely CSCI4390 and CSCI6390; for the second part, Part II-4390 must be done by those in CSCI4390, and Part II-6390 must be done by students registered for CSCI6390.

Part I (both CSCI-4390 and CSCI-6390)
a. Mean vector and total variance
Compute the mean vector μ for the 6-dimensional data matrix, and then compute the total variance var(D); see Eq. (1.4) for the latter.
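The computation above can be sketched as follows, assuming the data has been loaded into a NumPy array D of shape (n, 6) (e.g., via np.loadtxt) and taking Eq. (1.4)'s total variance to be the average squared distance of the points from the mean:

```python
import numpy as np

def mean_and_total_variance(D):
    """Mean vector and total variance of data matrix D (n points x d attributes)."""
    n = D.shape[0]
    mu = D.sum(axis=0) / n           # mean vector, computed without np.mean shortcuts
    Z = D - mu                       # centered data matrix
    total_var = (Z ** 2).sum() / n   # average squared distance from the mean
    return mu, total_var
```

Note that total var(D) also equals the sum of the per-attribute variances, which gives a quick sanity check against the built-ins.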

b. Covariance matrix (inner and outer product form)
Compute the sample covariance matrix Σ as inner products between the attributes of the centered data matrix (see Eq. (2.30) in chapter 2). Next, compute the sample covariance matrix as a sum of the outer products between the centered points (see Eq. (2.31)).
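Both forms can be sketched as below, assuming D is the (n, d) data array and that Eqs. (2.30)/(2.31) use the 1/n normalization (if your text uses 1/(n-1), adjust the divisor accordingly):

```python
import numpy as np

def covariance_inner(D):
    """Sample covariance via inner products of centered attribute columns (Eq. 2.30 style)."""
    n = D.shape[0]
    Z = D - D.mean(axis=0)
    return (Z.T @ Z) / n             # entry (j, k) is the inner product of columns j and k

def covariance_outer(D):
    """Sample covariance as a sum of outer products of centered points (Eq. 2.31 style)."""
    n, d = D.shape
    Z = D - D.mean(axis=0)
    S = np.zeros((d, d))
    for z in Z:                      # accumulate z_i z_i^T over all centered points
        S += np.outer(z, z)
    return S / n
```

The two results should agree to machine precision; for verification only, they should also match np.cov(D.T, bias=True), since np.cov defaults to the 1/(n-1) normalization otherwise.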

c. Correlation matrix as pair-wise cosines
Compute the correlation matrix for this dataset using the formula for the cosine between centered attribute vectors (see Eq. (2.25)). Which attributes are the most correlated, the most anti-correlated, and least correlated? Create the scatter plots for these interesting pairs using matplotlib and visually confirm the trends, i.e., describe how each of the three cases results in a particular type of plot.
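A minimal sketch of the cosine formulation, assuming D is the (n, d) data array and no attribute is constant (which would make its centered column the zero vector):

```python
import numpy as np

def correlation_matrix(D):
    """Correlation as the cosine between centered attribute columns (Eq. 2.25 style)."""
    Z = D - D.mean(axis=0)
    norms = np.linalg.norm(Z, axis=0)           # length of each centered column
    return (Z.T @ Z) / np.outer(norms, norms)   # cosine of the angle between columns
```

The most correlated pair is the off-diagonal entry closest to +1, the most anti-correlated is the one closest to -1, and the least correlated is the one closest to 0; these pairs can then be passed to matplotlib's scatter for the visual check.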

Part II-4390 (To be coded only by CSCI-4390)
Dominant Eigenvector
Compute the dominant eigenvalue and eigenvector of the covariance matrix Σ via the power-iteration method. One can compute the dominant eigenvector/eigenvalue of the covariance matrix iteratively as follows.
Let x_0 = (1, 1, …, 1)^T be the starting vector in R^d, where d is the number of dimensions.

In each iteration i, we compute the new vector:

x_i = Σ x_{i-1}

We then find the element of x_i that has the maximum absolute value, say at index m. For the next round, to avoid numerical issues with large values, we re-scale x_i by dividing all of its elements by x_{i,m}, so that the largest value is always 1 before we begin the next iteration.

To test convergence, you may compute the norm of the difference between the scaled vectors from the current iteration and the previous one, and stop if this norm falls below some threshold. That is, stop if

‖x_i − x_{i-1}‖_2 < ε

For the final eigenvector, make sure to normalize it so that it has unit length. Also, the ratio x_{i,m} / x_{i-1,m} gives you the largest eigenvalue. If you did the scaling as described above, the denominator will be 1, but the numerator will be the updated value of that element before scaling. Once you have obtained the dominant eigenvector u_1, project each of the original data points x_i onto this vector, and print the coordinates of the new points along this "direction".

Part II-6390 (To be coded only by CSCI-6390)
First Two Eigenvectors and Eigenvalues
Compute the first two eigenvectors of the covariance matrix Σ using a generalization of the above iterative method. Let X_0 be a d×2 (random) matrix with two non-zero d-dimensional column vectors of unit length. We will iteratively multiply X_0 by Σ on the left. The first column will not be modified, but the second column will be orthogonalized with respect to the first one by subtracting its projection along the first column (see section 1.3.3 in chapter 1). That is, let a and b denote the first and second columns of X_1, where

X_1 = Σ X_0

Then we orthogonalize b as follows:

b = b − (b^T a / a^T a) a

After this, b is guaranteed to be orthogonal to a. This yields the matrix X_1, with the two column vectors denoting the current estimates of the first and second eigenvectors. Before the next iteration, normalize each column to unit length, and then repeat the whole process. That is, from X_1 obtain X_2, and so on, until convergence. To test for convergence, you can look at the distance between X_i and X_{i-1}; if this difference is less than some threshold ε, stop. Once you have obtained the two eigenvectors u_1 and u_2, project each of the original data points x_i onto those two vectors to obtain the new projected points in 2D. Plot these projected points in the two new dimensions.
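The two iterative schemes above can be sketched in NumPy as follows. This is a minimal sketch, assuming the covariance matrix S (i.e., Σ) has already been computed as a symmetric (d, d) array; the function names, eps default, and max_iter cap are illustrative choices, not prescribed by the assignment:

```python
import numpy as np

def power_iteration(S, eps=1e-4, max_iter=1000):
    """Dominant eigenvalue/eigenvector of S via the scaled power method."""
    d = S.shape[0]
    x = np.ones(d)                        # starting vector (1, 1, ..., 1)
    for _ in range(max_iter):
        x_new = S @ x
        m = np.argmax(np.abs(x_new))      # index of the largest-magnitude entry
        lam = x_new[m] / x[m]             # eigenvalue estimate; denominator is 1 after scaling
        x_new = x_new / x_new[m]          # rescale so the largest entry is 1
        if np.linalg.norm(x_new - x) < eps:
            x = x_new
            break
        x = x_new
    return lam, x / np.linalg.norm(x)     # return a unit-length eigenvector

def two_eigenvectors(S, eps=1e-4, max_iter=1000):
    """First two eigenvectors of S via the orthogonalizing iteration of Part II-6390."""
    d = S.shape[0]
    rng = np.random.default_rng(0)        # seeded for reproducibility (a choice, not required)
    X = rng.random((d, 2))
    X /= np.linalg.norm(X, axis=0)        # unit-length starting columns
    for _ in range(max_iter):
        Xn = S @ X
        a, b = Xn[:, 0], Xn[:, 1]
        b -= (b @ a) / (a @ a) * a        # orthogonalize b against a (in place)
        Xn /= np.linalg.norm(Xn, axis=0)  # re-normalize each column to unit length
        if np.linalg.norm(Xn - X) < eps:
            X = Xn
            break
        X = Xn
    return X                              # columns are the estimates u_1 and u_2
```

The projected coordinates along u_1 are then D @ u1 (Part II-4390), and D @ X gives the 2D points to scatter-plot for Part II-6390.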
What to Turn In
Write a script named assign1.py that takes as input the data filename and the epsilon parameter for convergence. You may assume that the data file resides in the local directory from which the script will be called. You can use an epsilon of 0.001 or 0.0001. Save all your output to a PDF file named assign1.pdf. The output should comprise the mean vector, total variance, covariance matrix via the inner- and outer-product formulas, correlation matrix, your observations, and the dominant eigenvectors and eigenvalues. The scatter plots should be part of this output file as well, with any required comments.

Submit your Python script and the output PDF file via the Submitty page: https://submitty.cs.rpi.edu/f19/csci4390 This link works for both CSCI4390 and CSCI6390 students; your account has already been set up. You can log in with your RCS username (NOT RIN) and RCS password. Test the Submitty login right away; do not wait until the due date.

Your script must use Python version 3. Note that you can use built-in NumPy/Python functions for reading and parsing the text input, but you should NOT use any of the built-in functions like cov or eigen for this assignment. You may, however, verify your answers by comparing against the results of the built-in methods.

Tutorial on Python and NumPy
For those not familiar with Python, you may google for tutorials, e.g., Python tutorial. For information on the NumPy module/package, see: NumPy quickstart

Policy on Academic Honesty
You are free to discuss how to tackle the assignment, but all coding must be your own. Please do not cut, copy, and/or paste from anyone else, including code on the web. Any student caught violating the academic honesty principle will get an automatic F grade in the course and will be referred to the dean of students for disciplinary action.

Page last modified on September 17, 2019, at 03:09 PM
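The command-line interface described above might be handled as in this sketch; the helper name parse_args is hypothetical, not required by the assignment:

```python
import sys

def parse_args(argv):
    """Extract the data filename and epsilon from the command line (hypothetical helper)."""
    if len(argv) != 3:
        raise SystemExit("usage: python3 assign1.py <datafile> <epsilon>")
    return argv[1], float(argv[2])

# In assign1.py one would call, e.g.:
#   fname, eps = parse_args(sys.argv)
```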
