(a) The product must be split across two lines to fit on the page. Note that by "delete column 1" in step 7 we interpret this to mean that the first column of the matrix should be removed entirely, in contrast with the possible interpretation that the first column should merely be zeroed out.
Exercise 1.2
(a) We wish to write a matrix equation of the form
$$f = Kx.$$
We find the individual force equations of the system to be:
$$f_1 = k_{12}(x_2 - x_1 - l_{12})$$
$$f_2 = k_{23}(x_3 - x_2 - l_{23})$$
$$f_3 = k_{34}(x_4 - x_3 - l_{34})$$
$$f_4 = 0$$
We can collect this information into an intermediate step in writing the matrix equation, given by:
$$\begin{pmatrix} f_1 \\ f_2 \\ f_3 \\ f_4 \end{pmatrix} = \begin{pmatrix} -k_{12} & k_{12} & 0 & 0 \\ 0 & -k_{23} & k_{23} & 0 \\ 0 & 0 & -k_{34} & k_{34} \\ 0 & 0 & 0 & 0 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{pmatrix} - \begin{pmatrix} k_{12} l_{12} \\ k_{23} l_{23} \\ k_{34} l_{34} \\ 0 \end{pmatrix}.$$
It is only possible to rewrite this equation in the form $f = Kx$ by using the fact that $x_1, x_2, x_3, x_4 \neq 0$ and factoring these values out of the constant terms (writing $-k_{12}l_{12} = -(k_{12}l_{12}/x_1)\,x_1$, and so on) to arrive at:
$$K = \begin{pmatrix} -k_{12}(1 + l_{12}/x_1) & k_{12} & 0 & 0 \\ 0 & -k_{23}(1 + l_{23}/x_2) & k_{23} & 0 \\ 0 & 0 & -k_{34}(1 + l_{34}/x_3) & k_{34} \\ 0 & 0 & 0 & 0 \end{pmatrix}.$$
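As a quick numerical sanity check (not part of the original solution), the sketch below builds $K$ by folding the constant terms into the diagonal and verifies that $Kx$ reproduces the individual force equations. All numbers (spring constants, rest lengths, positions) are made up for illustration:

```python
import numpy as np

# Hypothetical values for illustration only
k12, k23, k34 = 2.0, 3.0, 5.0   # spring constants
l12, l23, l34 = 1.0, 1.5, 0.5   # rest lengths
x = np.array([1.0, 2.5, 4.0, 6.0])  # positions x1..x4, all nonzero

# Forces from the individual force equations
f_direct = np.array([
    k12 * (x[1] - x[0] - l12),
    k23 * (x[2] - x[1] - l23),
    k34 * (x[3] - x[2] - l34),
    0.0,
])

# K built using the fact x_i != 0:  -k*x_i - k*l = -k*(1 + l/x_i)*x_i
K = np.array([
    [-k12 * (1 + l12 / x[0]), k12, 0.0, 0.0],
    [0.0, -k23 * (1 + l23 / x[1]), k23, 0.0],
    [0.0, 0.0, -k34 * (1 + l34 / x[2]), k34],
    [0.0, 0.0, 0.0, 0.0],
])

f = K @ x
print(np.allclose(f, f_direct))  # True
```

Note that this $K$ depends on $x$, which is exactly why the factorization only works for nonzero positions.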
Exercise 1.3
Let $R \in \mathbb{C}^{m \times m}$ be an upper triangular matrix, i.e., $r_{in} = 0$ if $i > n$, and suppose that $R$ is nonsingular. We will show that $R^{-1}$ is also upper triangular. If we denote the $(i,n)$ entry of $R^{-1}$ by $r^{-1}_{in}$, this means we must show that $r^{-1}_{in} = 0$ if $i > n$.
Since $R$ is nonsingular, the columns of $R$ are linearly independent. In particular, if we consider the first $n \le m$ columns of $R$, which we denote by $R_1, R_2, \dots, R_n$, this subcollection of columns of $R$ must also be linearly independent. Since $r_{ij} = 0$ for $i > j$, the set $\{R_1, \dots, R_n\}$ spans the vector space $\{y \in \mathbb{C}^m : y_i = 0 \text{ if } i > n\}$, and because $R_1, \dots, R_n$ are linearly independent, they form a basis for this vector space. Now consider the product
$$R R^{-1} = I.$$
Note that the $n$th (with $n \le m$) column of $I$ is the canonical unit vector $e_n$, and the equation above shows that $e_n$ is the result of taking a linear combination of $R_1, \dots, R_n, \dots, R_m$ with coefficients given by the entries of the $n$th column of $R^{-1}$. That is, using equation 1.8, we have
$$e_n = \sum_{i=1}^{m} r^{-1}_{in} R_i = r^{-1}_{1n} R_1 + r^{-1}_{2n} R_2 + \dots + r^{-1}_{nn} R_n + \dots + r^{-1}_{mn} R_m.$$
Since $R$ is nonsingular, the coefficients $r^{-1}_{in}$ in the equation above are uniquely determined by Theorem 1.2. But since $e_n$ has zero entries in every position beyond the $n$th (if $n = m$ this reasoning still applies, but there are no entries below the $m$th position), we see that $e_n$ is in the span of the columns $\{R_1, \dots, R_n\}$. This means that
$$e_n = \sum_{i=1}^{n} r^{-1}_{in} R_i,$$
where the coefficients are the same as in the previous equation, because if other coefficients were possible this would contradict the fact that the columns of $R$ (and therefore the first $n$ columns) are linearly independent. Thus,
$$e_n = \sum_{i=1}^{m} r^{-1}_{in} R_i = \sum_{i=1}^{n} r^{-1}_{in} R_i,$$
which implies that $r^{-1}_{in} = 0$ if $i > n$. But since $n$ was an arbitrary integer with $1 \le n \le m$, we have shown, by the definition of an upper triangular matrix, that $R^{-1}$ is upper triangular.
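The result can be checked numerically. The following sketch (illustration only, with a made-up size and random entries) inverts a nonsingular upper triangular matrix and confirms that the strictly lower triangular part of the inverse vanishes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random nonsingular upper triangular matrix; adding m*I keeps the
# diagonal safely away from zero so R is well-conditioned
m = 6
R = np.triu(rng.standard_normal((m, m))) + m * np.eye(m)

Rinv = np.linalg.inv(R)

# Entries strictly below the diagonal of R^{-1} should vanish
below = np.tril(Rinv, k=-1)
print(np.allclose(below, 0.0))  # True
```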
Exercise 1.4
Let $f_1, \dots, f_8 : [0,8] \to \mathbb{C}$ be a set of functions with the property that for any choice of numbers $d_1, \dots, d_8 \in \mathbb{C}$ there exists a set of coefficients $c_1, \dots, c_8 \in \mathbb{C}$ such that
$$\sum_{j=1}^{8} c_j f_j(i) = d_i, \qquad i = 1, \dots, 8.$$
(a) We will show that choosing $d_1, \dots, d_8$ determines $c_1, \dots, c_8$ uniquely.
Fix an arbitrary selection $d_1, \dots, d_8$. By hypothesis we have the system of equations
$$d_1 = c_1 f_1(1) + c_2 f_2(1) + \dots + c_8 f_8(1)$$
$$d_2 = c_1 f_1(2) + c_2 f_2(2) + \dots + c_8 f_8(2)$$
$$\vdots$$
$$d_8 = c_1 f_1(8) + c_2 f_2(8) + \dots + c_8 f_8(8)$$
which gives the matrix equation $d = Fc$, where $d = (d_1, \dots, d_8)^T$, $c = (c_1, \dots, c_8)^T$, and $F \in \mathbb{C}^{8 \times 8}$ has entries $F_{ij} = f_j(i)$.
Using the notation defined above, our hypothesis guarantees that for any $d \in \mathbb{C}^8$ there exists an element $c \in \mathbb{C}^8$ such that $d = Fc$. But this statement means that for the matrix $F \in \mathbb{C}^{8 \times 8}$, $\mathrm{range}(F) = \mathbb{C}^8$. By Theorem 1.3, this is equivalent to the statement that $\mathrm{rank}(F) = 8$, i.e., $F$ is a full rank $8 \times 8$ matrix. Then by Theorem 1.2, $F$ does not map two distinct vectors to the same vector. In other words, the vector $c$ is uniquely determined, and therefore we conclude that the elements $c_1, \dots, c_8$ of $c$ are uniquely determined by $d_1, \dots, d_8$, the elements of $d$.
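To make this concrete, here is a numerical illustration (not part of the exercise) with one hypothetical choice of functions satisfying the property: the monomials $f_j(x) = x^{j-1}$, for which $F$ is a Vandermonde matrix, full rank at the distinct points $1, \dots, 8$:

```python
import numpy as np

# Hypothetical choice: f_j(x) = x**(j-1), so F[i, j] = f_{j+1}(i+1)
# is a Vandermonde matrix at the distinct points 1..8
points = np.arange(1, 9)
F = np.vander(points, 8, increasing=True)

print(np.linalg.matrix_rank(F))  # 8: full rank, so c is unique

# Any choice of d then determines c uniquely via solving F c = d
rng = np.random.default_rng(1)
d = rng.standard_normal(8)
c = np.linalg.solve(F, d)
print(np.allclose(F @ c, d, atol=1e-6))  # True
```

(The loose tolerance allows for the well-known ill-conditioning of Vandermonde matrices; uniqueness is an exact-arithmetic statement.)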
Exercise 2.2
The Pythagorean theorem asserts that for a set of $n$ orthogonal vectors $\{x_i\}$,
$$\left\| \sum_{i=1}^{n} x_i \right\|^2 = \sum_{i=1}^{n} \|x_i\|^2.$$
- Prove this in the case $n = 2$ by an explicit computation of $\|x_1 + x_2\|^2$.
$$\|x_1 + x_2\|^2 = (x_1 + x_2)^*(x_1 + x_2)$$
$$= x_1^*(x_1 + x_2) + x_2^*(x_1 + x_2) \quad \text{(by bilinearity)}$$
$$= x_1^* x_1 + x_1^* x_2 + x_2^* x_1 + x_2^* x_2 \quad \text{(by bilinearity)}$$
$$= x_1^* x_1 + x_2^* x_2 \quad \text{(by orthogonality, } x_1^* x_2 = x_2^* x_1 = 0)$$
$$= \|x_1\|^2 + \|x_2\|^2 \quad \text{(by definition of Euclidean length)}$$
- For $n = 1$, the assertion is immediately seen to hold, since it becomes
$$\|x_1\|^2 = \|x_1\|^2.$$
(Note that if the base case in an inductive proof is supposed to be the case $n = 2$, then we have also already established this in part (a).)
Assume the inductive hypothesis. That is, assume that for some $n \in \mathbb{N}$,
$$\left\| \sum_{i=1}^{n} x_i \right\|^2 = \sum_{i=1}^{n} \|x_i\|^2.$$
Now consider the case of $n+1$ orthogonal vectors. We will use the fact that $\sum_{i=1}^{n} x_i$ is actually just a vector (the sum of vectors is a vector) and that this vector is orthogonal to the vector $x_{n+1}$, since
$$\left( \sum_{i=1}^{n} x_i \right)^* x_{n+1} = \sum_{i=1}^{n} x_i^* x_{n+1} = 0.$$
This means we may apply the result of part (a) to the two vectors $\sum_{i=1}^{n} x_i$ and $x_{n+1}$:
$$\left\| \sum_{i=1}^{n+1} x_i \right\|^2 = \left\| \sum_{i=1}^{n} x_i \right\|^2 + \|x_{n+1}\|^2 \quad \text{(by part (a))}$$
$$= \sum_{i=1}^{n} \|x_i\|^2 + \|x_{n+1}\|^2 \quad \text{(by induction hypothesis)}$$
$$= \sum_{i=1}^{n+1} \|x_i\|^2.$$
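The theorem is easy to observe numerically. The following sketch (illustration only, with made-up dimensions and random data) builds orthogonal but not orthonormal vectors from the QR factorization of a random matrix and compares the two sides:

```python
import numpy as np

rng = np.random.default_rng(2)

# Orthonormal columns via QR of a random matrix, then rescaled so the
# vectors are orthogonal but of varying lengths
m, n = 7, 4
Q, _ = np.linalg.qr(rng.standard_normal((m, n)))
X = Q * rng.uniform(1.0, 3.0, size=n)  # column i is x_i

lhs = np.linalg.norm(X.sum(axis=1)) ** 2          # ||sum x_i||^2
rhs = sum(np.linalg.norm(X[:, i]) ** 2 for i in range(n))  # sum ||x_i||^2
print(np.isclose(lhs, rhs))  # True
```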
Exercise 2.3
Let $A \in \mathbb{C}^{m \times m}$ be hermitian, i.e., $A = A^* = \overline{A^T} = (\overline{A})^T$. An eigenvector of $A$ is a nonzero vector $x \in \mathbb{C}^m$ such that $Ax = \lambda x$ for some $\lambda \in \mathbb{C}$, the corresponding eigenvalue.
- Prove that all eigenvalues of A are real.
Let $(\lambda, x)$ be an eigenvalue, eigenvector pair for the hermitian matrix $A \in \mathbb{C}^{m \times m}$. We have:
$$x^* A x = x^*(\lambda x) = \lambda x^* x \quad \text{(using bilinearity and eqn 2.4)}$$
$$x^* A x = (A^* x)^* x = (Ax)^* x = (\lambda x)^* x = \bar{\lambda} x^* x \quad (A^* = A, \text{ and since } (\lambda x)^* = \bar{\lambda} x^*)$$
Since $\lambda x^* x = x^* A x = \bar{\lambda} x^* x$, we see that $\lambda x^* x = \bar{\lambda} x^* x$, and since $x \neq 0$ it follows that $x^* x > 0$. Therefore it must be the case that $\lambda = \bar{\lambda}$, which means that $\lambda \in \mathbb{R}$. Since $\lambda$ was an arbitrary eigenvalue, we conclude that all eigenvalues of $A$ are real.
- Prove that if x and y are eigenvectors corresponding to distinct eigenvalues, then x and y are orthogonal.
Let $x, y$ be eigenvectors of the hermitian matrix $A \in \mathbb{C}^{m \times m}$ corresponding to the distinct eigenvalues $\lambda_x, \lambda_y$, so that $Ax = \lambda_x x$ and $Ay = \lambda_y y$. We have:
$$\lambda_x y^* x = y^*(Ax) = (A^* y)^* x = (Ay)^* x = (\lambda_y y)^* x = \bar{\lambda}_y y^* x = \lambda_y y^* x \quad \text{(by part (a), since } \lambda_y \in \mathbb{R})$$
Suppose that $x, y$ are not orthogonal, so that $y^* x \neq 0$. Then this last equality would show that $\lambda_x = \lambda_y$, which is a contradiction. Therefore, we must conclude that $y^* x = 0$, meaning that $x$ and $y$ are orthogonal.
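Both parts of this exercise can be observed numerically. The sketch below (illustration only, made-up size and random data) forms a hermitian matrix as $A = (B + B^*)/2$, then checks that its eigenvalues are real and that its eigenvectors are mutually orthogonal (a random hermitian matrix almost surely has distinct eigenvalues, so the eigenvector matrix should be unitary up to rounding):

```python
import numpy as np

rng = np.random.default_rng(3)

# Random hermitian matrix: A = (B + B*)/2 satisfies A* = A
m = 5
B = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
A = (B + B.conj().T) / 2

lam, V = np.linalg.eig(A)

# (a) eigenvalues are real
print(np.allclose(lam.imag, 0.0, atol=1e-6))  # True

# (b) eigenvectors of distinct eigenvalues are orthogonal: V* V = I
print(np.allclose(V.conj().T @ V, np.eye(m), atol=1e-6))  # True
```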
Exercise 2.4
What can be said about the eigenvalues of a unitary matrix?
Response: If $\lambda$ is an eigenvalue of the unitary matrix $Q \in \mathbb{C}^{m \times m}$, then $|\lambda| = 1$.
Proof: Since $Q$ is unitary, we have $Q^* Q = Q Q^* = I$. Let $x$ be an eigenvector corresponding to $\lambda$. We have $Qx = \lambda x$, which means that $\|Qx\| = \|\lambda x\|$. But by equation 2.10 we also have $\|Qx\| = \|x\|$. Also, since $\|\lambda x\| = |\lambda| \, \|x\|$, it follows that $\|x\| = |\lambda| \, \|x\|$. Since $x$ is an eigenvector, $x \neq 0$ and so $\|x\| > 0$. Dividing through by $\|x\|$ we arrive at $1 = |\lambda|$.
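A quick numerical check of this claim (illustration only, made-up size and random data): the QR factorization of a random complex matrix yields a unitary $Q$, whose eigenvalues should all lie on the unit circle:

```python
import numpy as np

rng = np.random.default_rng(4)

# Random unitary matrix via QR of a random complex matrix
m = 5
Z = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
Q, _ = np.linalg.qr(Z)

lam = np.linalg.eigvals(Q)
print(np.allclose(np.abs(lam), 1.0))  # True
```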
Exercise 2.5
Let $S \in \mathbb{C}^{m \times m}$ be skew-hermitian, i.e., $S^* = -S$.
(a) Show, by an argument like that of Exercise 2.3, that the eigenvalues of $S$ are pure imaginary.
Let $\lambda \in \mathbb{C}$ be an eigenvalue of the skew-hermitian matrix $S$, with $\lambda = a + bi$ for some $a, b \in \mathbb{R}$ and corresponding eigenvector $x$. We will show that $a = 0$. As in Exercise 2.3,
$$\bar{\lambda} x^* x = (\lambda x)^* x = (Sx)^* x = x^* S^* x = -x^* S x = -x^*(\lambda x) = -\lambda x^* x.$$
Therefore $\bar{\lambda} x^* x = -\lambda x^* x$. Since $x^* x = \|x\|^2 \neq 0$, this means that
$$a - bi = \bar{\lambda} = -\lambda = -(a + bi),$$
so that $a - bi = -a - bi \implies a = -a \implies a = 0$.
Therefore, we see that $\lambda = bi$ is pure imaginary, as we wanted to show.
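As with the hermitian case, this is easy to check numerically (illustration only, made-up size and random data): $S = (B - B^*)/2$ is skew-hermitian, and its eigenvalues should have vanishing real parts:

```python
import numpy as np

rng = np.random.default_rng(5)

# Random skew-hermitian matrix: S = (B - B*)/2 satisfies S* = -S
m = 5
B = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
S = (B - B.conj().T) / 2

lam = np.linalg.eigvals(S)
print(np.allclose(lam.real, 0.0))  # True
```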