Required reading: Wright and Recht, Ch. 6 and 7.
Throughout the problem set, $\|\cdot\|$ stands for the 2-norm $\|\cdot\|_2$.
1 (Exercise 1 in Wright and Recht, Ch. 6) In the ERM example of Section 6.1, assume that the objective function $f$ is known at the current point $x$, along with the vector $g = Ax$. Show that the cost of computing $f(x + \gamma e_i)$ for some $i = 1, 2, \dots, n$ is $O(|A_{\cdot i}|)$ [here, $|A_{\cdot i}|$ denotes the number of nonzero entries in the $i$th column of $A$], the same order as the cost of updating the gradient $\nabla f$. Show that a similar observation holds for the graph example in Section 6.1.
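The claim can be illustrated numerically before proving it. The sketch below is my own illustration, not part of the exercise: it assumes $A$ is stored in SciPy's CSC sparse format (so the nonzeros of one column are contiguous) and shows that $g = Ax$ can be updated after the step $x \leftarrow x + \gamma e_i$ by touching only the nonzeros of column $i$.

```python
import numpy as np
import scipy.sparse as sp

rng = np.random.default_rng(0)
m, n = 1000, 500
A = sp.random(m, n, density=0.01, format="csc", random_state=0)  # sparse data matrix
x = rng.standard_normal(n)
g = A @ x                        # g = Ax, assumed already available

i, gamma = 7, 0.3                # coordinate step x <- x + gamma * e_i

# Touch only the nonzeros of column i: O(|A_{.i}|) work.
start, end = A.indptr[i], A.indptr[i + 1]
rows, vals = A.indices[start:end], A.data[start:end]
g_new = g.copy()
g_new[rows] += gamma * vals      # updated g = A(x + gamma * e_i)

# Sanity check against a full recomputation, which costs O(nnz(A)).
x_new = x.copy()
x_new[i] += gamma
assert np.allclose(g_new, A @ x_new)
```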
2 (Exercise 1 in Wright and Recht, Ch. 7) Show that the Euclidean projection $P_\Omega(x)$ onto the unit ball $\Omega = \{x \in \mathbb{R}^n : \|x\| \le 1\}$ is given by
$$P_\Omega(x) = \frac{x}{\max\{1, \|x\|\}}.$$
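As an optional sanity check (not part of the exercise), the closed form can be compared against a generic constrained solver. The snippet below is a minimal sketch under my own choices of dimension and random data; `project_unit_ball` is a hypothetical helper name, and the solver call is standard `scipy.optimize.minimize` with an inequality constraint.

```python
import numpy as np
from scipy.optimize import minimize

def project_unit_ball(x):
    """Closed-form Euclidean projection onto {z : ||z||_2 <= 1}."""
    return x / max(1.0, np.linalg.norm(x))

rng = np.random.default_rng(1)
x = 3.0 * rng.standard_normal(5)      # a point that is likely outside the ball

# Generic solver: minimize ||z - x||^2 subject to ||z||^2 <= 1.
res = minimize(
    lambda z: np.sum((z - x) ** 2),
    x0=np.zeros_like(x),
    method="SLSQP",
    constraints=[{"type": "ineq", "fun": lambda z: 1.0 - np.sum(z ** 2)}],
    options={"ftol": 1e-12},
)

assert np.allclose(project_unit_ball(x), res.x, atol=1e-5)
```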
3 (Exercise 4 in Wright and Recht, Ch. 7) Consider the linear function $f(x) := c^T x$ of $x \in \mathbb{R}^n$, where $c \in \mathbb{R}^n$ is a fixed vector. Find the minimizer of $f$ over each of the following sets (a numerical sanity check appears after part (c)):
(a) The unit ball $\{x \in \mathbb{R}^n : \|x\| \le 1\}$.
(b) The unit simplex $\{x \in \mathbb{R}^n : x_i \ge 0 \text{ for all } i, \ \sum_{i=1}^n x_i = 1\}$.
(c) The box $\{x \in \mathbb{R}^n : 0 \le x_i \le 1 \text{ for all } i\}$.
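The closed-form answers are left to you, but whatever you derive can be checked numerically. The sketch below is my own construction (the random $c$, the dimension, and the solver choices are assumptions, not part of the exercise): the ball case uses a generic constrained solver, while the simplex and box cases are linear programs handled by `scipy.optimize.linprog`.

```python
import numpy as np
from scipy.optimize import linprog, minimize

rng = np.random.default_rng(2)
n = 4
c = rng.standard_normal(n)

# (a) Unit ball: minimize c^T x subject to ||x||_2 <= 1 (generic nonlinear solver).
ball = minimize(
    lambda x: c @ x,
    x0=np.zeros(n),
    method="SLSQP",
    constraints=[{"type": "ineq", "fun": lambda x: 1.0 - x @ x}],
)

# (b) Unit simplex: x >= 0 and sum(x) = 1 (a linear program).
simplex = linprog(c, A_eq=np.ones((1, n)), b_eq=[1.0], bounds=(0, None))

# (c) Box: 0 <= x_i <= 1 for all i (a linear program).
box = linprog(c, bounds=(0, 1))

print("ball   :", np.round(ball.x, 3), " value:", round(float(ball.fun), 3))
print("simplex:", np.round(simplex.x, 3), " value:", round(float(simplex.fun), 3))
print("box    :", np.round(box.x, 3), " value:", round(float(box.fun), 3))
```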
4 (Exercise 6 in Wright and Recht, Ch. 7) Let $\Omega \subseteq \mathbb{R}^n$ be a closed and convex set and let a continuously differentiable function $f : \mathbb{R}^n \to \mathbb{R}$ be given. Consider one step of projected gradient descent,
$$x^{k+1} = P_\Omega\big(x^k - \alpha_k \nabla f(x^k)\big),$$
with an arbitrary initial point $x^0 \in \Omega$. Prove that, for any $\alpha_k > 0$,
$$x^{k+1} = \operatorname*{arg\,min}_{x \in \Omega} \left\{ \nabla f(x^k)^T (x - x^k) + \frac{1}{2\alpha_k} \|x - x^k\|^2 \right\}$$
and $\|x^{k+1} - x^k\|^2 \le \alpha_k \nabla f(x^k)^T (x^k - x^{k+1})$.
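Before proving the two claims, it can help to see them hold on a small instance. The sketch below is entirely my own construction (taking $\Omega$ to be the unit ball and $f$ a random convex quadratic, neither of which is specified by the exercise); it checks that the projected gradient step coincides with the minimizer of the quadratic model and that the stated inequality is satisfied.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n = 5
Q = rng.standard_normal((n, n))
Q = Q @ Q.T + np.eye(n)                          # f(x) = 0.5 * x^T Q x, a convex quadratic
grad_f = lambda x: Q @ x

def project_unit_ball(x):
    return x / max(1.0, np.linalg.norm(x))       # Omega = unit ball in this toy example

alpha = 0.5
xk = project_unit_ball(rng.standard_normal(n))   # an arbitrary point of Omega
g = grad_f(xk)

# One projected gradient step.
x_next = project_unit_ball(xk - alpha * g)

# Minimize the model grad^T (x - xk) + (1/(2*alpha)) * ||x - xk||^2 over Omega directly.
model = lambda x: g @ (x - xk) + np.sum((x - xk) ** 2) / (2 * alpha)
res = minimize(
    model,
    x0=xk,
    method="SLSQP",
    constraints=[{"type": "ineq", "fun": lambda x: 1.0 - x @ x}],
    options={"ftol": 1e-12},
)

assert np.allclose(x_next, res.x, atol=1e-5)                            # first claim
assert np.sum((x_next - xk) ** 2) <= alpha * g @ (xk - x_next) + 1e-10  # second claim
```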