- Consider the neural network below.
All weights are initialized to the values shown (there are no biases, for simplicity). Consider the data point $\mathbf{x} = [1, 1]^T$ with desired output vector $\mathbf{d} = [1, 0]^T$. Complete one iteration of backpropagation by hand, assuming a learning rate of $\eta = 0.1$. What would all the weight values be after this one backpropagation iteration? Show your work (a numerical sketch of the procedure, under assumed placeholder weights, follows the problem list). Use the following activation function:
(1)
- Derive the update equation for output-layer neurons if the activation function used is the hyperbolic tangent, $\varphi(v) = \tanh(v)$ (instead of the activation function used in the notes). Show your work (a derivation sketch follows the problem list).
- Derive the update equation for output-layer neurons if the activation function used is the softmax function, $\varphi(v_i) = \dfrac{e^{v_i}}{\sum_{j=1}^{O} e^{v_j}}$, where $O$ is the number of output neurons and $v_i$ is the induced local field of the $i$th output neuron. Note: in this case the output is multi-dimensional (i.e., $\mathbf{d} \in \mathbb{R}^O$) and there will be a specific $d_i$ for each output neuron. Show your work (see the final sketch after the problem list).
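
Since the network figure and Equation (1) are not reproduced above, the sketch below only illustrates the procedure for the first problem: it assumes a fully connected 2-2-2 network with no biases, a logistic sigmoid activation $\varphi(v) = 1/(1 + e^{-v})$, and placeholder weight values. Substituting the actual weights from the figure and the actual activation from Equation (1) gives the required answer; the steps are identical.

```python
# Sketch of one backpropagation iteration for a 2-2-2 network with no biases.
# ASSUMPTIONS (the figure and Equation (1) are not reproduced here):
#   - activation phi(v) = 1 / (1 + exp(-v))  (logistic sigmoid)
#   - the weight values below are placeholders, NOT the ones shown in the figure
import numpy as np

def phi(v):
    """Assumed activation: logistic sigmoid."""
    return 1.0 / (1.0 + np.exp(-v))

def dphi(y):
    """Sigmoid derivative expressed in terms of its output y = phi(v)."""
    return y * (1.0 - y)

eta = 0.1                          # learning rate (given)
x = np.array([1.0, 1.0])           # input x = [1, 1]^T (given)
d = np.array([1.0, 0.0])           # desired output d = [1, 0]^T (given)

# Placeholder weights: W1[j, i] connects input i to hidden neuron j,
# W2[k, j] connects hidden neuron j to output neuron k.
W1 = np.array([[0.3, 0.5],
               [0.1, 0.2]])
W2 = np.array([[0.4, 0.6],
               [0.7, 0.8]])

# Forward pass
v1 = W1 @ x                        # hidden pre-activations
y1 = phi(v1)                       # hidden outputs
v2 = W2 @ y1                       # output pre-activations
y2 = phi(v2)                       # network outputs

# Backward pass (local gradients / deltas)
e = d - y2                         # output errors e_k = d_k - y_k
delta2 = e * dphi(y2)              # output-layer deltas
delta1 = dphi(y1) * (W2.T @ delta2)  # hidden-layer deltas

# Weight updates: w <- w + eta * delta * (input feeding that weight)
W2 += eta * np.outer(delta2, y1)
W1 += eta * np.outer(delta1, x)

print("Updated W1:\n", W1)
print("Updated W2:\n", W2)
```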
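For the second problem, a derivation sketch is below. It assumes the usual instantaneous squared-error cost $E = \tfrac{1}{2}\sum_k (d_k - y_k)^2$, output $y_k = \varphi(v_k)$, and induced local field $v_k = \sum_j w_{kj} y_j$ (no bias); if the notes use a different cost or notation, adjust accordingly.

```latex
% Sketch: output-layer update with phi(v) = tanh(v).
% Assumes E = (1/2) sum_k e_k^2 with e_k = d_k - y_k, y_k = phi(v_k),
% and v_k = sum_j w_{kj} y_j (no bias); the notes' exact notation may differ.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{align*}
\frac{\partial E}{\partial w_{kj}}
  &= \frac{\partial E}{\partial e_k}\,
     \frac{\partial e_k}{\partial y_k}\,
     \frac{\partial y_k}{\partial v_k}\,
     \frac{\partial v_k}{\partial w_{kj}}
   = e_k \cdot (-1) \cdot \varphi'(v_k) \cdot y_j
   = -\,e_k\,\varphi'(v_k)\,y_j , \\[4pt]
\varphi(v_k) &= \tanh(v_k)
  \;\Longrightarrow\;
  \varphi'(v_k) = 1 - \tanh^2(v_k) = 1 - y_k^2 , \\[4pt]
\delta_k &= e_k\,\varphi'(v_k) = (d_k - y_k)\,(1 - y_k^2) , \\[4pt]
w_{kj} &\leftarrow w_{kj} - \eta\,\frac{\partial E}{\partial w_{kj}}
  = w_{kj} + \eta\,(d_k - y_k)\,(1 - y_k^2)\,y_j .
\end{align*}
\end{document}
```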
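For the third problem, the sketch below keeps the same squared-error cost as above; the only new ingredient is the softmax Jacobian, which couples all output neurons. If the notes instead pair softmax with a cross-entropy cost $E = -\sum_i d_i \ln y_i$, the local gradient simplifies to $\delta_k = d_k - y_k$.

```latex
% Sketch: output-layer update with the softmax activation
%   y_i = e^{v_i} / sum_{j=1}^{O} e^{v_j}.
% Assumes the same squared-error cost E = (1/2) sum_i (d_i - y_i)^2 as above;
% a cross-entropy cost would simplify the final delta to (d_k - y_k).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{align*}
\frac{\partial y_i}{\partial v_k}
  &= y_i\,(\delta_{ik} - y_k)
  \qquad\text{(softmax Jacobian, } \delta_{ik} \text{ the Kronecker delta)} , \\[4pt]
-\frac{\partial E}{\partial v_k}
  &= \sum_{i=1}^{O} (d_i - y_i)\,\frac{\partial y_i}{\partial v_k}
   = \sum_{i=1}^{O} (d_i - y_i)\, y_i\,(\delta_{ik} - y_k)
   = y_k\Big[(d_k - y_k) - \sum_{i=1}^{O} (d_i - y_i)\, y_i\Big] , \\[4pt]
\delta_k &= y_k\Big[(d_k - y_k) - \sum_{i=1}^{O} (d_i - y_i)\, y_i\Big] , \\[4pt]
w_{kj} &\leftarrow w_{kj} + \eta\,\delta_k\, y_j .
\end{align*}
\end{document}
```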
