Deep Learning - Exercise 05
In this exercise you will implement and test the Backpropagation algorithm.
1. Implement Backpropagation
Augment your own Multi-Layer Perceptron (MLP) class implementation with a method
backpropagation() that takes a teacher vector as a parameter and runs the
Backpropagation algorithm, adjusting all weights in the MLP using the
formulas for the output and hidden neurons derived in the lecture.
If you have not yet implemented your own MLP class, you can use my implementation from exercise 04 on GitHub.
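A minimal sketch of how such a backpropagation() method might look, assuming sigmoid activations, squared error, and a single hidden layer. The class layout, attribute names, and learning rate below are illustrative assumptions, not the exercise's prescribed code:

```python
import numpy as np

class MLP:
    """Sketch: one hidden layer, sigmoid activations, squared-error loss."""

    def __init__(self, n_in, n_hidden, n_out, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        # One extra column per layer for the bias weight
        self.W1 = rng.normal(0.0, 0.5, (n_hidden, n_in + 1))
        self.W2 = rng.normal(0.0, 0.5, (n_out, n_hidden + 1))
        self.lr = lr  # learning rate (assumed value)

    @staticmethod
    def _sig(x):
        return 1.0 / (1.0 + np.exp(-x))

    def forward(self, x):
        self.x = np.append(x, 1.0)            # input plus bias unit
        self.h = self._sig(self.W1 @ self.x)  # hidden activations
        self.hb = np.append(self.h, 1.0)      # hidden plus bias unit
        self.o = self._sig(self.W2 @ self.hb) # output activations
        return self.o

    def backpropagation(self, teacher):
        # Output deltas for sigmoid + squared error: (t - o) * o * (1 - o)
        delta_o = (teacher - self.o) * self.o * (1.0 - self.o)
        # Hidden deltas: propagate back through W2 (bias column dropped)
        delta_h = (self.W2[:, :-1].T @ delta_o) * self.h * (1.0 - self.h)
        # Weight updates with learning rate lr
        self.W2 += self.lr * np.outer(delta_o, self.hb)
        self.W1 += self.lr * np.outer(delta_h, self.x)
```

Calling forward(x) once before backpropagation(teacher) is required here, since the method reuses the activations stored during the forward pass.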
2. Test your implementation of Backpropagation
Think of two or three simple tests to check whether your implementation of Backpropagation is correct. Then implement at least one of them and verify that your Backpropagation works as expected.
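One common test of this kind is a numerical gradient check: compare the analytic gradient that Backpropagation computes against a finite-difference approximation of the error function. The sketch below does this for a single sigmoid neuron as a stand-in for a full MLP; the weights, input, and teacher value are arbitrary assumptions chosen only for illustration:

```python
import numpy as np

def sig(x):
    return 1.0 / (1.0 + np.exp(-x))

# Stand-in "network": one sigmoid neuron (replace with your own MLP)
w = np.array([0.3, -0.2, 0.1])  # weights, last entry is the bias weight
x = np.array([1.0, 0.5, 1.0])   # input with bias unit appended
t = 1.0                          # teacher value

def error(w):
    # Squared error E(w) = 1/2 (t - o)^2 with o = sig(w . x)
    return 0.5 * (t - sig(w @ x)) ** 2

# Analytic gradient from the lecture formula: dE/dw = -(t - o) o (1 - o) x
o = sig(w @ x)
grad_analytic = -(t - o) * o * (1.0 - o) * x

# Numerical gradient via central differences, one weight at a time
eps = 1e-6
grad_numeric = np.array([
    (error(w + eps * e) - error(w - eps * e)) / (2 * eps)
    for e in np.eye(len(w))
])

assert np.allclose(grad_analytic, grad_numeric, atol=1e-7)
print("gradient check passed")
```

Other simple tests in the same spirit: repeatedly train on a single pattern and check that the output converges toward the teacher vector, or train on XOR and check that the total error decreases over epochs.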