Implement (in C++) an algorithm that uses only perceptrons with threshold activation functions and the perceptron learning rule (Chapter 4 of [Mitchell, 1997]) to solve the binary A XOR B problem. Table 1 shows the correct inputs and outputs for the binary XOR problem.
Question 1: What is the minimum number of perceptrons required to solve the XOR problem, and how should they be connected?
Question 2: Devise a list of training examples to teach the perceptrons to solve the binary XOR problem. How many training examples did it take for your algorithm to correctly learn to solve the binary XOR problem? NOTE: Your perceptrons must not use linear or continuous activation functions, gradient descent or the back-propagation algorithm.
In a ZIP file, place the source code, the executable, and a text file containing your list of training examples as well as your answers to Questions 1 and 2. Upload the ZIP file to Vula before 10:00 AM on 7 October 2019.
Table 1: Inputs and outputs for the binary XOR problem.
Input 1 | Input 2 | Output
   1    |    1    |   0
   1    |    0    |   1
   0    |    1    |   1
   0    |    0    |   0
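For reference, below is a minimal sketch of the basic building block the assignment asks for: a single perceptron with a threshold (step) activation function trained with the perceptron learning rule, with no gradient descent or back-propagation. The class name, learning rate, epoch count, and the use of AND as the demonstration target are illustrative assumptions only; XOR itself is not learnable by a single perceptron (see Question 1).

#include <cstddef>
#include <iostream>
#include <vector>

// A single perceptron with a threshold activation function,
// trained with the perceptron learning rule: w <- w + eta * (t - o) * x.
struct Perceptron {
    std::vector<double> w;   // one weight per input, plus a bias weight at the end
    double eta;              // learning rate (illustrative choice)

    Perceptron(std::size_t nInputs, double learningRate = 0.1)
        : w(nInputs + 1, 0.0), eta(learningRate) {}

    // Threshold activation: output 1 if the weighted sum exceeds 0, else 0.
    int predict(const std::vector<double>& x) const {
        double sum = w.back();                       // bias term
        for (std::size_t i = 0; i < x.size(); ++i)
            sum += w[i] * x[i];
        return sum > 0.0 ? 1 : 0;
    }

    // Perceptron learning rule for one training example (no gradient descent).
    void train(const std::vector<double>& x, int target) {
        int out = predict(x);
        double delta = eta * (target - out);
        for (std::size_t i = 0; i < x.size(); ++i)
            w[i] += delta * x[i];
        w.back() += delta;                           // bias update
    }
};

int main() {
    // Demonstration on the linearly separable AND function;
    // XOR requires more than one such perceptron (Question 1).
    Perceptron p(2);
    const std::vector<std::vector<double>> inputs = {{1, 1}, {1, 0}, {0, 1}, {0, 0}};
    const std::vector<int> targets = {1, 0, 0, 0};

    for (int epoch = 0; epoch < 20; ++epoch)         // fixed epoch count for the sketch
        for (std::size_t i = 0; i < inputs.size(); ++i)
            p.train(inputs[i], targets[i]);

    for (std::size_t i = 0; i < inputs.size(); ++i)
        std::cout << inputs[i][0] << " AND " << inputs[i][1]
                  << " -> " << p.predict(inputs[i]) << "\n";
    return 0;
}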
References
[Mitchell, 1997] Mitchell, T. M. (1997). Machine Learning. McGraw-Hill, New York, USA.