CS 4804 Homework #6

Date Assigned: April 2, 2003
Date Due: April 11, 2003, in class, before class starts
  1. (15 points) Given two boolean input variables, how many boolean functions are possible? How many of these can be learned by a perceptron? Also, repeat the above two questions for three input variables.
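
    As a sanity check on the counting, note that with n inputs there are 2^(2^n) distinct boolean functions, and a perceptron can represent exactly the linearly separable ones. Below is a minimal brute-force sketch; the coarse weight grid and the strict threshold test are illustrative assumptions, and the same idea extends (more slowly) to three inputs:

    ```python
    from itertools import product

    def separable(table, patterns):
        # A truth table is realizable by a perceptron iff some (w0, w1, w2)
        # gives output 1 exactly when w0 + w1*x1 + w2*x2 > 0.
        grid = [i / 2 for i in range(-6, 7)]   # coarse weight grid, -3.0 .. 3.0
        return any(
            all((w0 + w1 * x1 + w2 * x2 > 0) == out
                for (x1, x2), out in zip(patterns, table))
            for w0, w1, w2 in product(grid, repeat=3))

    patterns = list(product([0, 1], repeat=2))       # the 4 input patterns
    tables = list(product([False, True], repeat=4))  # all 2^(2^2) = 16 functions
    print(len(tables), "functions,",
          sum(separable(t, patterns) for t in tables), "learnable by a perceptron")
    ```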

  2. (15 points) Exercise 20.19 of your textbook.

  3. (20 points) For this question, we will use the data from the table shown in Exercise 20.15 of your textbook (only the table, not the full question). These data were generated from a form of "voting" function, where each input carries a certain number of votes: 10 for I1, 4 each for I2 through I4, 2 for I5, and 1 for I6. For a given example, each input casts its votes for its own value; tally up the votes that "0" receives and the votes that "1" receives, and declare the larger total the winner. As you can see, I6 often casts a tie-breaking vote.

    Is a perceptron sufficient for learning this function? Train a sigmoid perceptron on the given data and see whether the learned weights correspond to the voting function described above. How can you improve the perceptron's learning?
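
    One way to set up this experiment: generate labels with the voting rule, then fit a sigmoid perceptron by gradient descent on the squared error. In the minimal sketch below, only the vote counts come from the description above; the random training patterns, learning rate, and epoch count are illustrative assumptions:

    ```python
    import math, random

    VOTES = [10, 4, 4, 4, 2, 1]   # votes for I1..I6, as described above

    def vote_label(x):
        # Each input casts its votes for its own value; the larger total wins.
        ones = sum(v for v, xi in zip(VOTES, x) if xi == 1)
        return 1 if ones > sum(VOTES) - ones else 0

    random.seed(0)
    patterns = [[random.randint(0, 1) for _ in range(6)] for _ in range(100)]
    examples = [(x, vote_label(x)) for x in patterns]

    w = [0.0] * 7      # w[0] is the bias weight, whose input is hardwired to +1
    eta = 0.5          # illustrative learning rate
    for epoch in range(2000):
        for x, t in examples:
            a = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
            o = 1 / (1 + math.exp(-a))      # sigmoid output
            g = (t - o) * o * (1 - o)       # gradient term for squared error
            w[0] += eta * g
            for j in range(6):
                w[j + 1] += eta * g * x[j]
    print([round(wi, 2) for wi in w])
    ```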

  4. (30 points) In this problem, we will design our own perceptron, called a "wacky perceptron". This perceptron takes only one input (x) but has 4 weights (w0, w1, w2, w3). Its output is the function w0 + w1*x + w2*x^2 + w3*x^3 (it is not squashed by a threshold or a sigmoid). Derive a training rule for this perceptron from given data, by minimizing the sum of squared errors between the actual and target outputs.

    Then, use the data given in this file to learn the weights for your wacky perceptron. What do you observe? Each line in the file is one instance, given in (x,output) format.
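
    For checking your own derivation: with E = (1/2) * sum of (t - o)^2 over the data and o = w0 + w1*x + w2*x^2 + w3*x^3, the partial derivative of E with respect to wj is -sum of (t - o)*x^j, which yields the incremental update wj <- wj + eta*(t - o)*x^j. Below is a minimal sketch of a trainer built on that rule; "data.txt" is a placeholder name for the linked file, the learning rate and epoch count are illustrative, and you may need to shrink eta (or rescale x) if the cubic features make the updates diverge:

    ```python
    eta = 0.01                      # illustrative learning rate
    w = [0.0, 0.0, 0.0, 0.0]

    def output(x):
        # The wacky perceptron: w0 + w1*x + w2*x^2 + w3*x^3, no squashing.
        return sum(wj * x ** j for j, wj in enumerate(w))

    examples = []
    with open("data.txt") as f:     # placeholder for the linked data file
        for line in f:
            x, t = map(float, line.strip().strip("()").split(","))
            examples.append((x, t))

    for epoch in range(10000):
        for x, t in examples:
            err = t - output(x)     # d(0.5*(t-o)^2)/dwj = -(t - o) * x^j
            for j in range(4):
                w[j] += eta * err * x ** j
    print([round(wj, 4) for wj in w])
    ```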

  5. (20 points) Create a neural network with one input layer (two nodes), one hidden layer (two nodes), and one output layer (one node). All units are sigmoid units, and all weights are initialized to small values near zero (keep in mind that every unit has a bias weight, whose input is hardwired to +1). Train the network using the backpropagation algorithm on the XNOR function (the negation of XOR).

    Does your network converge? If so, present the final weights, neatly labeled, on a diagram of the network.
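
    For cross-checking your run, here is a minimal sketch of online backpropagation on the 2-2-1 sigmoid network described above. The initialization scale, learning rate, and epoch count are illustrative assumptions; note in particular that if every weight starts at exactly zero, the two hidden units receive identical updates and remain interchangeable, which is worth watching for when you answer the convergence question:

    ```python
    import math, random

    def sigmoid(a):
        return 1 / (1 + math.exp(-a))

    XNOR = [((0, 0), 1), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

    random.seed(0)
    # wh[i] = [bias, weight from x1, weight from x2] for hidden unit i;
    # wo = [bias, weight from h1, weight from h2] for the output unit.
    wh = [[random.uniform(-0.1, 0.1) for _ in range(3)] for _ in range(2)]
    wo = [random.uniform(-0.1, 0.1) for _ in range(3)]
    eta = 0.5

    for epoch in range(20000):
        for (x1, x2), t in XNOR:
            h = [sigmoid(v[0] + v[1] * x1 + v[2] * x2) for v in wh]
            o = sigmoid(wo[0] + wo[1] * h[0] + wo[2] * h[1])
            do = (t - o) * o * (1 - o)                     # output-unit delta
            dh = [do * wo[i + 1] * h[i] * (1 - h[i]) for i in range(2)]
            wo[0] += eta * do
            for i in range(2):
                wo[i + 1] += eta * do * h[i]
                wh[i][0] += eta * dh[i]
                wh[i][1] += eta * dh[i] * x1
                wh[i][2] += eta * dh[i] * x2

    for (x1, x2), t in XNOR:
        h = [sigmoid(v[0] + v[1] * x1 + v[2] * x2) for v in wh]
        print((x1, x2), "target", t, "output",
              round(sigmoid(wo[0] + wo[1] * h[0] + wo[2] * h[1]), 3))
    ```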
