1. Construct a binary data set in the plane. Colour the points of one category red and the other green, and display them on a computer screen. Construct a three-layer neural net for a two-dimensional input, and display the lines representing the weights of the units in the hidden layer; add a small arrow sticking out of the side of each line to indicate which side is the positive one. Lock all the weights into the output unit at the value 1. Program your machine to present the data points at random, use Back-Propagation to adapt the hidden unit weights, and display the (oriented) lines on the screen. Run it with varying numbers of hidden units and monitor the Squared Error. Verify that the program is behaving itself by working with simple data sets first. Compare the times to convergence for data sets of varying complexity, and observe the behaviour of the net when there is no solution. Finally, run your program on a double spiral data set. A minimal sketch of such a training loop appears after this exercise list.
2. With random initialisations of a set of Gaussians, try solving the above data sets using the EM algorithm. Display the Gaussians as ellipses using the methods of chapter four. Use red ellipses for the red data points and green ellipses for the green data points, and run the two fits quite independently. Use a Bayesian method for determining the correct category; also use the Mahalanobis distance to the closest ellipse centre. A sketch of the EM fit and of both classification rules is given after this exercise list.
3. Increase the dimension progressively for both of the above systems. You will not be able to display the results graphically now, but keep records of the times to convergence, using a variety of random initialisations for both systems. A small timing harness is sketched after this exercise list.
4. What conclusions do you draw from observing the performance of these two systems on different kinds of data set?
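
Sketch for exercise 1. The following is a minimal sketch, not a prescription: a net with a layer of tanh hidden units and a single linear output whose incoming weights are locked at 1, trained by Back-Propagation on a squared error, with the oriented line of each hidden unit recovered from its weights. The two_spirals construction, the number of hidden units, the learning rate and the epoch count are illustrative choices of this sketch; the actual screen display of points, lines and arrows is left to whatever graphics the reader has available.

import numpy as np

rng = np.random.default_rng(0)

def two_spirals(n_per_class=200, noise=0.1):
    """One common construction of the double spiral data set."""
    t = np.linspace(0.5, 3.0 * np.pi, n_per_class)
    red = np.stack([t * np.cos(t), t * np.sin(t)], axis=1)
    green = -red
    X = np.vstack([red, green]) + noise * rng.standard_normal((2 * n_per_class, 2))
    y = np.hstack([np.ones(n_per_class), -np.ones(n_per_class)])   # red = +1, green = -1
    return X, y

def train(X, y, n_hidden=20, lr=0.005, epochs=2000, report=200):
    """Three-layer net: d inputs, n_hidden tanh units, one linear output
    whose incoming weights are all locked at +1 as the exercise requires,
    so only the hidden-layer weights W and biases b are adapted."""
    d = X.shape[1]
    W = 0.5 * rng.standard_normal((n_hidden, d))
    b = 0.5 * rng.standard_normal(n_hidden)
    for epoch in range(epochs):
        sq_err = 0.0
        for i in rng.permutation(len(X)):          # present the points at random
            h = np.tanh(W @ X[i] + b)              # hidden unit activations
            out = h.sum()                          # output weights fixed at 1
            err = out - y[i]
            sq_err += err * err
            delta = err * (1.0 - h ** 2)           # back-propagated error per hidden unit
            W -= lr * np.outer(delta, X[i])
            b -= lr * delta
        if epoch % report == 0:
            print(f"epoch {epoch:5d}   squared error {sq_err:.3f}")
    return W, b

X, y = two_spirals()
W, b = train(X, y)

# Each hidden unit defines an oriented line  w1*x1 + w2*x2 + b = 0 ;
# the weight vector (w1, w2) points to the positive side, which is what
# the small arrow in the screen display should indicate.
for w, bias in zip(W, b):
    print(f"line: {w[0]:+.3f} x1 {w[1]:+.3f} x2 {bias:+.3f} = 0,  arrow along ({w[0]:.3f}, {w[1]:.3f})")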
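
Sketch for exercise 2. This sketch fits a small Gaussian mixture to the red points and another, quite independently, to the green points, then applies both classification rules: the prior-weighted Bayesian comparison of the two mixture densities, and the Mahalanobis distance to the closest ellipse centre. The component count k, the illustrative cluster data, the small regularising term added to each covariance, and the absence of any safeguard against empty components are assumptions of the sketch, not of the exercise. For the display, the eigenvectors of each covariance matrix give the ellipse axes and the square roots of its eigenvalues the semi-axis lengths, as in chapter four.

import numpy as np

rng = np.random.default_rng(1)

def gauss_pdf(X, mean, cov):
    """Gaussian density evaluated at each row of X."""
    d = X.shape[1]
    diff = X - mean
    maha = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff)
    return np.exp(-0.5 * maha) / np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))

def em_fit(X, k=4, iters=100):
    """Fit a k-component Gaussian mixture to X by EM, starting from
    randomly chosen centres, identity covariances and equal weights."""
    n, d = X.shape
    means = X[rng.choice(n, k, replace=False)].copy()
    covs = np.stack([np.eye(d) for _ in range(k)])
    weights = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        dens = np.stack([w * gauss_pdf(X, m, c)
                         for w, m, c in zip(weights, means, covs)], axis=1)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and covariances.
        nk = resp.sum(axis=0)
        weights = nk / n
        means = (resp.T @ X) / nk[:, None]
        for j in range(k):
            diff = X - means[j]
            covs[j] = (resp[:, j, None] * diff).T @ diff / nk[j] + 1e-6 * np.eye(d)
    return weights, means, covs

def mixture_density(X, params):
    w, m, c = params
    return sum(wi * gauss_pdf(X, mi, ci) for wi, mi, ci in zip(w, m, c))

def min_mahalanobis(X, params):
    """Squared Mahalanobis distance to the closest ellipse centre."""
    _, means, covs = params
    d2 = [np.einsum('ij,jk,ik->i', X - m, np.linalg.inv(c), X - m)
          for m, c in zip(means, covs)]
    return np.min(d2, axis=0)

# Illustrative data: two noisy clusters (red = +1, green = -1); substitute
# any red/green set from exercise 1, such as the double spiral.
n = 200
X = np.vstack([rng.normal( 2.0, 1.0, (n, 2)),
               rng.normal(-2.0, 1.0, (n, 2))])
y = np.hstack([np.ones(n), -np.ones(n)])

# Fit the red and green mixtures quite independently, as the exercise says.
red_params = em_fit(X[y > 0])
green_params = em_fit(X[y < 0])

# Rule 1 (Bayesian): assign to the class with the larger prior-weighted density.
p_red, p_green = np.mean(y > 0), np.mean(y < 0)
bayes_red = p_red * mixture_density(X, red_params) > p_green * mixture_density(X, green_params)

# Rule 2: Mahalanobis distance to the closest ellipse centre.
maha_red = min_mahalanobis(X, red_params) < min_mahalanobis(X, green_params)

for name, pred in (("Bayes", bayes_red), ("Mahalanobis", maha_red)):
    print(name, "rule, training accuracy:", np.mean(pred == (y > 0)))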
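
Sketch for exercise 3. A small timing harness, again a sketch under stated assumptions: make_data is a hypothetical stand-in for whatever higher-dimensional sets the reader constructs, and the fit argument can be the train routine from the first sketch or a pair of em_fit calls from the second, each of which initialises itself at random. Only wall-clock time for a fixed number of passes is recorded here; a genuine convergence criterion, such as the squared error or the log-likelihood levelling off, should replace the fixed iteration count when keeping records.

import time
import numpy as np

rng = np.random.default_rng(2)

def make_data(d, n=200):
    """Two Gaussian clouds in d dimensions (red = +1, green = -1); a
    hypothetical stand-in for the higher-dimensional sets you construct."""
    X = np.vstack([rng.normal( 1.0, 1.0, (n, d)),
                   rng.normal(-1.0, 1.0, (n, d))])
    y = np.hstack([np.ones(n), -np.ones(n)])
    return X, y

def time_runs(fit, dims=(2, 4, 8, 16), trials=5):
    """Time fit(X, y) in each dimension over several random
    initialisations and report the spread of times."""
    for d in dims:
        X, y = make_data(d)
        times = []
        for _ in range(trials):
            start = time.perf_counter()
            fit(X, y)
            times.append(time.perf_counter() - start)
        print(f"d = {d:3d}:  mean {np.mean(times):.2f}s"
              f"  min {np.min(times):.2f}s  max {np.max(times):.2f}s")

# Usage with the earlier sketches:
#   time_runs(lambda X, y: train(X, y, epochs=500))                  # back-propagation net
#   time_runs(lambda X, y: (em_fit(X[y > 0]), em_fit(X[y < 0])))     # pair of EM fits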