New Features of SNNSv4.1
Version 4.1 of SNNS features the following improvements and extensions
over the earlier version 4.0:
- New batch execution language batchman (a minimal script sketch follows this list)
- Integration of the genetic algorithm tool Enzo. This tool allows the evolutionary optimization of neural networks. Enzo is available as a separate tar file on our ftp-server at the same location as SNNS. For all enquiries about Enzo please contact enzo-request@ira.uka.de.
- Improved window handling. When an SNNS window is re-opened, it is raised to the top instead of an error message being issued.
- New learning algorithm 'scaled conjugate gradient' (SCG)
- The parameter lists in the control panel adapt to the number of parameters required, i.e. all widgets displayed need to be filled in.
- WinSNNS, an MS-Windows front-end to SNNS batch execution on Unix workstations. WinSNNS is NOT part of our official SNNS distribution! It can be found at http://www.lans.ece.utexas.edu/winsnns.html. Please contact ghosh@pine.ece.utexas.edu for more information.
- Optional parameter `Teacher-Forcing' introduced for Jordan networks.
- Validation now possible for BPTT networks
- Extensive bug fixes:
- Seg-fault when trying to train a network without output units
- Seg-fault when pressing the info button while no patterns are loaded
- Wrong feedback loops in associative memory nets deleted
- Wrong activation functions in associative memory when constructed with bignet
- Wrong initialization function in Kohonen networks when constructed with bignet
- Wrong error computation in Hebbian learning
- Wrong error computation of analyze function band
- Wrong MSE computation for validation
- Better font size adaptation of label widgets
- Pruning now allows validation with 0 cycles
- GRAPH panel could sometimes not be closed
- Wrong scanning of update parameters in snnsbat
- Error in the manual: Missing parameter of BackpropWeightDecay
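
To illustrate the new batchman language, here is a minimal training script. This is only a sketch: the network and pattern file names are hypothetical, and the exact command names and parameters should be checked against the batchman chapter of the SNNS user manual.

    # load network and patterns (hypothetical file names)
    loadNet("example.net")
    loadPattern("example.pat")
    # initialize weights randomly
    setInitFunc("Randomize_Weights", 1.0, -1.0)
    initNet()
    # train until the summed squared error is small enough
    # or a maximum number of cycles is reached
    while SSE > 0.1 and CYCLES < 1000 do
        trainNet()
    endwhile
    saveNet("example.trained.net")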

Last modified: mamier@vasarely Fri Dec 1 00:00:00 MET 1995