- ...XGUI
- X Graphical User Interface
- ...1.
-
Backpercolation 1 was developed by JURIK RESEARCH
& CONSULTING, PO 2379, Aptos, CA USA. Any and all SALES of
products (commercial, industrial, or otherwise) that utilize the
Backpercolation 1 process or its derivatives require a license from
JURIK RESEARCH & CONSULTING. Write for details.
- ...units
- In the following, the more common name ``units'' is used instead of ``cells''.
- ...unit.
- The term transfer function often denotes the combination of activation
and output function. To make matters worse, the term activation function
is sometimes also used to cover both the activation and the output
function; a small sketch of this terminology follows below.
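As a rough illustration of this terminology, the sketch below composes a logistic activation function with an identity output function. The function names are invented for the example and are not the actual SNNS kernel interface.

    /* Terminology sketch (hypothetical names, not the SNNS kernel API):
       the activation function maps the net input to an activation, the
       output function maps the activation to the unit's output, and the
       "transfer function" is often understood as their combination.    */
    #include <math.h>
    #include <stdio.h>

    static double act_logistic(double net) { return 1.0 / (1.0 + exp(-net)); }
    static double out_identity(double act) { return act; }

    /* "transfer function" = output function applied to the activation */
    static double transfer(double net) { return out_identity(act_logistic(net)); }

    int main(void)
    {
        printf("transfer(0.5) = %f\n", transfer(0.5));   /* about 0.62 */
        return 0;
    }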
- ...number
- This number can change after saving but remains
unambiguous. See also chapter

- ...range
- Mathematically correct would be the open interval $(0,1)$, but
the values 0 and 1 are reached due to arithmetic inaccuracy; see the
example below.
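A small numerical illustration (plain C, not SNNS code): once exp(-net) drops below the machine epsilon of the double type, the logistic activation is rounded to exactly 1.0, even though its mathematical range excludes 0 and 1.

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double net = 40.0;                     /* large net input                */
        double act = 1.0 / (1.0 + exp(-net));  /* exp(-40) ~ 4e-18 < DBL_EPSILON */
        printf("act = %.17g, act == 1.0 is %d\n", act, act == 1.0);
        return 0;
    }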
- ...layers
- Changing it to 16 layers can be done very easily in
the source code of the interface.
- ...training.
- If you do this, scale the y-range to lie between 0 and 26 by clicking on
the 'right arrow' next to 'Scale Y:' a few times. You can also
resize the window containing the graph.
- ...anymore
- If a frozen display has to be redrawn, e.g. because an overlapping window
was moved, it is updated to the current state. If the network has changed
since the freeze, the display's contents will therefore have changed as well!
- ...closed
- The amount of computing power consumed by graph should be minimal.
- ...possible
-
SNNSv4.0 reads all pattern file formats, but writes only the new,
flexible format. This way SNNS itself can be used as a conversion utility.
- ...values
- C is the value read from line 0005
- ...part
- The $F_1$ layer consists of three internal layers. See chapter .
- ...weight
- Every mean vector of a class is represented by a class unit. The
elements of these vectors are stored in the weights between the class
unit and the input units.
- ...units
- This case may be transformed
into a network with an additional hidden unit for each input unit and
a single connection with unity weight from each input unit to its
corresponding hidden unit.
- ...step
- If only an upper bound n for the number of processing steps is known,
the input patterns may consist of windows containing the current input
pattern together with the previous n-1 input patterns. The network then
develops a focus on the sequence element in the input window that
corresponds to the best number of processing steps; see the sketch below.
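A minimal sketch of such a windowed encoding (the data layout and sizes are invented for the example): each training pattern is built from the current sequence element followed by the previous n-1 elements.

    #include <stdio.h>
    #include <string.h>

    #define SEQ_LEN 6   /* length of the input sequence          */
    #define PAT_DIM 2   /* components per single input pattern   */
    #define N_WIN   3   /* upper bound n on the processing steps */

    int main(void)
    {
        /* a toy sequence of input patterns */
        double seq[SEQ_LEN][PAT_DIM] = {
            {0.1, 0.2}, {0.3, 0.4}, {0.5, 0.6},
            {0.7, 0.8}, {0.9, 1.0}, {1.1, 1.2}
        };
        double window[N_WIN * PAT_DIM];

        /* the window for time step t holds seq[t], seq[t-1], ..., seq[t-n+1] */
        for (int t = N_WIN - 1; t < SEQ_LEN; ++t) {
            for (int k = 0; k < N_WIN; ++k)
                memcpy(&window[k * PAT_DIM], seq[t - k], sizeof seq[t - k]);

            printf("t=%d window:", t);
            for (int i = 0; i < N_WIN * PAT_DIM; ++i)
                printf(" %.1f", window[i]);
            printf("\n");
        }
        return 0;
    }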
- ...layer.
- The candidate units are realized as special
units in SNNS.
- ...(MLPs)
- As usual the
term MLP refers to a multilayer feedforward network using the scalar
product as a propagation rule and sigmoids as transfer functions.
- ...lie
- The only
exception to this rule is the case where a pattern of the same class
lies in the area of conflict but is covered by another RBF (of the
correct class) with a sufficiently high activation.
- ...pairs
- In this case the term ``input--class pair'' would be more appropriate,
since the DDA algorithm trains the network to classify rather than to
approximate an input--output mapping.
- ...correctly
- This is only important for the chosen implementation of the ART1
learning algorithm in SNNS.
- ...mapped
- Different ART$_a$ classes may be mapped onto the same category.
- ...vector
- c will be used as index for the winning
unit in the competitive layer throughout this text
- ...neighborhood
- Neighborhood is defined as the set of units within a certain radius of
the winner. Thus the neighborhood of radius 1 would be the eight direct
neighbors in the 2D grid; the neighborhood of radius 2 would be those
eight plus the 16 next closest; etc. (see the sketch below).
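The following sketch (not the SNNS Kohonen code) counts the units of such a neighborhood on a 2D grid, taking the radius as the Chebyshev distance from the winner, so that radius 1 gives the eight direct neighbors and radius 2 adds the 16 next closest.

    #include <stdio.h>

    #define GRID_W 10
    #define GRID_H 10

    /* number of units within the given radius of the winner (wx, wy),
       not counting the winner itself and respecting the grid border   */
    static int neighborhood_size(int wx, int wy, int r)
    {
        int count = 0;
        for (int dy = -r; dy <= r; ++dy)
            for (int dx = -r; dx <= r; ++dx) {
                int x = wx + dx, y = wy + dy;
                if ((dx != 0 || dy != 0) &&
                    x >= 0 && x < GRID_W && y >= 0 && y < GRID_H)
                    ++count;
            }
        return count;
    }

    int main(void)
    {
        printf("radius 1: %d neighbors\n", neighborhood_size(5, 5, 1));  /* 8  */
        printf("radius 2: %d neighbors\n", neighborhood_size(5, 5, 2));  /* 24 */
        return 0;
    }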
- ...detail
- For any comments or questions concerning the implementation of an
autoassociative memory, please refer to Jamie DeCoster at
jamie@psych.purdue.edu
- ...criterion
- As it is not rare that SCG cannot reduce the error during a few
consecutive epochs, this criterion is not evaluated after every single
epoch. Without such a precaution, this criterion would stop SCG too early.
- ...generalization
- Generalization: the ability of a neural net to recognize unseen
patterns (the test set) after training.
- ...10pm
- This construction is
necessary since `at' can read only from stdin.
- ...IO-type
- The term T-type was changed to IO-type after completion of the kernel.