
The Network Equations

The weights are determined from a set of exemplar pattern pairs. If there are $P$ pairs $({\bf a}_1,{\bf b}_1),({\bf a}_2,{\bf b}_2),\ldots,({\bf a}_P,{\bf b}_P)$, where ${\bf a}_k=(a_k^1,\ldots,a_k^{N_A})$ and ${\bf b}_k=(b_k^1,\ldots,b_k^{N_B})$ with $a_k^i=\pm 1$ and $b_k^j=\pm 1$ for $1 \leq i \leq N_A$, $1 \leq j \leq N_B$, $1 \leq k \leq P$, then the weights are given by $w_{ij} = \sum_{k=1}^P a_k^i b_k^j$ for $1 \leq i \leq N_A$, $1 \leq j \leq N_B$.
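
For concreteness, the weight formula amounts to summing the outer products of the pattern pairs. The following Python sketch assumes the exemplars are supplied as $\pm 1$ NumPy arrays A (shape $P \times N_A$) and B (shape $P \times N_B$); these array names and shapes are our own convention, not part of the text above.

import numpy as np

def bam_weights(A, B):
    # A: (P, N_A) array of +/-1 exemplars for the A layer, one pattern per row.
    # B: (P, N_B) array of +/-1 exemplars for the B layer.
    # Returns W of shape (N_A, N_B) with w_ij = sum_k a_k^i b_k^j.
    return A.T @ B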

Let the initial states of the units in the A layer be set to $\mu_i(0)$ for $1 \leq i \leq N_A$. At time $t > 0$ we compute new states for the units in the B layer by computing the activations $\beta_j(t) = \sum_{i=1}^{N_A} w_{ij} \mu_i(t-1)$ and then setting the new states of the units to $\nu_j(t) = 1$ if $\beta_j(t) > 0$, $\nu_j(t) = \nu_j(t-1)$ if $\beta_j(t) = 0$, and $\nu_j(t) = -1$ if $\beta_j(t) < 0$, for $1 \leq j \leq N_B$.

We then compute new states for the units in the A layer by computing the activations $\alpha_i(t) = \sum_{j=1}^{N_B} w_{ij} \nu_j(t)$ and then setting the new states of the units to $\mu_i(t) = 1$ if $\alpha_i(t) > 0$, $\mu_i(t) = \mu_i(t-1)$ if $\alpha_i(t) = 0$, and $\mu_i(t) = -1$ if $\alpha_i(t) < 0$, for $1 \leq i \leq N_A$.
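
Both update steps apply the same thresholding rule to the activations; the only difference is the direction in which the weights are used. A minimal sketch, assuming the weight matrix W produced by bam_weights above, with the helper names chosen here purely for illustration:

def threshold(activation, previous_state):
    # +1 where the activation is positive, -1 where it is negative,
    # and the previous state where the activation is exactly zero.
    s = np.sign(activation)
    return np.where(s == 0, previous_state, s)

def update_b(W, mu, nu_prev):
    # B-layer step: beta_j(t) = sum_i w_ij mu_i(t-1), then threshold.
    return threshold(W.T @ mu, nu_prev)

def update_a(W, nu, mu_prev):
    # A-layer step: alpha_i(t) = sum_j w_ij nu_j(t), then threshold.
    return threshold(W @ nu, mu_prev)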

New states for the units in each layer are computed in alternation until a steady state is reached, which represents the output of the network.
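
The alternation can be expressed as a loop that stops when neither layer changes. This sketch assumes the helper functions above; since the text does not specify initial states for the B layer, the sketch arbitrarily starts them at +1, which only matters if some activation is exactly zero on the first step.

def bam_recall(W, mu0, max_iters=100):
    # Alternate B-layer and A-layer updates until a steady state is reached.
    mu = np.asarray(mu0, dtype=float)
    nu = np.ones(W.shape[1])   # assumed initial B-layer states (not specified in the text)
    for _ in range(max_iters):
        nu_next = update_b(W, mu, nu)
        mu_next = update_a(W, nu_next, mu)
        if np.array_equal(mu_next, mu) and np.array_equal(nu_next, nu):
            return mu, nu      # steady state: the output of the network
        mu, nu = mu_next, nu_next
    return mu, nu              # last states if no fixed point was reached within max_iters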


