
Stark's Neural Network Algorithm

Stark's algorithm [3,4] describes the passage from a given IFS F to the corresponding neural network W. To carry out this passage, it is enough to consider a single iteration of F, i.e. the step $A_{n+1}=\bigcup_i f_i(A_n)$.

Let $X \equiv S^u \subset {\rm I\!R}^2$, where $S^u$ is the unit square digitized with respect to fixed resolutions $\Delta_x$ and $\Delta_y$. Every $A_n$ has an indicator function $y(n): S^u \rightarrow \{0,1\}$ such that:


\begin{displaymath}\forall s \in S^u: \;\; y_s(n)=
\left\{ \begin{array}{ll}
1 & \mbox{if $s \in A_n$} \\
0 & \mbox{if $s \not\in A_n$}
\end{array} \right.
\end{displaymath} (8)

If we define


\begin{displaymath}w_{ss'}=
\left\{ \begin{array}{ll}
1 & \mbox{if $f_i(s')=s$ for some $i$} \\
0 & \mbox{otherwise}
\end{array} \right.
\end{displaymath} (9)

then the dynamics of $y_s(n)$ is given by


\begin{displaymath}y_s(n+1)=g\Bigl(\sum_{s' \in S^u}w_{ss'}\,y_{s'}(n)\Bigr),
\end{displaymath} (10)

where g is a step function.

Thus we obtain a binary neural network with $|S^u|$ neurons $y_s$ and synaptic weights $w_{ss'}$ that realizes the IFS F.
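
As an illustration, the following Python sketch assembles the weight matrix of (9) and iterates the dynamics (10). The particular IFS (the three Sierpinski-triangle contractions), the grid resolution N = 32, and the number of update steps are assumptions made for the example only and are not taken from the text above.

# A minimal sketch of Stark's construction under the assumptions stated above.
import numpy as np

N = 32                                             # Delta_x = Delta_y = 1/N
cells = [(i, j) for i in range(N) for j in range(N)]
index = {s: k for k, s in enumerate(cells)}        # neuron index of each cell s

# Example IFS F = {f_1, f_2, f_3} (assumed for illustration)
ifs = [
    lambda x, y: (0.5 * x,       0.5 * y),
    lambda x, y: (0.5 * x + 0.5, 0.5 * y),
    lambda x, y: (0.5 * x,       0.5 * y + 0.5),
]

# Synaptic weights (Eq. 9): w_{ss'} = 1 if f_i(s') = s for some i, else 0
W = np.zeros((N * N, N * N), dtype=np.int8)
for sp in cells:
    xp, yp = (sp[0] + 0.5) / N, (sp[1] + 0.5) / N  # centre of cell s'
    for f in ifs:
        x, y = f(xp, yp)
        s = (min(int(x * N), N - 1), min(int(y * N), N - 1))
        W[index[s], index[sp]] = 1

def g(u):
    # Step function: a neuron fires when its summed input is positive
    return (u > 0).astype(np.int64)

# Network dynamics (Eq. 10): y(n+1) = g(W y(n)), starting with all cells on
y = np.ones(N * N, dtype=np.int64)
for _ in range(10):
    y = g(W @ y)

print(int(y.sum()), "active neurons approximate the attractor of F")

After a few iterations the set of active neurons stabilizes on the digitization of the attractor of F at the chosen resolution.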
