Parametrical Neural Networks and Some Other Similar Architectures

Aug 18, 2006

Leonid B. Litinskii

This post reviews the work on associative neural networks carried out over the last four years at the Institute of Optical Neural Technologies of the Russian Academy of Sciences, Moscow. The presentation pivots on the description of parametrical neural networks (PNN).

PNNs currently hold record recognising characteristics, including storage capacity, noise immunity, and speed of operation. This post emphasizes the basic principles and ideas behind parametrical neural networks.

Introduction

The Hopfield Model (HM) is the best-known binary auto-associative neural network: given a distorted copy of a stored binary N-dimensional pattern, it can retrieve the original. However, the storage capacity of HM is relatively small, which makes it of little use in practical applications. To tackle this issue, attempts were made in the late 1980s to improve the recognising characteristics of auto-associative memory by turning to q-nary patterns, that is, patterns whose coordinates can take q > 2 different values.
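
To make the setting concrete, here is a minimal Hopfield-network sketch in Python. The Hebbian learning rule and sign-threshold update are standard; the network size, pattern count, and noise level are illustrative choices, not taken from the source.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian rule: J = (1/N) * sum of outer products, zero diagonal."""
    N = patterns.shape[1]
    J = patterns.T @ patterns / N
    np.fill_diagonal(J, 0.0)
    return J

def retrieve(J, x, steps=20):
    """Synchronous sign-threshold updates until a fixed point is reached."""
    for _ in range(steps):
        x_new = np.sign(J @ x)
        x_new[x_new == 0] = 1          # break ties deterministically
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

rng = np.random.default_rng(0)
N, M = 200, 10                          # M well below the ~0.14 N capacity
patterns = rng.choice([-1, 1], size=(M, N))
J = train_hopfield(patterns)

noisy = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)   # distort 10% of coordinates
noisy[flip] *= -1
print(np.array_equal(retrieve(J, noisy), patterns[0]))  # usually True
```

With M far below capacity the distorted pattern is pulled back to the stored one; pushing M higher is exactly where HM breaks down, which motivates the q-nary architectures discussed next.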

The Potts-glass neural network, with its large storage capacity, is an interesting architecture for practical applications. Unfortunately, the statistical-physics approach does not explain why its storage capacity is so large.

Parametrical Neuron

An optical model of auto-associative memory was examined, capable of holding and handling data encoded in the form of phase-frequency modulation. In this network, signals propagate as quasi-monochromatic pulses at q different frequencies. The model is based on a parametrical neuron: a cubic nonlinear element that transforms and generates pulses in four-wave mixing processes. When the interconnections have a generalised Hebbian form, the storage capacity of this network also exceeds that of HM. The model was named the Parametrical Neural Network (PNN).
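
The source gives no code, but the generalised Hebbian rule for q-nary patterns can be sketched in the vector formalism that the Potts-glass network and PNN share: each of the q states is one-hot encoded, interconnections are q-by-q Hebbian blocks, and each neuron adopts the state whose encoding best matches its local field. All parameters below are illustrative assumptions.

```python
import numpy as np

def one_hot(state, q):
    v = np.zeros(q)
    v[state] = 1.0
    return v

def train_qnary(patterns, q):
    """Generalised Hebbian rule: J[i, j] is a q x q block summing
    outer products of the one-hot-encoded coordinates."""
    M, N = patterns.shape
    J = np.zeros((N, N, q, q))
    for p in patterns:
        enc = np.array([one_hot(s, q) for s in p])   # shape (N, q)
        J += np.einsum('ia,jb->ijab', enc, enc)
    for i in range(N):
        J[i, i] = 0.0                                # no self-interaction
    return J

def retrieve(J, x, q, steps=20):
    """Each neuron adopts the state whose encoding vector has the
    largest projection onto its local field."""
    for _ in range(steps):
        enc = np.array([one_hot(s, q) for s in x])
        h = np.einsum('ijab,jb->ia', J, enc)         # local fields, (N, q)
        x_new = h.argmax(axis=1)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

rng = np.random.default_rng(1)
N, q, M = 50, 8, 20
patterns = rng.integers(0, q, size=(M, N))
J = train_qnary(patterns, q)

noisy = patterns[0].copy()
idx = rng.choice(N, size=N // 5, replace=False)
noisy[idx] = rng.integers(0, q, size=idx.size)       # corrupt 20% of coordinates
print(np.array_equal(retrieve(J, noisy, q), patterns[0]))
```

Note that M = 20 patterns in a 50-neuron network would already be hopeless for a binary HM of the same size; the q-state encoding is what makes the retrieval work here.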

PNN's Characteristics

Understanding the mechanism by which internal noise is suppressed is essential, since it is this mechanism that guarantees the high recognising properties of both architectures, the Potts-glass neural network and PNN. Several variants of auto-associative PNN architectures were suggested, including one that currently holds the record storage capacity. Furthermore, a hetero-associative variant of a q-nary neural network was built that exceeds the auto-associative PNN in speed of operation.

Statistical calculations reveal that, compared to HM, the dispersion of the internal noise is q² times smaller, making the network far more noise-immune. The storage capacity grows significantly as q increases; in image-processing terms, q is the number of different colors and the neurons are the pixels of the screen.
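
A rough signal-to-noise reading of this claim, sketched below, uses the standard Hopfield capacity estimate; the resulting capacity scaling is an inference from the q² noise-suppression statement, not a formula quoted from the source.

```latex
% If the crosstalk variance drops by a factor of q^2 while the signal is
% unchanged, the number of storable patterns scales up by roughly q^2:
\[
  \sigma_{\mathrm{PNN}}^{2} \approx \frac{\sigma_{\mathrm{HM}}^{2}}{q^{2}}
  \quad\Longrightarrow\quad
  M_{\mathrm{PNN}} \sim q^{2}\, M_{\mathrm{HM}}
  \sim \frac{q^{2} N}{2 \ln N}.
\]
```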

PNN3

From the point of view of electronic device development, the variant of PNN in which the phases of the quasi-monochromatic pulses are absent is of particular interest.

In conclusion, parametrical neural networks currently offer some of the best recognising characteristics, together with concrete mechanisms for improving noise immunity and storage capacity. They are a promising direction for future research and development in the field of neural networks.
