What is a Self Organizing Neural Network (SONN)?


A Self-Organizing Neural Network (SONN) is an unsupervised learning model in Artificial Neural Networks, commonly referred to as a Self-Organizing Feature Map or Kohonen Map. It creates a feature mapping by discretizing the input space onto a two-dimensional grid during training, based on competitive learning. This behavior is similar to that seen in biological systems, which is where the neural network name comes from.

In the human cortex, multi-dimensional sensory input spaces (e.g., auditory, motor, tactile, visual, somatosensory, etc.) are represented by two-dimensional maps. This projection of high dimensional inputs to lower dimensional maps is known as topology conservation, which can be achieved using Self-Organizing Networks.

Why is a SONN necessary?

Self-Organizing Maps are used to cluster and visualize high-dimensional data in a lower-dimensional space, making structure in the data easier to inspect.

The architecture of a SONN

  • Layers: SONN consists of two layers: an input layer that is fully connected and an output (map) layer, known as the Kohonen Layer.
  • Intralayer Connections: All the neurons in the output layer are connected in a local neighborhood with a specific topology. These unweighted lateral connections are responsible for competitive learning.
  • Lateral Feedback Connections: These connections generate excitatory and inhibitory effects based on the distance.
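The two-layer architecture above can be sketched in code. In a minimal representation, the fully connected input layer simply means each map-layer neuron holds a weight vector of the same length as the input; the dimensions below (a 10 x 10 map, 3-dimensional inputs) are hypothetical choices for illustration.

```python
import numpy as np

# Hypothetical sizes for illustration: a 10x10 Kohonen (map) layer
# receiving 3-dimensional input vectors (e.g., RGB colors).
map_rows, map_cols, input_dim = 10, 10, 3

# Full input-to-map connectivity is captured by giving every map neuron
# its own weight vector, the same length as an input sample.
rng = np.random.default_rng(0)
weights = rng.random((map_rows, map_cols, input_dim))

print(weights.shape)  # (10, 10, 3)
```

The lateral connections are not stored explicitly here; their excitatory/inhibitory effect is usually realized through a neighborhood function during training, as described in the algorithm steps below.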

The Phases of SONN

1. Learning Phase

This phase is used to construct the network maps and requires a competitive process with training samples.

2. Prediction Phase

This phase is used to classify new data and provides a specific location on the converged map.
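A minimal sketch of the prediction phase: a new sample is assigned the grid position of its best matching unit (BMU) on the converged map. The `weights` array and the Euclidean distance measure here are assumptions for illustration, not part of the original text.

```python
import numpy as np

# Stand-in for a trained 10x10 map with 3-dimensional weight vectors.
rng = np.random.default_rng(1)
weights = rng.random((10, 10, 3))

def predict(weights, x):
    # Euclidean distance from the sample to every neuron's weight vector.
    dists = np.linalg.norm(weights - x, axis=-1)
    # Grid coordinates of the winning (closest) neuron.
    return np.unravel_index(np.argmin(dists), dists.shape)

row, col = predict(weights, np.array([0.2, 0.7, 0.1]))
print(row, col)
```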

Steps of the SONN Algorithm

The SONN algorithm can be summarized in four steps:

1. Initialization: Initialize the weights of the neurons in the map layer, typically with small random values.

2. Competitive process: Select one input sample and identify the best matching unit (BMU) among all neurons in an n x m grid using distance measures.

3. Cooperative process: Find the proximity neurons of the BMU by a neighborhood function.

4. Adaptation process: Shift the weights of the BMU and its neighbors towards the input pattern. The process is complete if the maximum number of training iterations has been reached. Otherwise, increment the iteration count and repeat the process from step two.
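The four steps above can be sketched as a compact training loop. This is a minimal illustration, assuming Euclidean distance, a Gaussian neighborhood function, and exponentially decaying learning rate and radius; these are common choices but not the only ones, and all sizes and hyperparameters below are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(42)
rows, cols, dim = 8, 8, 3
data = rng.random((200, dim))            # toy training samples
weights = rng.random((rows, cols, dim))  # step 1: initialization

# Grid coordinates of every neuron, used to measure map-space distance.
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                            indexing="ij"), axis=-1)

n_iters, lr0, sigma0 = 1000, 0.5, max(rows, cols) / 2

for t in range(n_iters):
    lr = lr0 * np.exp(-t / n_iters)        # decaying learning rate
    sigma = sigma0 * np.exp(-t / n_iters)  # shrinking neighborhood radius

    # Step 2 (competitive): pick a sample and find its BMU.
    x = data[rng.integers(len(data))]
    dists = np.linalg.norm(weights - x, axis=-1)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)

    # Step 3 (cooperative): Gaussian neighborhood around the BMU.
    grid_dist_sq = np.sum((grid - np.array(bmu)) ** 2, axis=-1)
    h = np.exp(-grid_dist_sq / (2 * sigma ** 2))

    # Step 4 (adaptation): shift BMU and neighbors toward the input.
    weights += lr * h[..., None] * (x - weights)
```

After enough iterations, each sample should lie close to some neuron's weight vector (a low quantization error), which is the sign that the map has organized itself around the data.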

Disadvantages of SONN

SONNs have notable advantages, but these come at a cost. Below are some of the disadvantages of the SONN:

  • Kohonen Networks underperform on categorical data, especially datasets with mixed types.
  • A SONN cannot provide a generative model for the data.
  • The model cannot adapt to slowly evolving data, since the map is fixed after training.
  • The learning phase is time-consuming.
