A Self-Organizing Map (SOM), also called a Kohonen Map, is an unsupervised neural network algorithm inspired by biological neural models from the 1970s. It uses a competitive learning approach and is primarily designed for clustering and dimensionality reduction. A SOM maps high-dimensional data onto a lower-dimensional grid, making complex datasets easier to interpret and visualize.
It consists of two primary layers:
- Input layer: holds the input vectors, with one node for each feature of the data.
- Output (Kohonen) layer: a grid of neurons, each carrying a weight vector of the same dimension as the input; these neurons compete to represent each input.
Here is a step-by-step explanation of how it works:
1. Initialization
The weights of the output neurons are randomly initialized. These weights represent the features of each neuron and will be adjusted during training.
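For illustration, the random initialization could look like the minimal sketch below (the sizes of 2 neurons and 4 features mirror the example used later in this article; the use of Python's random module is our own illustrative choice, not part of the article's code):
Python
import random

# Illustrative sizes: 2 output neurons (clusters), 4-dimensional inputs
n_clusters = 2
n_features = 4

# Each output neuron starts with a random weight vector in [0, 1)
weights = [[random.random() for _ in range(n_features)] for _ in range(n_clusters)]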
2. Competition
For each input vector, the SOM computes the squared Euclidean distance between the input and the weight vector of every output neuron. The neuron with the smallest distance is the winning neuron.
Formula: D(j) = \sum_{i=1}^{n} (w_{ij} - x_i)^2
where
- D(j) is the distance of the input from output neuron j,
- x_i is the i-th component of the input vector,
- w_{ij} is the weight connecting input component i to neuron j,
- n is the number of input features.
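As a minimal sketch of the competition step (assuming plain Python lists for the weights and the input; find_winner is an illustrative helper, not the winner() method defined later in this article):
Python
def find_winner(weights, sample):
    # Index of the neuron whose weight vector has the smallest squared distance to the sample
    best_index, best_distance = 0, float("inf")
    for j, w in enumerate(weights):
        distance = sum((w_i - x_i) ** 2 for w_i, x_i in zip(w, sample))
        if distance < best_distance:
            best_index, best_distance = j, distance
    return best_index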
3. Weight Update
The winning neuron's weights are updated to move closer to the input vector. The weights of neighboring neurons are also adjusted, but with smaller changes.
Formula: w_{ij}^{(new)} = w_{ij}^{(old)} + \alpha \cdot (x_i - w_{ij}^{(old)})
where
- w_{ij}^{(old)} and w_{ij}^{(new)} are the weights of the winning neuron before and after the update,
- x_i is the i-th component of the input vector,
- \alpha is the learning rate.
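A minimal sketch of this update rule (update_weights is an illustrative helper, not the update() method defined later; only the winning neuron is moved here, since the two-cluster example in this article uses no neighborhood):
Python
def update_weights(weights, sample, winner, alpha):
    # Move the winning neuron's weight vector a fraction alpha toward the sample
    for i in range(len(sample)):
        weights[winner][i] += alpha * (sample[i] - weights[winner][i])
    return weights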
4. Learning Rate Decay
The learning rate \alpha decreases over time, allowing the map to converge to stable values.
Formula: \alpha(t+1) = 0.5 \cdot \alpha(t)
5. Stopping Condition
The training stops when the maximum number of epochs is reached or when the weights converge.
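Putting steps 2-5 together, a compact training loop might look like the sketch below. It reuses the illustrative find_winner and update_weights helpers from above, halves \alpha after every epoch as in step 4 and stops early if the weights barely change; the tolerance tol is our own illustrative choice, not something prescribed by the algorithm:
Python
def train(weights, data, alpha=0.5, max_epochs=10, tol=1e-6):
    for epoch in range(max_epochs):
        max_change = 0.0
        for sample in data:
            j = find_winner(weights, sample)           # step 2: competition
            before = list(weights[j])
            update_weights(weights, sample, j, alpha)  # step 3: weight update
            max_change = max(max_change, max(abs(b - a) for b, a in zip(before, weights[j])))
        alpha *= 0.5                                   # step 4: learning rate decay
        if max_change < tol:                           # step 5: stopping condition
            break
    return weights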
Implementation of SOM in Python
Now let’s walk through a Python implementation of the SOM algorithm. The code is divided into blocks for clarity.
1. Importing Required Libraries
We will use the math library to compute the Euclidean distance between the input vector and the weight vector.
Python
import math
2. Defining the SOM Class
In this class, we define two important methods: winner(), which finds the winning neuron by computing the squared Euclidean distance between the input and each cluster's weight vector, and update(), which adjusts the weight vector of the winning neuron according to the weight update rule.
Python
class SOM:
    # Compute the winning cluster (0 or 1) for a given input sample
    def winner(self, weights, sample):
        D0 = 0
        D1 = 0
        for i in range(len(sample)):
            D0 += math.pow((sample[i] - weights[0][i]), 2)
            D1 += math.pow((sample[i] - weights[1][i]), 2)
        return 0 if D0 < D1 else 1

    # Move the winning cluster's weight vector closer to the input sample
    def update(self, weights, sample, J, alpha):
        for i in range(len(weights[0])):
            weights[J][i] = weights[J][i] + alpha * (sample[i] - weights[J][i])
        return weights
3. Defining the Main Function
In this section, we define the training data and initialize the weights. We also specify the number of epochs and the learning rate.
Python
def main():
    # Training examples: 4 samples with 4 features each
    T = [[1, 1, 0, 0], [0, 0, 0, 1], [1, 0, 0, 0], [0, 0, 1, 1]]
    m, n = len(T), len(T[0])

    # Initial weight vectors of the two clusters
    weights = [[0.2, 0.6, 0.5, 0.9], [0.8, 0.4, 0.7, 0.3]]

    ob = SOM()
    epochs = 3
    alpha = 0.5
4. Training the SOM Network
Here, we loop over the training examples for the specified number of epochs. For each epoch and training sample, we compute the winning cluster and then update its weights.
Python
# Inside the "main" function
for i in range(epochs):
    for j in range(m):
        sample = T[j]
        J = ob.winner(weights, sample)
        weights = ob.update(weights, sample, J, alpha)
5. Classifying Test Sample
After training the SOM network, we classify a test sample s by finding the cluster whose weight vector is closest to it. Finally, we print the cluster assignment and the trained weights of each cluster.
Python
# Inside the "main" function
s = [0, 0, 0, 1]
J = ob.winner(weights, s)
print("Test Sample s belongs to Cluster: ", J)
print("Trained weights: ", weights)
6. Running the Main Function
The following block runs the main() function when the script is executed.
Python
if __name__ == "__main__":
    main()
Output:
The output will display which cluster the test sample belongs to and the final trained weights of the clusters.
Test Sample s belongs to Cluster: 0
Trained weights: [[0.6000000000000001, 0.8, 0.5, 0.9], [0.3333984375, 0.0666015625, 0.7, 0.3]]