What are self-organizing maps (SOMs)?
Self-organizing maps (SOMs) are a type of unsupervised artificial neural network trained with a competitive learning algorithm. They map high-dimensional data onto a lower-dimensional representation, making the data easier to understand by reducing its complexity. The model consists of two layers: an input layer and an output layer.
The structure of a simple SOM is provided below. Clusters are marked as V, inputs as X, and weights as W.
Process of selection
- We initialize the weights by randomly picking numbers.
- We select a training example randomly.
- We select the winning vector, i.e., the weight vector at the shortest Euclidean distance from the sample.
- Then, we update the weights of the vectors.
- We repeat from step 2 until the process of selection is completed.
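The steps above can be sketched in plain Python. This is a minimal illustration of the selection loop: the number of clusters, the samples, the learning rate, and the iteration count are assumed values for demonstration, not prescribed by the algorithm.

```python
import math
import random

random.seed(0)  # for reproducibility

n_features, n_clusters = 4, 2
# Step 1: initialize the weights randomly
weights = [[random.random() for _ in range(n_features)]
           for _ in range(n_clusters)]
samples = [[1, 1, 0, 0], [0, 0, 1, 1]]  # illustrative training examples
alpha = 0.5                             # illustrative learning rate

for _ in range(10):  # Step 5: repeat the selection process
    # Step 2: select a training example randomly
    x = random.choice(samples)
    # Step 3: the winning vector is the one at the shortest Euclidean distance
    K = min(range(n_clusters), key=lambda j: math.dist(x, weights[j]))
    # Step 4: move the winning vector's weights toward the sample
    weights[K] = [w + alpha * (xi - w) for w, xi in zip(weights[K], x)]

print(weights)
```

Each update pulls the winning weight vector a fraction `alpha` of the way toward the sample, so the weights gradually settle near the cluster centers.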
The formula for the Euclidean distance between a training example and a weight vector is as follows:

D(j) = √( Σᵢ (xᵢ − wᵢⱼ)² )

The winning vector, the one with the smallest D(j), is then updated with:

wᵢⱼ(t + 1) = wᵢⱼ(t) + α(t) · (xᵢ − wᵢⱼ(t))

The learning rate at time t is denoted here by α(t). The winning vector is denoted by wᵢⱼ, the weight connecting the iᵗʰ input to the jᵗʰ cluster, and xᵢ is the iᵗʰ component of the training example. After the completion of the process, a winning vector is selected.
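In code, the Euclidean distance of a sample to one weight vector can be computed directly; the sample and weight values below are illustrative.

```python
import math

# Illustrative sample and weight vector (not from the text)
x = [1, 1, 1, 0]
w = [0.2, 0.6, 0.3, 0.9]

# D = sqrt(sum over i of (x_i - w_i)^2)
D = math.sqrt(sum((xi - wi) ** 2 for xi, wi in zip(x, w)))
print(D)  # ≈ 1.449
```

The cluster whose weight vector yields the smallest such distance is the winner for that sample.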
Code example
The Python implementation for the SOM process is provided below:
```python
class SOM:
    # Find the winning vector (shortest Euclidean distance)
    def winner(self, weights, sample):
        D0, D1 = 0, 0
        for i in range(len(sample)):
            D0 += (sample[i] - weights[0][i]) ** 2
            D1 += (sample[i] - weights[1][i]) ** 2
        return 0 if D0 < D1 else 1

    # Update the winning vector
    def update(self, weights, sample, K, alpha):
        for i in range(len(weights[K])):
            weights[K][i] += alpha * (sample[i] - weights[K][i])
        return weights

def main():
    # Initialization of weights
    weights = [[0.2, 0.6, 0.3, 0.9], [0.5, 0.2, 0.7, 0.3]]
    # Training examples
    T = [[1, 1, 1, 0], [0, 1, 0, 1], [0, 0, 0, 1], [0, 1, 0, 1]]
    ob = SOM()
    epochs, alpha = 3, 0.4
    for _ in range(epochs):
        for sample in T:
            K = ob.winner(weights, sample)                  # compute the winning vector
            weights = ob.update(weights, sample, K, alpha)  # update the winning vector
    # Classify a test sample
    s = [0, 1, 0, 0]
    K = ob.winner(weights, s)
    print("Test sample", s, "belongs to cluster", K)
    print("The trained weights are:", *weights, sep="\n")

main()
```
Code explanation
- Lines 1–8: We create a class `SOM` with a `winner` function that applies the Euclidean distance to both weight vectors and returns the index of the closer one.
- Lines 10–14: We create an `update` function that moves the winning vector's weight values toward the sample.
- Lines 17–20: We initialize the `weights` and the `T` array of training examples.
- Line 21: We create an object of `SOM` as `ob`.
- Line 22: We initialize the `epochs` and `alpha` values.
- Lines 23–26: We run a nested loop over the epochs and the training samples, where we first select a training sample, compute the winning vector, and then update the winning vector in our list.
- Lines 27–31: We classify a test sample and print the trained weights.
- Line 33: We run the `main` driver.