Artificial Neural Networks/Hebbian Learning

Hebbian Learning

Hebbian learning is one of the oldest learning algorithms, and is based in large part on the dynamics of biological systems. A synapse between two neurons is strengthened when the neurons on either side of the synapse (input and output) have highly correlated outputs. In essence, when an input neuron fires, if it frequently leads to the firing of the output neuron, the synapse is strengthened. Following the analogy to an artificial system, the tap weight is increased with high correlation between two sequential neurons.

Mathematical Formulation

Mathematically, we can describe Hebbian learning as:

w_{ij}[n + 1] = w_{ij}[n] + \eta x_i[n]x_j[n]

Here, η is a learning rate coefficient, w_{ij} is the tap weight connecting the ith and jth elements, and x_i and x_j are the outputs of those elements at time step n.
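The update rule above can be sketched in a few lines of NumPy. This is a minimal illustration, not a reference implementation; the function name and default learning rate are our own choices.

```python
import numpy as np

def hebbian_update(w, x, eta=0.1):
    """One Hebbian learning step (illustrative sketch).

    w   -- tap-weight matrix, w[i][j] connecting elements i and j
    x   -- vector of element outputs at time step n
    eta -- learning rate coefficient (eta in the update rule)
    """
    # The outer product x x^T gives every pairwise product x_i * x_j,
    # so each weight w_ij grows in proportion to the correlation
    # between the two outputs, exactly as in the update rule above.
    return w + eta * np.outer(x, x)

w = np.zeros((3, 3))
x = np.array([1.0, 0.0, 1.0])  # elements 0 and 2 fire together
w = hebbian_update(w, x, eta=0.5)
# The weight between the co-active elements (0 and 2) is strengthened,
# while weights touching the silent element (1) remain zero.
```

Note that, with this rule alone, weights only ever grow; practical variants add normalization or decay to keep them bounded.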

Plausibility

The Hebbian learning algorithm is performed locally, and doesn't take into account the overall system input-output characteristic. This locality makes it a plausible theory for biological learning methods, and also makes Hebbian learning processes well suited to VLSI hardware implementations, where local signals are much easier to obtain than global ones.

Last modified on 1 September 2011, at 10:47