Artificial Neural Networks/Hopfield Networks

Hopfield Networks

Hopfield networks are one of the oldest and simplest neural network architectures, and they are characterized by a network energy function. The activation function of a binary Hopfield network is given by the signum function of a biased weighted sum:

y_i = \operatorname{sgn}\!\left( \sum_{j} w_{ij}\, y_j + b_i \right)

Hopfield networks are frequently binary-valued, although continuous variants do exist. Binary networks are useful for classification and clustering purposes.
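As an illustration of this update rule, here is a minimal NumPy sketch; the array names y, W, b and the helper update_unit are hypothetical, not part of the text above. A unit is simply set to the sign of the biased weighted sum of the current outputs.

```python
import numpy as np

def sgn(x):
    """Signum with the convention sgn(0) = +1, so unit states stay in {-1, +1}."""
    return 1 if x >= 0 else -1

def update_unit(y, W, b, i):
    """Set unit i to the sign of its biased weighted sum of the current outputs."""
    y = y.copy()
    y[i] = sgn(W[i] @ y + b[i])
    return y
```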

Energy Function

The energy function for the network is given as:

E = -\frac{1}{2} \sum_{i} \sum_{j} w_{ij}\, y_i\, y_j - \sum_{i} b_i\, y_i

Here, y_i and y_j are the outputs of the ith and jth units, and b_i is the bias of the ith unit. As the network updates its state, the energy decreases until it reaches a local minimum; such a minimum is known as an attractor of the network. Because the dynamics minimize the energy automatically, mathematical minimization or optimization problems can be solved by a Hopfield network whenever the problem can be formulated in terms of the network energy.
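The sketch below, continuing the same hypothetical NumPy notation, computes this energy and repeatedly applies the asynchronous single-unit update. Assuming the usual Hopfield conditions of a symmetric weight matrix with zero self-connections, each update can only lower or preserve the energy, so the loop halts at a local minimum, i.e. an attractor.

```python
import numpy as np

def energy(y, W, b):
    """E = -1/2 * sum_ij w_ij y_i y_j  -  sum_i b_i y_i."""
    return -0.5 * y @ W @ y - b @ y

def settle(y, W, b, max_sweeps=100):
    """Asynchronously update one unit at a time until no unit changes."""
    y = y.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(y)):
            new = 1 if W[i] @ y + b[i] >= 0 else -1
            if new != y[i]:
                y[i], changed = new, True  # each flip lowers E when W is symmetric with zero diagonal
        if not changed:
            return y                       # fixed point: a local energy minimum (an attractor)
    return y
```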

Associative Memory

Hopfield networks can be used as associative memories for data storage. Each attractor represents a different stored data value, and any sufficiently similar (noisy or partial) pattern presented to the network will converge to that attractor, thereby retrieving the stored pattern. The number of distinct patterns p that can be stored in such a network is given approximately as:

p \approx 0.15\, n

Where n is the number of neurons in the network.
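A minimal, self-contained sketch of such an associative memory, assuming the standard Hebbian (outer-product) storage rule with bipolar patterns and no biases; the helper names store_patterns and recall are illustrative, not taken from the text.

```python
import numpy as np

def store_patterns(patterns):
    """Hebbian (outer-product) rule: W = (1/n) * sum over patterns of x x^T, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(probe, W, sweeps=20):
    """Asynchronously update the probe until it settles on a stored attractor."""
    y = probe.copy()
    for _ in range(sweeps):
        for i in range(len(y)):
            y[i] = 1 if W[i] @ y >= 0 else -1
    return y

# Store two 8-unit bipolar patterns, then retrieve one from a corrupted cue.
patterns = np.array([[ 1, -1,  1, -1,  1, -1,  1, -1],
                     [ 1,  1,  1,  1, -1, -1, -1, -1]])
W = store_patterns(patterns)

probe = patterns[0].copy()
probe[0] = -probe[0]            # flip one bit to simulate a noisy cue
print(recall(probe, W))         # converges back to the first stored pattern
```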