The Restricted Boltzmann Machine

Learn about the restricted Boltzmann machine and contrastive Hebbian learning.

The restricted Boltzmann machine and contrastive Hebbian learning

Training the Boltzmann machine with the rule stated in the previous lesson is challenging because the states of the nodes are constantly changing.

Even with the visible states clamped, the states of the hidden nodes keep changing, for two reasons:

  • The update rule is probabilistic, so even when the activity of the visible nodes is held constant, the hidden nodes can take different states from one update to the next.

  • The recurrent connections between hidden nodes can change the states of the hidden nodes rapidly and generate rich dynamics in the system.
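The first point above can be illustrated with a minimal sketch. The network size, weight values, and clamped input below are hypothetical, chosen only to show that repeated probabilistic updates of the hidden nodes produce different states even though the visible input never changes:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical small network: 4 visible nodes clamped to fixed data,
# 3 hidden nodes updated stochastically.
W = rng.normal(scale=0.5, size=(4, 3))   # visible-to-hidden weights (assumed values)
v = np.array([1.0, 0.0, 1.0, 1.0])       # clamped visible state

# Repeated probabilistic updates: the same clamped input yields
# different hidden states on different sweeps.
samples = []
for _ in range(5):
    p_h = sigmoid(v @ W)                 # activation probability of each hidden node
    h = (rng.random(3) < p_h).astype(int)
    samples.append(h)

print(samples)
```

Because each hidden node fires with a probability rather than deterministically, the five sampled hidden vectors generally differ from one another.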

We certainly want to keep the probabilistic update rule, since the system must be able to generate different responses to the same sensory data. However, we can simplify the system by eliminating the recurrent connections within each layer, while keeping the bidirectional connections between the layers. Although omitting these collateral connections is a potentially severe simplification, the abilities of general recurrent networks with hidden nodes can be recovered by stacking many layers, which reintroduce indirect recurrences. A restricted Boltzmann machine (RBM) is shown in the figure below:
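The payoff of removing within-layer connections can be sketched in code. In the hypothetical RBM below (sizes, weights, and sweep count are illustrative assumptions), every hidden node depends only on the visible layer and vice versa, so each layer can be sampled in a single parallel step, alternating between the two layers:

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical RBM: no connections within a layer, so all nodes in one
# layer are conditionally independent given the other layer.
n_vis, n_hid = 6, 4
W = rng.normal(scale=0.1, size=(n_vis, n_hid))  # bidirectional weights (assumed)
b_v = np.zeros(n_vis)                            # visible biases
b_h = np.zeros(n_hid)                            # hidden biases

def sample_hidden(v):
    # Each hidden node sees only the visible layer, so the whole
    # hidden layer is sampled together in one step.
    p = sigmoid(v @ W + b_h)
    return (rng.random(n_hid) < p).astype(float)

def sample_visible(h):
    p = sigmoid(h @ W.T + b_v)
    return (rng.random(n_vis) < p).astype(float)

v = rng.integers(0, 2, size=n_vis).astype(float)
for _ in range(10):          # alternate between the layers
    h = sample_hidden(v)
    v = sample_visible(h)

print(v, h)
```

With recurrent connections inside a layer, each node would also depend on its neighbors in the same layer, and this simple two-step alternation would no longer be possible.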
