Restricted Boltzmann Machines
Learn how simple networks can “learn” the distribution of image data and serve as building blocks for larger networks.
The neural network model that we will apply to the MNIST data has its origins in earlier research on how neurons in the mammalian brain might work together to transmit signals and encode patterns as memories. As we discussed earlier, Hebbian learning states, “Neurons that fire together, wire together.”
One of these models was the Hopfield network, introduced by John Hopfield in 1982: a fully connected network of binary neurons in which every pair of neurons is joined by a symmetric weight ($w_{ij} = w_{ji}$).
The neurons in the Hopfield network take on binary values, either $+1$ or $-1$, computed by thresholding the weighted sum of the inputs from all the other neurons:

\[
s_i = \begin{cases} +1 & \text{if } \sum_j w_{ij} s_j \ge \sigma_i \\ -1 & \text{otherwise} \end{cases}
\]
The threshold values ($\sigma$) never change during training; to update the weights, a Hebbian approach is to use a set of $n$ binary patterns (configurations of all the neurons) that we want the network to store:

\[
w_{ij} = \frac{1}{n} \sum_{k=1}^{n} x_i^k x_j^k
\]

where $x_i^k$ is the value of neuron $i$ in pattern $k$. Under this rule, the weight between two neurons grows when they tend to take the same value across the stored patterns and shrinks when they tend to disagree, a direct formalization of “fire together, wire together.”
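To make the update and training rules concrete, here is a minimal sketch of a Hopfield network in NumPy. The function names (`train_hopfield`, `recall`) and the toy patterns are our own illustration, not part of any library: it stores a set of $\pm 1$ patterns with the Hebbian rule above, then recovers a stored pattern from a corrupted copy by repeatedly applying the threshold update.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian rule: w_ij = (1/n) * sum_k x_i^k * x_j^k."""
    n_patterns = patterns.shape[0]
    weights = patterns.T @ patterns / n_patterns
    np.fill_diagonal(weights, 0.0)  # no self-connections
    return weights

def recall(weights, state, thresholds=None, n_steps=10, seed=0):
    """Asynchronous updates: s_i = +1 if sum_j w_ij s_j >= sigma_i, else -1."""
    state = state.copy()
    if thresholds is None:
        thresholds = np.zeros(len(state))  # thresholds are fixed, not trained
    rng = np.random.default_rng(seed)
    for _ in range(n_steps):
        for i in rng.permutation(len(state)):  # visit neurons in random order
            state[i] = 1.0 if weights[i] @ state >= thresholds[i] else -1.0
    return state

# Store two orthogonal 8-neuron patterns, then recover one from a noisy copy.
patterns = np.array([[ 1, -1,  1, -1,  1, -1,  1, -1],
                     [ 1,  1,  1,  1, -1, -1, -1, -1]], dtype=float)
weights = train_hopfield(patterns)
noisy = patterns[0].copy()
noisy[:2] *= -1                # flip two bits
print(recall(weights, noisy))  # converges back to the first stored pattern
```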
Besides representing biological memory, Hopfield networks also have an interesting parallel to physics, specifically the Ising model of magnetic spins. If we consider each neuron as a particle or spin, we can describe the model in terms of a free energy equation that captures how the units of the system mutually attract or repel one another, and where the current configuration lies, relative to equilibrium, among the distribution of possible configurations:
\[
E = -\frac{1}{2} \sum_{i,j} w_{ij} s_i s_j + \sum_i \sigma_i s_i
\]

where $w_{ij}$ is the weight between neurons $i$ and $j$, $s_i$ and $s_j$ are the states of those neurons, and $\sigma_i$ is the threshold of neuron $i$. Low-energy configurations are the stable ones: patterns stored by the Hebbian rule sit at local minima of $E$, so recalling a memory corresponds to the network descending in energy from a corrupted configuration to the nearest stored pattern.
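To check this numerically, the short sketch below (reusing the Hebbian weights from the previous example; `hopfield_energy` is our own illustrative helper, not a library function) evaluates $E$ for a stored pattern and for a corrupted copy of it. The stored pattern sits at lower energy, consistent with its role as a stable memory:

```python
import numpy as np

def hopfield_energy(weights, state, thresholds=None):
    """E = -1/2 * sum_ij w_ij s_i s_j + sum_i sigma_i s_i."""
    if thresholds is None:
        thresholds = np.zeros(len(state))
    return -0.5 * state @ weights @ state + thresholds @ state

# Hebbian weights for the same two patterns as before.
patterns = np.array([[ 1, -1,  1, -1,  1, -1,  1, -1],
                     [ 1,  1,  1,  1, -1, -1, -1, -1]], dtype=float)
weights = patterns.T @ patterns / len(patterns)
np.fill_diagonal(weights, 0.0)

noisy = patterns[0].copy()
noisy[:2] *= -1  # flip two bits

print(hopfield_energy(weights, patterns[0]))  # -12.0: stored pattern, local minimum
print(hopfield_energy(weights, noisy))        #   0.0: corrupted copy, higher energy
```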