Restricted Boltzmann Machines (RBM)

RBMs are shallow, two-layer neural networks, trained as an Unsupervised Learning algorithm, with no intra-layer connections; they are the building blocks of Deep Belief Networks (DBN). They are a special case of Boltzmann Machines (BM): where a full BM is completely connected (like a Hopfield Network (HN)), an RBM only allows connections between the two layers, so its units form a bipartite graph. This restriction makes RBMs far more tractable to train and use.
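A minimal sketch of that bipartite structure in NumPy, assuming binary units (the layer sizes and initialization are illustrative, not prescribed by any particular implementation). The only weights are between the visible and hidden layers, which is exactly the "restriction":

```python
import numpy as np

rng = np.random.default_rng(0)

n_visible, n_hidden = 6, 3  # hypothetical layer sizes
W = rng.normal(0.0, 0.1, (n_visible, n_hidden))  # inter-layer weights only
b = np.zeros(n_visible)  # visible biases
c = np.zeros(n_hidden)   # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

v = rng.integers(0, 2, n_visible).astype(float)  # one binary visible vector

# Because the graph is bipartite, the hidden units are conditionally
# independent given the visible layer: p(h_j = 1 | v) = sigmoid(c_j + (v @ W)_j)
p_h = sigmoid(c + v @ W)
```

Note there is no `n_visible x n_visible` or `n_hidden x n_hidden` weight matrix anywhere; a full BM (or an HN) would have those intra-layer weights too.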

TL;DR

An RBM is a Boltzmann Machine (BM) with the restriction that no connections exist between units within the same layer; connections run only between units in different layers.

RBMs are trained differently from Feed-Forward Neural Networks (FFNN): instead of a forward pass followed by back-propagation of an error signal, data is passed forward from the visible layer to the hidden layer and then passed backward to reconstruct the visible layer. The weights are then updated from the difference between the original input and its reconstruction (Contrastive Divergence).


Notes:


Applications: