Obtaining invariant results between data and its variants under certain kinds of transformation is a useful property in machine learning and computer vision. Previous researchers have empirically shown that the Deep Belief Network (DBN) has some degree of invariance, but this behavior still lacks a sound theoretical explanation. In this paper, we study the invariance of the Restricted Boltzmann Machine (RBM), the building block of the DBN, from its stationary distribution via Information Geometry (IG) theory. This differs from previous works, which focused on the states of the latent variables (as features) in the hidden layer of the RBM.

We propose an invariance measurement based on Information Geometry theory. Using this measurement, we show theoretically and empirically that the single-layer Boltzmann Machine (SBM) exhibits invariance when data and its variants are similar in local information. We also empirically show that the RBM achieves a better degree of invariance compared with the SBM. We expect our results to inspire future explanations of the invariance of the DBN.