TY - JOUR
T1 - A Novel Restricted Boltzmann Machine Training Algorithm with Fast Gibbs Sampling Policy
AU - Wang, Qianglong
AU - Gao, Xiaoguang
AU - Wan, Kaifang
AU - Li, Fei
AU - Hu, Zijian
N1 - Publisher Copyright:
© 2020 Qianglong Wang et al.
PY - 2020
Y1 - 2020
N2 - The restricted Boltzmann machine (RBM) is one of the most widely used basic models in deep learning. Although many indexes are available for evaluating RBM training algorithms, classification accuracy is the most convincing, because it most directly reflects the quality of the trained model. RBM training algorithms are essentially sampling algorithms based on Gibbs sampling, and studies on algorithmic improvements have mainly struggled to raise the classification accuracy of the trained RBM. To address this problem, we propose a fast Gibbs sampling (FGS) algorithm that learns the RBM by adding accelerated weights and adjustment coefficients. Based on Gibbs sampling theory, an important link is established between the update of the network weights and the mixing rate of the Gibbs sampling chain, and the proposed FGS method accelerates this mixing rate through the added accelerated weights and adjustment coefficients. To validate the FGS method, numerous experiments were performed on standard datasets to compare it with classical RBM training algorithms. The results show that the proposed FGS method outperforms the CD, PCD, PT5, PT10, and DGS algorithms, particularly on the handwriting database. These findings suggest potential applications of FGS to real-world problems and demonstrate that the proposed method builds an improved RBM for classification.
AB - The restricted Boltzmann machine (RBM) is one of the most widely used basic models in deep learning. Although many indexes are available for evaluating RBM training algorithms, classification accuracy is the most convincing, because it most directly reflects the quality of the trained model. RBM training algorithms are essentially sampling algorithms based on Gibbs sampling, and studies on algorithmic improvements have mainly struggled to raise the classification accuracy of the trained RBM. To address this problem, we propose a fast Gibbs sampling (FGS) algorithm that learns the RBM by adding accelerated weights and adjustment coefficients. Based on Gibbs sampling theory, an important link is established between the update of the network weights and the mixing rate of the Gibbs sampling chain, and the proposed FGS method accelerates this mixing rate through the added accelerated weights and adjustment coefficients. To validate the FGS method, numerous experiments were performed on standard datasets to compare it with classical RBM training algorithms. The results show that the proposed FGS method outperforms the CD, PCD, PT5, PT10, and DGS algorithms, particularly on the handwriting database. These findings suggest potential applications of FGS to real-world problems and demonstrate that the proposed method builds an improved RBM for classification.
UR - http://www.scopus.com/inward/record.url?scp=85082801783&partnerID=8YFLogxK
U2 - 10.1155/2020/4206457
DO - 10.1155/2020/4206457
M3 - Article
AN - SCOPUS:85082801783
SN - 1024-123X
VL - 2020
JO - Mathematical Problems in Engineering
JF - Mathematical Problems in Engineering
M1 - 4206457
ER -
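
The abstract above positions FGS against Gibbs-sampling-based trainers such as one-step contrastive divergence (CD). As background, the sketch below is a minimal, hedged illustration of CD-1 training of an RBM via Gibbs sampling in Python/NumPy. The record does not detail the FGS-specific accelerated weights and adjustment coefficients, so they are not reproduced here; all class names, variable names, and hyperparameters are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of RBM training with one-step contrastive divergence (CD-1),
# the Gibbs-sampling baseline the FGS method is compared against.
# All names and hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, n_visible, n_hidden, lr=0.05):
        self.W = rng.normal(0, 0.01, size=(n_visible, n_hidden))  # weight matrix
        self.b = np.zeros(n_visible)   # visible biases
        self.c = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    def sample_h(self, v):
        # P(h = 1 | v) and a binary sample from it
        p = sigmoid(v @ self.W + self.c)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        # P(v = 1 | h) and a binary sample from it
        p = sigmoid(h @ self.W.T + self.b)
        return p, (rng.random(p.shape) < p).astype(float)

    def cd1_update(self, v0):
        # Positive phase: hidden statistics given the data.
        ph0, h0 = self.sample_h(v0)
        # Negative phase: one Gibbs step v0 -> h0 -> v1 -> h1.
        pv1, v1 = self.sample_v(h0)
        ph1, _ = self.sample_h(v1)
        # Approximate gradient: data statistics minus reconstruction statistics.
        n = v0.shape[0]
        self.W += self.lr * (v0.T @ ph0 - v1.T @ ph1) / n
        self.b += self.lr * (v0 - v1).mean(axis=0)
        self.c += self.lr * (ph0 - ph1).mean(axis=0)

# Toy usage on random binary data (a stand-in for a handwritten-digit batch).
rbm = RBM(n_visible=784, n_hidden=64)
batch = (rng.random((32, 784)) < 0.5).astype(float)
for _ in range(10):
    rbm.cd1_update(batch)
```

The one-step Gibbs chain in `cd1_update` is where methods such as PCD, PT, and the paper's FGS differ: they change how the negative-phase samples are drawn so that the chain mixes faster or persists across updates.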