TY - JOUR
T1 - Optimizing structural parameters for CMAC (Cerebellar Model Articulation Controller) neural network
AU - Yu, Weiwei
AU - Yan, Jie
AU - Sabourin, C.
AU - Madani, K.
PY - 2008/12
Y1 - 2008/12
N2 - Aim. To our knowledge, no papers in the open literature address optimizing the structural parameters of the CMAC neural network in order to reduce memory size and save training time. We now present the results of such an optimization study. The full paper explains our research in some detail; this abstract adds only some pertinent remarks on its first two sections. Section 1 describes the CMAC neural network structure. Section 2 discusses the CMAC structural parameters and some function approximation problems. In subsection 2.1 we study two structural parameters, the quantization step-length and the generalization parameter, and discuss how they influence the approximation quality of the CMAC neural network. In subsection 2.2 we study some function approximation problems and their error measures: sub-subsection 2.2.1 gives two function approximation examples, and sub-subsection 2.2.2 calculates the error measures of these approximations. Finally we perform computer simulations, whose results are given in Tables 1 through 3 and Figs. 6 and 7. These results show preliminarily that our optimization method can not only greatly reduce memory size but also save training time.
KW - CMAC (Cerebellar Model Articulation Controller)
KW - Computer simulation
KW - Function approximation
KW - Generalization parameter
KW - Neural networks
KW - Structure optimization
UR - http://www.scopus.com/inward/record.url?scp=58249143140&partnerID=8YFLogxK
M3 - Article
AN - SCOPUS:58249143140
SN - 1000-2758
VL - 26
SP - 732
EP - 737
JO - Xibei Gongye Daxue Xuebao/Journal of Northwestern Polytechnical University
JF - Xibei Gongye Daxue Xuebao/Journal of Northwestern Polytechnical University
IS - 6
ER -