TY - JOUR
T1 - RGP
T2 - Neural Network Pruning Through Regular Graph With Edges Swapping
AU - Chen, Zhuangzhi
AU - Xiang, Jingyang
AU - Lu, Yao
AU - Xuan, Qi
AU - Wang, Zhen
AU - Chen, Guanrong
AU - Yang, Xiaoniu
N1 - Publisher Copyright:
© 2023 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission.
PY - 2024
Y1 - 2024
N2 - Deep learning has found promising applications in lightweight model design, for which pruning is an effective means of achieving large reductions in both model parameters and floating-point operations (FLOPs). Most existing neural network pruning methods start from the importance of model parameters, designing parameter evaluation metrics to prune parameters iteratively. These methods have not been studied from the perspective of network topology, so they may be effective but not efficient, and they require completely different pruning procedures for different datasets. In this article, we study the graph structure of the neural network and propose a regular graph pruning (RGP) method to perform one-shot neural network pruning. Specifically, we first generate a regular graph and set its node degrees to meet the preset pruning ratio. Then, we reduce the average shortest path length (ASPL) of the graph by swapping edges to obtain the optimal edge distribution. Finally, we map the resulting graph onto a neural network structure to realize pruning. Our experiments demonstrate that the ASPL of the graph is negatively correlated with the classification accuracy of the neural network, and that RGP retains accuracy well under high parameter reduction (more than 90%) and FLOPs reduction (more than 90%). The code for quick use and reproduction is available at https://github.com/Holidays1999/Neural-Network-Pruning-through-its-RegularGraph-Structure.
KW - Convolutional neural network
KW - deep learning
KW - network architecture optimization
KW - neural network pruning
UR - http://www.scopus.com/inward/record.url?scp=85162716149&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2023.3280899
DO - 10.1109/TNNLS.2023.3280899
M3 - Article
C2 - 37310824
AN - SCOPUS:85162716149
SN - 2162-237X
VL - 35
SP - 14671
EP - 14683
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 10
ER -
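
As a rough illustration of the edge-swapping step the abstract describes (generate a regular graph whose degree matches the pruning ratio, then swap edges to lower the ASPL), below is a minimal sketch assuming the NetworkX library. The function name minimize_aspl, the greedy accept-only-if-better rule, and the swap budget are illustrative assumptions, not the authors' exact procedure; see the linked repository for the real implementation.

    # Sketch: degree-preserving double edge swaps that greedily reduce the
    # average shortest path length (ASPL) of a random regular graph.
    import networkx as nx

    def minimize_aspl(n, d, swaps=2000, seed=0):
        """Build a d-regular graph on n nodes (n * d must be even, d < n),
        then keep only edge swaps that lower the ASPL."""
        g = nx.random_regular_graph(d, n, seed=seed)
        # ASPL is undefined on disconnected graphs; resample until connected.
        while not nx.is_connected(g):
            seed += 1
            g = nx.random_regular_graph(d, n, seed=seed)
        best = nx.average_shortest_path_length(g)
        for step in range(swaps):
            cand = g.copy()
            try:
                # One degree-preserving swap: (u, v), (x, y) -> (u, x), (v, y).
                nx.double_edge_swap(cand, nswap=1, max_tries=100, seed=seed + step)
            except nx.NetworkXException:
                continue
            if not nx.is_connected(cand):
                continue
            aspl = nx.average_shortest_path_length(cand)
            if aspl < best:  # greedy hill climbing (an assumption here)
                g, best = cand, aspl
        return g, best

    # Example: each of 32 nodes keeps degree 4 of 31 possible neighbors,
    # i.e. roughly 87% of potential connections pruned under one reading of
    # the abstract's degree-to-pruning-ratio correspondence.
    pruned_topology, aspl = minimize_aspl(n=32, d=4)
    print(f"final ASPL: {aspl:.3f}")

The resulting low-ASPL regular graph would then be mapped onto the network's connectivity pattern, consistent with the abstract's observation that lower ASPL correlates with higher classification accuracy.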