TY - GEN
T1 - A Multi-objective Particle Swarm Optimization for Neural Networks Pruning
AU - Wu, Tao
AU - Shi, Jiao
AU - Zhou, Deyun
AU - Lei, Yu
AU - Gong, Maoguo
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/6
Y1 - 2019/6
N2 - There is a ruling maxim in deep learning: bigger is better. However, while a bigger neural network provides higher performance, it also incurs expensive computation, memory, and energy costs. Simplified models that preserve the accuracy of the original network have therefore aroused growing interest. A simple yet efficient method is pruning, which cuts off unimportant synapses and neurons; it is thus crucial to identify the important parts among the numerous connections. In this paper, we use an evolutionary pruning method to simplify the structure of deep neural networks. A multi-objective neural network pruning model that balances the accuracy and the sparsity ratio of the network is proposed, and we solve this model with the particle swarm optimization (PSO) method. Furthermore, we fine-tune the pruned network to obtain a better pruning result. A framework of alternating pruning and fine-tuning operations is used to achieve a more prominent pruning effect. In experimental studies, we prune LeNet on MNIST and a shallow VGGNet on CIFAR-10. Experimental results demonstrate that our method can generally prune over 80% of the weights with no loss of accuracy.
AB - There is a ruling maxim in deep learning: bigger is better. However, while a bigger neural network provides higher performance, it also incurs expensive computation, memory, and energy costs. Simplified models that preserve the accuracy of the original network have therefore aroused growing interest. A simple yet efficient method is pruning, which cuts off unimportant synapses and neurons; it is thus crucial to identify the important parts among the numerous connections. In this paper, we use an evolutionary pruning method to simplify the structure of deep neural networks. A multi-objective neural network pruning model that balances the accuracy and the sparsity ratio of the network is proposed, and we solve this model with the particle swarm optimization (PSO) method. Furthermore, we fine-tune the pruned network to obtain a better pruning result. A framework of alternating pruning and fine-tuning operations is used to achieve a more prominent pruning effect. In experimental studies, we prune LeNet on MNIST and a shallow VGGNet on CIFAR-10. Experimental results demonstrate that our method can generally prune over 80% of the weights with no loss of accuracy.
UR - http://www.scopus.com/inward/record.url?scp=85071335568&partnerID=8YFLogxK
U2 - 10.1109/CEC.2019.8790145
DO - 10.1109/CEC.2019.8790145
M3 - Conference contribution
AN - SCOPUS:85071335568
T3 - 2019 IEEE Congress on Evolutionary Computation, CEC 2019 - Proceedings
SP - 570
EP - 577
BT - 2019 IEEE Congress on Evolutionary Computation, CEC 2019 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2019 IEEE Congress on Evolutionary Computation, CEC 2019
Y2 - 10 June 2019 through 13 June 2019
ER -