TY - GEN
T1 - Dynamic Activation Binarization for Accurate Network Compression
AU - Zhao, Tong
AU - Lang, Zhiqiang
AU - Song, Chongxing
AU - Lin, Zenggang
AU - Huang, Yihan
AU - Zhang, Lei
AU - Wei, Wei
N1 - Publisher Copyright:
© 2025 Copyright held by the owner/author(s)
PY - 2025/6/2
Y1 - 2025/6/2
N2 - Binary neural networks (BNNs), in which both weights and activations are binarized to 1 bit, have shown great potential in network compression. However, due to the limited expressive capacity of a single bit, they suffer from severe performance degradation. To mitigate this problem, we present a dynamic activation binarization method for BNNs. In contrast to existing BNN methods, which utilize Sign and ReLU functions with fixed thresholds for activation binarization, we introduce auxiliary controllers that dynamically generate thresholds for each test image based on its content. This empowers the BNN to flexibly exploit the characteristics of each image to achieve optimal activation binarization, thereby increasing its expressive capacity. Moreover, we construct each controller with a light-weight structure, which only slightly increases the computational and storage complexity over the baseline while noticeably improving performance. Experiments on two standard classification benchmarks verify the superiority of the proposed method over many state-of-the-art approaches. In addition, the proposed BNN also performs well on other computer vision tasks, even obtaining performance comparable to a SOTA 4-bit network on the super-resolution task.
AB - Binary neural networks (BNNs), in which both weights and activations are binarized to 1 bit, have shown great potential in network compression. However, due to the limited expressive capacity of a single bit, they suffer from severe performance degradation. To mitigate this problem, we present a dynamic activation binarization method for BNNs. In contrast to existing BNN methods, which utilize Sign and ReLU functions with fixed thresholds for activation binarization, we introduce auxiliary controllers that dynamically generate thresholds for each test image based on its content. This empowers the BNN to flexibly exploit the characteristics of each image to achieve optimal activation binarization, thereby increasing its expressive capacity. Moreover, we construct each controller with a light-weight structure, which only slightly increases the computational and storage complexity over the baseline while noticeably improving performance. Experiments on two standard classification benchmarks verify the superiority of the proposed method over many state-of-the-art approaches. In addition, the proposed BNN also performs well on other computer vision tasks, even obtaining performance comparable to a SOTA 4-bit network on the super-resolution task.
KW - Binary neural network
KW - Dynamic activation binarization
KW - Image classification
KW - Light-weight model
UR - https://www.scopus.com/pages/publications/105021305230
U2 - 10.1145/3727648.3727786
DO - 10.1145/3727648.3727786
M3 - Conference contribution
AN - SCOPUS:105021305230
T3 - Proceedings of the 4th International Conference on Computer, Artificial Intelligence and Control Engineering, CAICE 2025
SP - 842
EP - 847
BT - Proceedings of the 4th International Conference on Computer, Artificial Intelligence and Control Engineering, CAICE 2025
PB - Association for Computing Machinery, Inc
T2 - 4th International Conference on Computer, Artificial Intelligence and Control Engineering, CAICE 2025
Y2 - 10 January 2025 through 12 January 2025
ER -