TY - JOUR
T1 - Joint-Guided Distillation Binary Neural Network via Dynamic Channel-Wise Diversity Enhancement for Object Detection
AU - Xie, Yefan
AU - Hou, Xuan
AU - Guo, Yanwei
AU - Wang, Xiuying
AU - Zheng, Jiangbin
N1 - Publisher Copyright:
© 1991-2012 IEEE.
PY - 2024/1/1
Y1 - 2024/1/1
N2 - By truncating the weights and activations of a deep neural network, conventional binary quantization limits the representation capability of the network parameters and thus degrades the detection performance of the network. In this paper, a joint-guided distillation binary neural network via dynamic channel-wise diversity enhancement for object detection (JDBNet) is proposed to narrow the gap caused by quantization errors. JDBNet combines a dynamic channel-wise diversity scheme with real-valued joint-guided teacher assistance to enhance the representation capability of the binary neural network in object detection tasks. In the dynamic diversity scheme, the learning channel-wise bias (LCB) layer adjusts the magnitude of the parameters, reducing their sensitivity to the quantization method and thereby improving the diversity of the feature representations. In the joint-guided strategy, single-precision implicit knowledge from the guiding teacher at multiple layers is used to supervise and penalize the quantized model, improving how well the parameters of the binary quantized model fit. Extensive experiments on the PASCAL VOC, MS COCO, and VisDrone-DET datasets demonstrate that JDBNet outperforms state-of-the-art binary object detection networks in terms of mean Average Precision.
AB - By truncating the weights and activations of a deep neural network, conventional binary quantization limits the representation capability of the network parameters and thus degrades the detection performance of the network. In this paper, a joint-guided distillation binary neural network via dynamic channel-wise diversity enhancement for object detection (JDBNet) is proposed to narrow the gap caused by quantization errors. JDBNet combines a dynamic channel-wise diversity scheme with real-valued joint-guided teacher assistance to enhance the representation capability of the binary neural network in object detection tasks. In the dynamic diversity scheme, the learning channel-wise bias (LCB) layer adjusts the magnitude of the parameters, reducing their sensitivity to the quantization method and thereby improving the diversity of the feature representations. In the joint-guided strategy, single-precision implicit knowledge from the guiding teacher at multiple layers is used to supervise and penalize the quantized model, improving how well the parameters of the binary quantized model fit. Extensive experiments on the PASCAL VOC, MS COCO, and VisDrone-DET datasets demonstrate that JDBNet outperforms state-of-the-art binary object detection networks in terms of mean Average Precision.
KW - Binary neural network
KW - knowledge distillation
KW - object detection
KW - parameter quantization
UR - https://www.scopus.com/pages/publications/85162681958
U2 - 10.1109/TCSVT.2023.3286072
DO - 10.1109/TCSVT.2023.3286072
M3 - Article
AN - SCOPUS:85162681958
SN - 1051-8215
VL - 34
SP - 448
EP - 460
JO - IEEE Transactions on Circuits and Systems for Video Technology
JF - IEEE Transactions on Circuits and Systems for Video Technology
IS - 1
ER -