TY - JOUR
T1 - Device adaptation free-KDA based on multi-teacher knowledge distillation
AU - Yang, Yafang
AU - Guo, Bin
AU - Liang, Yunji
AU - Zhao, Kaixing
AU - Yu, Zhiwen
N1 - Publisher Copyright:
© The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2024.
PY - 2024/10
Y1 - 2024/10
N2 - The keyboard, a major means of interaction between humans and Internet devices, should be set up properly to perform well in authentication tasks. To guarantee that a legitimate user can interact with two or more devices, alternately or simultaneously, while protecting user privacy, it is essential to build a device adaptation free-text keystroke dynamics authentication (free-KDA) model based on multi-teacher knowledge distillation methods. Some multi-teacher knowledge distillation methods have proven effective in C-way classification tasks. However, they are unsuitable for a free-KDA model, since free-KDA is a one-class classification task. Instead of using soft labels to capture useful knowledge from the source devices for the target device, we propose a device adaptation free-KDA model. When a user builds an authentication model for a target device with limited training samples, we propose a novel optimization objective that decreases the distance discrepancy, measured in both Euclidean distance and cosine similarity, between the source and target devices. We then adopt an adaptive confidence gate strategy to handle the different correlations, for each user, between the various source devices and the target device. The model is verified on two keystroke datasets collected with different types of keyboards, and its performance is compared with the existing dominant multi-teacher knowledge distillation methods. Extensive experimental results demonstrate that the AUC on the target device reaches up to 95.17%, which is 15.28% higher than state-of-the-art multi-teacher knowledge distillation methods.
AB - The keyboard, a major means of interaction between humans and Internet devices, should be set up properly to perform well in authentication tasks. To guarantee that a legitimate user can interact with two or more devices, alternately or simultaneously, while protecting user privacy, it is essential to build a device adaptation free-text keystroke dynamics authentication (free-KDA) model based on multi-teacher knowledge distillation methods. Some multi-teacher knowledge distillation methods have proven effective in C-way classification tasks. However, they are unsuitable for a free-KDA model, since free-KDA is a one-class classification task. Instead of using soft labels to capture useful knowledge from the source devices for the target device, we propose a device adaptation free-KDA model. When a user builds an authentication model for a target device with limited training samples, we propose a novel optimization objective that decreases the distance discrepancy, measured in both Euclidean distance and cosine similarity, between the source and target devices. We then adopt an adaptive confidence gate strategy to handle the different correlations, for each user, between the various source devices and the target device. The model is verified on two keystroke datasets collected with different types of keyboards, and its performance is compared with the existing dominant multi-teacher knowledge distillation methods. Extensive experimental results demonstrate that the AUC on the target device reaches up to 95.17%, which is 15.28% higher than state-of-the-art multi-teacher knowledge distillation methods.
KW - Adaptive confidence gate
KW - Device adaptation
KW - Free-KDA
KW - Multi-teacher knowledge distillation
KW - One-class classification
UR - http://www.scopus.com/inward/record.url?scp=85201021364&partnerID=8YFLogxK
U2 - 10.1007/s12652-024-04836-5
DO - 10.1007/s12652-024-04836-5
M3 - Article
AN - SCOPUS:85201021364
SN - 1868-5137
VL - 15
SP - 3603
EP - 3615
JO - Journal of Ambient Intelligence and Humanized Computing
JF - Journal of Ambient Intelligence and Humanized Computing
IS - 10
ER -