TY - JOUR
T1 - TFD-Net
T2 - Transformer Deviation Network for Weakly Supervised Anomaly Detection
AU - Gan, Hongping
AU - Zheng, Hejie
AU - Wu, Zhangfa
AU - Ma, Chunyan
AU - Liu, Jie
N1 - Publisher Copyright:
© 2004-2012 IEEE.
PY - 2024
Y1 - 2024
N2 - Deep Learning (DL)-based weakly supervised anomaly detection methods enhance the security and performance of communications and networks by promptly identifying and addressing anomalies within imbalanced samples, thus ensuring reliable communication and smooth network operations. However, existing DL-based methods often overly emphasize the local feature representations of samples, thereby neglecting the long-range dependencies and the prior knowledge of the samples, which imposes potential limitations on anomaly detection with a limited number of abnormal samples. To mitigate these challenges, we propose a Transformer deviation network for weakly supervised anomaly detection, called TFD-Net, which can effectively leverage the interdependencies and data priors of samples, yielding enhanced anomaly detection performance. Specifically, we first use a Transformer-based feature extraction module that proficiently captures the dependencies of global features in the samples. Subsequently, TFD-Net employs an anomaly score generation module to obtain corresponding anomaly scores. Finally, we introduce an innovative loss function for TFD-Net, named the Transformer Deviation Loss Function (TFD-Loss), which can adequately incorporate prior knowledge of samples into the network training process, addressing the issue of imbalanced samples and thereby enhancing detection efficiency. Experimental results on public benchmark datasets demonstrate that TFD-Net substantially outperforms other DL-based methods in weakly supervised anomaly detection tasks.
AB - Deep Learning (DL)-based weakly supervised anomaly detection methods enhance the security and performance of communications and networks by promptly identifying and addressing anomalies within imbalanced samples, thus ensuring reliable communication and smooth network operations. However, existing DL-based methods often overly emphasize the local feature representations of samples, thereby neglecting the long-range dependencies and the prior knowledge of the samples, which imposes potential limitations on anomaly detection with a limited number of abnormal samples. To mitigate these challenges, we propose a Transformer deviation network for weakly supervised anomaly detection, called TFD-Net, which can effectively leverage the interdependencies and data priors of samples, yielding enhanced anomaly detection performance. Specifically, we first use a Transformer-based feature extraction module that proficiently captures the dependencies of global features in the samples. Subsequently, TFD-Net employs an anomaly score generation module to obtain corresponding anomaly scores. Finally, we introduce an innovative loss function for TFD-Net, named the Transformer Deviation Loss Function (TFD-Loss), which can adequately incorporate prior knowledge of samples into the network training process, addressing the issue of imbalanced samples and thereby enhancing detection efficiency. Experimental results on public benchmark datasets demonstrate that TFD-Net substantially outperforms other DL-based methods in weakly supervised anomaly detection tasks.
KW - TFD-Loss
KW - Transformer
KW - Weakly supervised anomaly detection
KW - imbalanced samples
UR - http://www.scopus.com/inward/record.url?scp=85207448408&partnerID=8YFLogxK
U2 - 10.1109/TNSM.2024.3485545
DO - 10.1109/TNSM.2024.3485545
M3 - Article
AN - SCOPUS:85207448408
SN - 1932-4537
JO - IEEE Transactions on Network and Service Management
JF - IEEE Transactions on Network and Service Management
ER -