TY - JOUR
T1 - Vulnerability detection based on transformer and high-quality number embedding
AU - Cao, Yang
AU - Dong, Yunwei
AU - Peng, Jiajie
N1 - Publisher Copyright:
© 2024 John Wiley & Sons Ltd.
PY - 2024/12/25
Y1 - 2024/12/25
N2 - Software vulnerability detection is an important problem in software security. In recent years, deep learning has offered a novel approach to source code vulnerability detection. Due to the similarities between programming languages and natural languages, many natural language processing techniques have been applied to vulnerability detection tasks. However, specific problems within vulnerability detection tasks, such as buffer overflow, involve numerical reasoning. For these problems, the model needs to not only consider long-range dependencies and multiple relationships between statements of code but also capture the magnitude property of numerical literals in the program through high-quality number embeddings. Therefore, we propose VDTransformer, a Transformer-based method that improves source code embedding by integrating word and number embeddings. Furthermore, we employ Transformer encoders to construct a hierarchical neural network that extracts semantic features from the code and enables line-level vulnerability detection. To evaluate the effectiveness of the method, we construct a dataset named OverflowGen based on templates for buffer overflow. Experimental comparisons on OverflowGen with a well-known static vulnerability detector and two state-of-the-art deep learning-based methods confirm the effectiveness of VDTransformer and the importance of high-quality number embeddings in vulnerability detection tasks involving numerical features.
KW - code joint embedding
KW - transformer
KW - vulnerability detection
UR - http://www.scopus.com/inward/record.url?scp=85204528048&partnerID=8YFLogxK
U2 - 10.1002/cpe.8292
DO - 10.1002/cpe.8292
M3 - Article
AN - SCOPUS:85204528048
SN - 1532-0626
VL - 36
JO - Concurrency and Computation: Practice and Experience
JF - Concurrency and Computation: Practice and Experience
IS - 28
M1 - e8292
ER -