TY - JOUR
T1 - BiLSTM-MLAM: A Multi-Scale Time Series Prediction Model for Sensor Data Based on Bi-LSTM and Local Attention Mechanisms
T2 - Sensors
AU - Fan, Yongxin
AU - Tang, Qian
AU - Guo, Yangming
AU - Wei, Yifei
N1 - Publisher Copyright:
© 2024 by the authors.
PY - 2024/6
Y1 - 2024/6
N2 - This paper introduces BiLSTM-MLAM, a novel multi-scale time series prediction model. First, the approach uses a bidirectional long short-term memory network to capture information from both the forward and backward directions of the time series. Next, a multi-scale patch segmentation module generates multiple long sequences composed of equal-length segments, allowing the model to capture data patterns at several time scales by adjusting the segment length. Finally, a local attention mechanism strengthens feature extraction by identifying and weighting the most informative time segments, deepening the model’s understanding of local features of the time series before the resulting features are fused. By capturing sequence information across multiple time scales, the model performs strongly on time series prediction tasks. Experiments show that BiLSTM-MLAM outperforms six baseline methods on multiple datasets. When predicting the remaining life of aircraft engines, BiLSTM-MLAM improves on the best baseline model by 6.66% in RMSE and 11.50% in MAE; on the LTE dataset it improves RMSE by 12.77% and MAE by 3.06%, and on the load dataset it improves RMSE by 17.96% and MAE by 30.39%. Additionally, ablation experiments confirm that each module contributes to prediction accuracy. Segment length parameter tuning experiments show that combining different segment lengths yields lower prediction errors, confirming that the multi-scale fusion strategy improves prediction accuracy by integrating information from multiple time scales.
KW - bidirectional long short-term memory
KW - local attention mechanism
KW - multi-scale patch segmentation
KW - time series prediction
UR - http://www.scopus.com/inward/record.url?scp=85197147069&partnerID=8YFLogxK
U2 - 10.3390/s24123962
DO - 10.3390/s24123962
M3 - Article
C2 - 38931746
AN - SCOPUS:85197147069
SN - 1424-8220
VL - 24
JO - Sensors
JF - Sensors
IS - 12
M1 - 3962
ER -