Application of a Multimodal Deep Learning Model Based on Recursive Fusion Feature Map With Transformer–TCN for Complex Fault Diagnosis of Flying Wing UAV Actuators

Wenqi Zhang, Zhenbao Liu, Zhen Jia, Xiao Wang, Weijun Yan, Kai Wang

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

This article proposes a multimodal deep learning model based on a recursive fusion feature map (RFFM) and a Transformer-temporal convolutional network (TCN) to solve the problem of fault diagnosis of flying wing unmanned aerial vehicle (UAV) actuators under complex working conditions. This setting poses the challenges of data scarcity, high-dimensional nonlinear dynamic characteristics, and the difficulty of effectively fusing multimodal features. The recursive fusion feature map extracts high-order features from multimodal time-series signals (position, velocity, acceleration, torque, current, and voltage), the Transformer captures the long-term dependencies of the signals, and the TCN models short-term dynamic characteristics. This enables accurate classification and health assessment of the flying wing UAV actuator under normal, single-fault (wear/jamming, dynamic lag, and signal failure), and compound-fault modes. In the simulation experiments, six-modal time-series signals generated for the seven states were analyzed. The experimental findings demonstrate that the proposed model attains a classification accuracy of 98.6% on the balanced dataset and 94.7% on the unbalanced dataset, with an F1-score exceeding 0.92 for each category. The model's robustness to complex fault modes is further substantiated through a comparison of residual signals and an examination of time-frequency diagrams. A comparison with traditional methods such as support vector machine (SVM), random forest (RF), and long short-term memory (LSTM) reveals significant advantages in key indicators such as average area under the curve (AUC), diagnostic accuracy, and classification stability (average AUC exceeding 0.97). The results demonstrate the effectiveness and applicability of this method for complex fault diagnosis of flying wing UAV actuators, and the method has significant engineering application value and potential for wider adoption.
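The implementation details of the model are not given on this page; the following is a minimal PyTorch-style sketch of how a Transformer–TCN classifier over six-modal actuator signals might be arranged. All layer sizes, the `TCNBlock` helper, and the per-time-step modality fusion are illustrative assumptions rather than the paper's architecture, and the recursive fusion feature map stage described in the abstract is omitted.

```python
# Hypothetical sketch, assuming PyTorch; sizes and module names are illustrative.
import torch
import torch.nn as nn


class TCNBlock(nn.Module):
    """Dilated causal 1-D convolution block modelling short-term dynamics."""

    def __init__(self, channels, kernel_size=3, dilation=1):
        super().__init__()
        pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(channels, channels, kernel_size,
                              padding=pad, dilation=dilation)
        self.relu = nn.ReLU()

    def forward(self, x):                      # x: (batch, channels, time)
        out = self.relu(self.conv(x))
        return out[..., :x.size(-1)] + x       # trim causal padding, residual add


class TransformerTCNClassifier(nn.Module):
    """Six-channel time series -> seven health states (normal, single faults,
    compound faults). Layer sizes are assumptions, not taken from the paper."""

    def __init__(self, n_modalities=6, d_model=64, n_classes=7):
        super().__init__()
        self.embed = nn.Linear(n_modalities, d_model)       # fuse modalities per step
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, batch_first=True)      # long-term dependencies
        self.transformer = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.tcn = nn.Sequential(                             # short-term dynamics
            TCNBlock(d_model, dilation=1),
            TCNBlock(d_model, dilation=2),
        )
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                      # x: (batch, time, n_modalities)
        h = self.transformer(self.embed(x))    # (batch, time, d_model)
        h = self.tcn(h.transpose(1, 2))        # (batch, d_model, time)
        return self.head(h.mean(dim=-1))       # global average pool -> class logits


# Usage: classify a batch of 8 six-modal signals of 512 time steps each.
model = TransformerTCNClassifier()
logits = model(torch.randn(8, 512, 6))         # -> shape (8, 7)
```

The sequential ordering here (Transformer encoder followed by TCN blocks) is one plausible arrangement; the paper may instead fuse the two branches in parallel.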

Original language: English
Article number: 3534917
Journal: IEEE Transactions on Instrumentation and Measurement
Volume: 74
DOIs
State: Published - 2025

Keywords

  • Fault diagnosis
  • Transformer-temporal convolutional network (TCN)
  • flying wing unmanned aerial vehicle (UAV) actuators
  • multimodal deep learning
  • recursive fusion feature maps
  • small sample learning
  • timing signals
