TY - JOUR
T1 - Transforming Motor Imagery Analysis
T2 - A Novel EEG Classification Framework Using AtSiftNet Method
AU - Xu, Haiqin
AU - Haider, Waseem
AU - Aziz, Muhammad Zulkifal
AU - Sun, Youchao
AU - Yu, Xiaojun
N1 - Publisher Copyright:
© 2024 by the authors.
PY - 2024/10
Y1 - 2024/10
N2 - This paper presents an innovative feature extraction approach based on Self-Attention, combined with various feature selection techniques and known as the AtSiftNet method, to enhance the classification performance of motor imagery activities using electroencephalography (EEG) signals. Initially, the EEG signals were sorted and then denoised using multiscale principal component analysis to obtain clean EEG signals; an experiment on non-denoised signals was also conducted for comparison. Subsequently, the clean EEG signals underwent the Self-Attention feature extraction method to compute the features of each trial (i.e., (Formula presented.)). The top 1 or top 15 features were then selected using eight different feature selection techniques. Finally, five different machine learning and neural network classification models were employed to evaluate the accuracy, sensitivity, and specificity of this approach. The BCI Competition III dataset IVa, comprising recordings from the five volunteers who participated in the competition, was used for all experiments. The experimental findings reveal that the average classification accuracy is highest for ReliefF (i.e., (Formula presented.)), Mutual Information (i.e., (Formula presented.)), Independent Component Analysis (i.e., (Formula presented.)), and Principal Component Analysis (i.e., (Formula presented.)), for both the 1 and the 15 best-selected features from each trial. These accuracies were obtained for motor imagery using a Support Vector Machine (SVM) classifier. In addition, five-fold cross-validation was performed to assess fair performance estimation and the robustness of the model; the average accuracy obtained through five-fold cross-validation is (Formula presented.). The findings indicate that the proposed framework provides a resilient biomarker with minimal computational complexity, making it a suitable choice for advancing motor imagery brain–computer interfaces (BCIs).
KW - attention sift network (AtSiftNet)
KW - brain–computer interface (BCI)
KW - independent component analysis (ICA)
KW - motor imagery (MI)
KW - principal component analysis (PCA)
UR - http://www.scopus.com/inward/record.url?scp=85206472080&partnerID=8YFLogxK
U2 - 10.3390/s24196466
DO - 10.3390/s24196466
M3 - Article
C2 - 39409506
AN - SCOPUS:85206472080
SN - 1424-8220
VL - 24
JO - Sensors
JF - Sensors
IS - 19
M1 - 6466
ER -