ArtEEGAttention: an advanced deep learning approach for art brain decoding

Research output: Contribution to journal › Article › peer-review

Abstract

The capacity to interpret the brain's processing of visual art via brain imaging techniques provides significant insight into the cognitive mechanisms behind aesthetic appreciation. This study investigates these mechanisms by analyzing electroencephalography (EEG) data recorded while participants performed two different tasks: gazing at a blank wall and viewing artworks. We propose ArtEEGAttention, a novel deep learning architecture that employs sliding-window convolution and multi-head self-attention to distinguish these viewing scenarios. Evaluated on a curated dataset of 16 participants, with EEG signals segmented into 3-second epochs and labeled by viewing condition, the model achieved a cross-subject accuracy of 77.96%. Its strong performance, particularly pronounced for certain subjects, highlights its robustness and generalization across varied brain responses to art.
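The two building blocks named in the abstract can be illustrated with a minimal NumPy sketch: overlapping sliding windows turn one multi-channel EEG epoch into a feature sequence, and toy multi-head self-attention with random projections mixes information across those windows. All dimensions (32 channels, 250 Hz sampling, window length 25, stride 5, 4 heads) and the random weights are illustrative assumptions, not the published configuration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sliding_window_features(epoch, win=25, stride=5):
    """Slice one EEG epoch (channels x time) into overlapping windows and
    average within each window -> a (n_windows, channels) feature sequence.
    Window length and stride are illustrative, not the paper's values."""
    n_ch, n_t = epoch.shape
    starts = range(0, n_t - win + 1, stride)
    return np.stack([epoch[:, s:s + win].mean(axis=1) for s in starts])

def multi_head_self_attention(x, n_heads=4, seed=0):
    """Toy multi-head self-attention with random (untrained) projections,
    just to show the mechanics: per-head Q/K/V, scaled dot-product
    attention, then concatenation of the head outputs."""
    rng = np.random.default_rng(seed)
    n, d = x.shape
    d_h = d // n_heads
    heads = []
    for _ in range(n_heads):
        Wq, Wk, Wv = (rng.standard_normal((d, d_h)) / np.sqrt(d) for _ in range(3))
        q, k, v = x @ Wq, x @ Wk, x @ Wv
        heads.append(softmax(q @ k.T / np.sqrt(d_h)) @ v)
    return np.concatenate(heads, axis=1)

# One hypothetical 3-second epoch: 32 channels at an assumed 250 Hz -> 750 samples.
epoch = np.random.default_rng(1).standard_normal((32, 750))
feats = sliding_window_features(epoch)       # (146, 32): one row per window
attended = multi_head_self_attention(feats)  # (146, 32): attention over windows
```

In a trained classifier the projections would be learned and the attended sequence pooled into a binary prediction (blank wall vs. artwork); this sketch only shows how the windowing and attention stages fit together.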

Original language: English
Article number: 2008346
Journal: Frontiers of Computer Science
Volume: 20
Issue number: 8
DOIs
State: Published - Aug 2026

Keywords

  • deep learning
  • electroencephalography
  • neuroaesthetics
  • visual art
