TY - JOUR
T1 - Advances in attention mechanisms for medical image segmentation
AU - Zhang, Jianpeng
AU - Chen, Xiaomin
AU - Yang, Bing
AU - Guan, Qingbiao
AU - Chen, Qi
AU - Chen, Jian
AU - Wu, Qi
AU - Xie, Yutong
AU - Xia, Yong
N1 - Publisher Copyright:
© 2024
PY - 2025/5
Y1 - 2025/5
N2 - Medical image segmentation plays an important role in computer-aided diagnosis. Attention mechanisms, which distinguish important parts from irrelevant ones, have been widely used in medical image segmentation tasks. This paper systematically reviews the basic principles of attention mechanisms and their applications in medical image segmentation. First, we review the basic concepts and formulation of attention mechanisms. Second, we survey about 200 articles related to medical image segmentation and divide them into three groups according to their attention mechanisms: Pre-Transformer attention, Transformer attention, and Mamba-related attention. In each group, we analyze the attention mechanisms in depth from three aspects grounded in the current literature, i.e., the principle of the mechanism (what to use), implementation methods (how to use), and application tasks (where to use). We also thoroughly analyze the advantages and limitations of their applications to different tasks. Finally, we summarize the current state of research and its shortcomings, and discuss potential future challenges, including task specificity, robustness, and standard evaluation. We hope that this review can showcase the overall research context of traditional, Transformer, and Mamba attention methods, provide a clear reference for subsequent research, and inspire more advanced attention research, not only in medical image segmentation but also in other image analysis scenarios. Lastly, we maintain the paper list and open-source code here.
KW - Attention mechanism
KW - Deep learning
KW - Mamba
KW - Medical image segmentation
KW - Transformer
UR - http://www.scopus.com/inward/record.url?scp=85214677636&partnerID=8YFLogxK
U2 - 10.1016/j.cosrev.2024.100721
DO - 10.1016/j.cosrev.2024.100721
M3 - Literature review
AN - SCOPUS:85214677636
SN - 1574-0137
VL - 56
JO - Computer Science Review
JF - Computer Science Review
M1 - 100721
ER -