TY - JOUR
T1 - TNUnet: U-shaped phase feature extraction network for phase unwrapping in optical metrology
T2 - IEEE Transactions on Instrumentation and Measurement
AU - Zhang, Ziheng
AU - Wang, Xiaoxu
AU - Wang, Yupeng
AU - Wang, Cheng
AU - Lu, Qianbo
N1 - Publisher Copyright:
© 1963-2012 IEEE.
PY - 2025
Y1 - 2025
AB - Due to the bias of the convolution operation, current CNN-based 2D spatial phase unwrapping (PU) methods in optical metrology struggle to capture the global context accurately and fail to model long-range dependencies effectively. Transformer-based PU methods suffer from deep degradation due to their over-reliance on stacked layers for information interaction. Most two-step PU methods based on segmentation models tend to restore the original resolution directly after downsampling and extracting fine features, resulting in a significant loss of deep features. These factors compromise unwrapping accuracy in many optical measurement scenarios. This paper introduces a new U-shaped network, TNUnet, to tackle these challenges. TNUnet leverages TransNeXt as the fundamental feature-learning component, effectively integrating the strengths of Aggregated Attention and Convolutional GLU. This compensates for the limitations of CNNs in capturing long-range dependencies and mitigates the deep degradation of Transformers. As a robust phase feature extraction network, TNUnet is well suited to both regression and segmentation models. Extensive experiments demonstrate that TNUnet achieves state-of-the-art performance on both. When the wrapped phase is disturbed by noise or discontinuities, the unwrapping accuracy of the regression-model-based TNUnet exceeds 92%, and its FLOPs are nearly 67% lower than those of the most competitive method, U2-Net. The parameters and FLOPs of the segmentation-model-based TNUnet are reduced by nearly 88% and 83%, respectively, compared with TransUNet, while achieving an ultra-high unwrapping accuracy of 94%. The code is publicly available at https://github.com/zzi-heng/TNUnet_PU.
KW - deep learning
KW - interference fringes
KW - optical measurement
KW - phase surface restoration
KW - regression model
KW - semantic segmentation
KW - spatial phase unwrapping
KW - TransNeXt block
UR - http://www.scopus.com/inward/record.url?scp=85218879171&partnerID=8YFLogxK
U2 - 10.1109/TIM.2025.3545864
DO - 10.1109/TIM.2025.3545864
M3 - Article
AN - SCOPUS:85218879171
SN - 0018-9456
JO - IEEE Transactions on Instrumentation and Measurement
JF - IEEE Transactions on Instrumentation and Measurement
ER -