TY - JOUR
T1 - Stepwise local synthetic pseudo-CT imaging based on anatomical semantic guidance
AU - Sun, Hongfei
AU - Zhang, Kun
AU - Fan, Rongbo
AU - Xiong, Wenjun
AU - Yang, Jianhua
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019
Y1 - 2019
N2 - In this study, an anatomic semantic guided neural style transfer (ASGNST) algorithm was developed, and pseudo-computed tomography (CT) images were synthesized stepwise. CT images and ultrasound (US) images of 20 cervical cancer patients awaiting treatment were selected. The foreground (FG) and background (BG) regions of the US images were segmented by the region growing method, and three objective functions for content, style, and contour loss were defined. Based on the two types of regions, a local pseudo-CT image synthesis model based on a convolutional neural network was established. Then, global 2D pseudo-CT images were obtained using a weighted average fusion algorithm, and the final pseudo-CT images were obtained through 3D reconstruction. A US phantom and data from five additional cervical cancer patients were used for prediction. Furthermore, three image synthesis algorithms - global deformation field (GDF), stepwise local deformation field (SLDF), and neural style transfer (NST) - were selected for comparative verification. The pseudo-CT images synthesized by the four algorithms were compared with the ground-truth CT images obtained during treatment. The structural similarity index between the ground-truth CT and the pseudo-CT synthesized by the improved algorithm differed significantly from those of the other three algorithms (t_GDF_BG = 7.175, t_SLDF_BG = 4.513, t_NST_BG = 3.228, t_GDF_FG = 10.518, t_SLDF_FG = 5.522, t_NST_FG = 2.869; p < 0.05). Further, the mean absolute error and peak signal-to-noise ratio values show that the pseudo-CT synthesized by the ASGNST algorithm is similar to the ground-truth CT. The improved algorithm can obtain pseudo-CT images with high precision and provides a novel direction for image guidance in cervical cancer brachytherapy.
AB - In this study, an anatomic semantic guided neural style transfer (ASGNST) algorithm was developed, and pseudo-computed tomography (CT) images were synthesized stepwise. CT images and ultrasound (US) images of 20 cervical cancer patients awaiting treatment were selected. The foreground (FG) and background (BG) regions of the US images were segmented by the region growing method, and three objective functions for content, style, and contour loss were defined. Based on the two types of regions, a local pseudo-CT image synthesis model based on a convolutional neural network was established. Then, global 2D pseudo-CT images were obtained using a weighted average fusion algorithm, and the final pseudo-CT images were obtained through 3D reconstruction. A US phantom and data from five additional cervical cancer patients were used for prediction. Furthermore, three image synthesis algorithms - global deformation field (GDF), stepwise local deformation field (SLDF), and neural style transfer (NST) - were selected for comparative verification. The pseudo-CT images synthesized by the four algorithms were compared with the ground-truth CT images obtained during treatment. The structural similarity index between the ground-truth CT and the pseudo-CT synthesized by the improved algorithm differed significantly from those of the other three algorithms (t_GDF_BG = 7.175, t_SLDF_BG = 4.513, t_NST_BG = 3.228, t_GDF_FG = 10.518, t_SLDF_FG = 5.522, t_NST_FG = 2.869; p < 0.05). Further, the mean absolute error and peak signal-to-noise ratio values show that the pseudo-CT synthesized by the ASGNST algorithm is similar to the ground-truth CT. The improved algorithm can obtain pseudo-CT images with high precision and provides a novel direction for image guidance in cervical cancer brachytherapy.
KW - Neural style transfer
KW - pseudo-CT
KW - radiotherapy
KW - ultrasound
UR - http://www.scopus.com/inward/record.url?scp=85077731337&partnerID=8YFLogxK
U2 - 10.1109/ACCESS.2019.2953923
DO - 10.1109/ACCESS.2019.2953923
M3 - Article
AN - SCOPUS:85077731337
SN - 2169-3536
VL - 7
SP - 168428
EP - 168435
JO - IEEE Access
JF - IEEE Access
M1 - 8903296
ER -