TY - JOUR
T1 - Estimating CT from MR Abdominal Images Using Novel Generative Adversarial Networks
AU - Qian, Pengjiang
AU - Xu, Ke
AU - Wang, Tingyu
AU - Zheng, Qiankun
AU - Yang, Huan
AU - Baydoun, Atallah
AU - Zhu, Junqing
AU - Traughber, Bryan
AU - Muzic, Raymond F.
N1 - Funding Information:
This work was supported in part by the National Natural Science Foundation of China under Grants 61772241 and 61702225, by the Natural Science Foundation of Jiangsu Province under Grant BK20160187, by the Fundamental Research Funds for the Central Universities under Grant JUSRP51614A, by the 2016 Qinglan Project of Jiangsu Province, by the 2016 Six Talent Peaks Project of Jiangsu Province, and by the Science and Technology Demonstration Project of Social Development of Wuxi under Grant WX18IVJN002. Research in this publication was also supported by the National Cancer Institute of the National Institutes of Health, USA, under award number R01CA196687 (the content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health, USA).
PY - 2020/6/1
Y1 - 2020/6/1
N2 - Computed tomography (CT) plays key roles in radiotherapy treatment planning and PET attenuation correction (AC). Magnetic resonance (MR) imaging offers better soft-tissue contrast than CT and involves no ionizing radiation, but it cannot directly provide the information about photon interactions with tissue that is needed for radiation treatment planning and AC. Therefore, estimating synthetic CT (sCT) images from corresponding MR images, thereby obviating CT scanning, is of great interest, but it can be particularly challenging in the abdomen owing to the range of tissue types and physiologic motion. To this end, inspired by deep learning, we design a novel generative adversarial network (GAN) model that combines ResNet, U-net, and an auxiliary-classifier GAN (RU-ACGAN for short). The significance of our work is three-fold: 1) The generator of RU-ACGAN combines ResNet and U-net, rather than relying on the U-net alone as in existing conditional GANs. This has the potential to generate more accurate sCT than existing methods. 2) Adding an auxiliary classifier to the discriminator makes the training process more stable and thereby improves the robustness of sCT estimation. 3) Owing to this carefully designed architecture, RU-ACGAN can estimate high-quality sCT from only a limited quantity of training data. Experiments on paired MR-CT images from ten subjects indicate that the proposed RU-ACGAN model captures the underlying, non-linear mapping between MR and CT images and thus outperforms many other existing methods for abdominal sCT estimation.
UR - http://www.scopus.com/inward/record.url?scp=85081900435&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85081900435&partnerID=8YFLogxK
U2 - 10.1007/s10723-020-09513-3
DO - 10.1007/s10723-020-09513-3
M3 - Article
AN - SCOPUS:85081900435
VL - 18
SP - 211
EP - 226
JO - Journal of Grid Computing
JF - Journal of Grid Computing
SN - 1570-7873
IS - 2
ER -