SPRNet: Automatic Fetal Standard Plane Recognition Network for Ultrasound Images

  • Jiajun Liang
  • Rian Huang
  • Peiyao Kong
  • Shengli Li
  • Tianfu Wang
  • Baiying Lei
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11798)


Fetal standard plane recognition is a crucial step in prenatal diagnosis, yet it is also a sophisticated, subjective, and highly experience-dependent process. There is therefore a strong demand for an effective and precise automatic method that helps both experienced and inexperienced doctors complete this task efficiently. To meet this clinical need, we propose an automatic fetal standard plane recognition network called SPRNet. Specifically, we adopt DenseNet as the backbone of SPRNet and implement data-based partial transfer learning on it through a weight-sharing strategy. We then train the network jointly on a task dataset (fetal ultrasound images) and a transferring dataset (placenta ultrasound images), so that it can discover and exploit the latent relationship between the two datasets, improving performance and reducing overfitting. Finally, we perform automatic fetal standard plane recognition using the features extracted by SPRNet. Experimental results show that our network attains an accuracy of 99.00% and outperforms conventional networks.


Keywords: Fetal standard plane recognition · Data-based partial transfer learning · Fetal ultrasound images · Placenta ultrasound images
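The SPRNet implementation itself is not included in this excerpt. The following is only a minimal NumPy sketch of the weight-sharing idea the abstract describes: a shared feature extractor whose weights receive gradient updates from both the task dataset and the transferring dataset, while each dataset keeps its own classification head. All names, shapes, and the synthetic data are hypothetical illustrations, not the authors' actual DenseNet-based architecture or ultrasound data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic stand-ins for the paper's datasets (hypothetical toy data):
# task set A ("fetal planes") and transferring set B ("placenta images").
Xa = rng.normal(size=(64, 8)); ya = (Xa[:, 0] > 0).astype(float)
Xb = rng.normal(size=(64, 8)); yb = (Xb[:, 1] > 0).astype(float)

# Shared weights model the weight-sharing strategy; each task has its own head.
W = rng.normal(scale=0.1, size=(8, 4))   # shared feature extractor
ha = rng.normal(scale=0.1, size=4)       # task-A classification head
hb = rng.normal(scale=0.1, size=4)       # task-B classification head

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(300):
    # Alternate between datasets; only W is updated by BOTH of them,
    # which is the "partial transfer" effect sketched here.
    for X, y, head_name in ((Xa, ya, "a"), (Xb, yb, "b")):
        h = ha if head_name == "a" else hb
        F = np.maximum(X @ W, 0.0)       # shared ReLU features
        p = sigmoid(F @ h)               # per-task prediction
        err = p - y                      # gradient of BCE w.r.t. the logit
        grad_h = F.T @ err / len(y)
        dF = np.outer(err, h) * (F > 0)  # backprop through ReLU
        grad_W = X.T @ dF / len(y)
        W -= lr * grad_W                 # shared weights: updated by both tasks
        if head_name == "a":
            ha -= lr * grad_h            # task-specific heads: updated separately
        else:
            hb -= lr * grad_h

# Accuracy on the task dataset using the jointly trained shared features.
acc_a = ((sigmoid(np.maximum(Xa @ W, 0.0) @ ha) > 0.5) == ya).mean()
```

In the paper's actual setup, the shared extractor is a DenseNet rather than a single linear layer, but the training pattern is the same: gradients from both datasets flow into the shared weights, letting the transferring dataset regularize the task network.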



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. School of Biomedical Engineering, National-Regional Key Technology Engineering Laboratory for Medical Ultrasound, Guangdong Key Laboratory for Biomedical Measurements and Ultrasound Imaging, Shenzhen University, Shenzhen, China
  2. Department of Ultrasound, Affiliated Shenzhen Maternal and Child Healthcare Hospital of Nanfang Medical University, Shenzhen, China
